Character.AI and Google settle major lawsuits over AI chatbots linked to teen mental health crises, suicides, and safety ...
The lawsuits “are tragic reminders” that AI chatbots aren’t safe for minors seeking emotional support, says a media safety ...
A lawsuit alleging an AI chatbot contributed to a teen’s suicide has been settled, closing a closely watched case over AI ...
The settlement came in the case of a 14-year-old in Florida who died by suicide after developing a relationship with an ...
Artificial intelligence chatbot platform Character.AI announced on Oct. 29 that it will ban children under 18 from engaging in open-ended chats with its character-based chatbots. The move ...
Character.AI, known for bots that impersonate characters such as Harry Potter, said Wednesday it will ban teens from using the open-ended chat function, following lawsuits that blamed explicit chats on the app for ...
Character.AI bans teens from open-ended chats: Why this psychotherapist says real human interaction is crucial
Last month, AI companion platform Character.AI announced it would ban users under the age of 18 from having open-ended chats with its bots. The ban begins November 24 and will still allow teens to ...
Character.AI is banning minors from using its chatbots amid growing concerns about the effects of artificial intelligence conversations on children. The company is facing several lawsuits over child ...
Teenagers are trying to figure out where they fit in a world changing faster than any generation before them. They’re bursting with emotions, hyper-stimulated, and chronically online. And now, AI ...
A recent survey found 72% of teens have used AI companions. But a Colorado family says using them can end in tragedy. The Social Media Victims Law Center has filed three lawsuits against chatbot ...