The Cross Heads

Character.AI steps up teen safety after bots allegedly caused suicide, self-harm

Following a pair of lawsuits alleging that chatbots caused a teen boy’s suicide, groomed a 9-year-old girl, and caused a vulnerable teen to self-harm, Character.AI (C.AI) has announced a separate model just for teens, ages 13 and up, that’s supposed to make their experiences with bots safer. In a blog, C.AI said it took a […]

December 12, 2024 | Artificial Intelligence, Character.AI, chatbots, child safety, Policy, suicide

Chatbots urged teen to self-harm, suggested murdering parents, lawsuit says

A loss could lead to heavy fines for Character Technologies and possibly Google, as the families have asked for punitive damages. They also seek money to cover their families’ past and future medical expenses, mental pain and suffering, impairment to perform everyday activities, and loss of enjoyment of life. C.AI bots accused of grooming, inciting […]

December 10, 2024 | Artificial Intelligence, Character.AI, chatbots, child safety, companion bots, generative ai, Google, Policy
