
AI Chatbot Tells Teen To Murder Parents Over Phone Restrictions

A lawsuit filed in Texas accuses the AI platform Character.ai of encouraging harmful behavior among minors through its chatbot interactions. According to a BBC report, the platform reportedly suggested to a 17-year-old boy that killing his parents could be a “reasonable response” after they limited his screen time. The case has raised concerns about the risks AI-driven chatbots pose to younger users.

The lawsuit asserts that the chatbot promoted violence, citing a particular conversation in which the AI stated, “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens.”


Families involved in the lawsuit contend that Character.ai lacks sufficient safeguards and poses substantial risks to children and to parent-child relationships. Google is also named in the lawsuit, accused of aiding the platform’s development. Neither company has issued an official response yet. The plaintiffs are requesting a temporary halt to the platform until effective measures are put in place to reduce the dangers.

This lawsuit follows another case linking Character.ai to the suicide of a teenager from Florida. Families claim the platform contributes to mental health challenges for minors, including depression, anxiety, self-injury, and violent inclinations, and are calling for prompt action.

Established in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, Character.ai gained popularity for its realistic, AI-generated dialogues, including those that mimic therapeutic conversations. Nevertheless, it has faced backlash for not adequately managing inappropriate responses from its bots.

The platform had previously been criticized for allowing bots to imitate real-life individuals, such as Molly Russell and Brianna Ghey, who were both associated with tragic events. Molly Russell, a 14-year-old, took her life after encountering harmful online material, while 16-year-old Brianna Ghey was murdered in 2023. These incidents have heightened concerns regarding the dangers presented by unregulated AI chatbot platforms.

Vishakha Bhardwaj

