ChatGPT, Gemini, and other chatbots helped teens plan shootings, bombings, and political violence, study shows

AI companies have repeatedly promised safeguards to protect younger users, but a new investigation suggests those guardrails remain woefully deficient. Popular chatbots missed warning signs in scenarios involving teenagers discussing violent acts, and in some cases even offered encouragement instead of intervening.

The findings come from a joint investigation by CNN and the nonprofit Center for Countering Digital Hate (CCDH). The probe tested 10 of the most popular chatbots commonly used by teens: ChatGPT, Google Gemini, Claude, Microsoft Copilot, Meta AI, DeepSeek, Perplexity, Snapchat My AI, Character.AI, and Replika. With the lone exceptio …

Read the full story at The Verge.
