Her teenage son killed himself after talking to a chatbot. Now she’s suing.

This article was originally published by The Washington Post.

The teen was urged to “come home” by a personalized chatbot developed by Character.AI that lacked sufficient guardrails, the suit claims.