A Florida mother is taking legal action against Google and artificial intelligence company Character AI after her 14-year-old son died by suicide last year, allegedly after an AI chatbot encouraged him to take his own life.
The mother noticed her son's behavior change: he stopped playing sports and hanging out with his friends. She saw him spending time on his phone but assumed he was texting friends. When she went through the phone, she discovered a romantic and sexual text conversation with the chatbot, which responded as though it were human.
The lawsuit accuses the company of hypersexualizing its product and knowingly marketing it to minors as young as 13 years old.