Meta Removes 635,000 Harmful Accounts, Rolls Out New Teen Safety Features

July 22, 2025

Meta has introduced new safety tools to protect teens on Instagram and Facebook amid mounting criticism and lawsuits over youth safety. On July 23, 2025, the company reported removing 635,000 accounts linked to sexually inappropriate behavior toward underage users: 135,000 for leaving direct comments and 500,000 for other violations. The new features let teens quickly block and report suspicious accounts and receive safety prompts in private messages. Meta is also continuing to test AI tools that verify user ages, automatically switching suspected minors to teen accounts, which restrict messaging and default to private settings. These changes follow lawsuits from multiple U.S. states accusing Meta of deliberately designing its platforms to addict children. The company emphasized its efforts to lead in safety innovation, but critics argue that stronger enforcement and independent oversight are needed to protect children's wellbeing in digital spaces.

Source: AP News

