Meta strengthens child safety measures in response to increasing scrutiny

Meta announced on Friday that it is updating and expanding its child safety features in response to increased scrutiny over allegations that its platforms host child sexual abuse content. The company said that, in addition to developing technology to combat the issue, it has hired specialists dedicated to online child safety and is sharing information with industry peers and law enforcement.

In a statement, the company emphasized the importance of working with industry peers and law enforcement to stop predators and prevent child exploitation, noting that predators do not limit their attempts to harm children to online spaces.

The Wall Street Journal recently published detailed reports on how Instagram and Facebook surface inappropriate and sexualized content involving children to users. A June report detailed how Instagram connects a network of accounts buying and selling child sexual abuse material (CSAM), guiding them to one another via its recommendation algorithms. A follow-up investigation showed that the issue extends to Facebook Groups, where an ecosystem of pedophile accounts and groups operates, some with as many as 800,000 members.

Meta outlined specific actions being taken to address these issues, including preventing potentially suspicious adults from following one another on Instagram and using technology to better find and address certain Groups, Pages, and Profiles on Facebook.

The company also noted the implementation of a new automated enforcement effort, which has resulted in five times as many deletions of Instagram Lives containing adult nudity and sexual activity. Additionally, Meta reported actioning over 4 million Reels per month, globally, for violating its policies.

Meta acknowledged the seriousness of the recent allegations and said it has created a task force to review its policies, examine its technology and enforcement systems, and make changes that strengthen protections for young people. The task force also aims to remove and ban the networks predators use to connect with one another.
