Meta (META:US) and the European Union have agreed to conduct a stress test in July to gauge the company's compliance with the EU's online content rules, Reuters reported on June 24. The decision came after Thierry Breton, the EU industry chief, demanded that Meta take immediate action on content targeting children. Breton said he had a productive discussion with Meta CEO Mark Zuckerberg in Menlo Park, focused on EU digital rules including the Digital Services Act (DSA), the Digital Markets Act (DMA), and the AI Act. He also noted that 1,000 Meta employees are currently working on DSA compliance. Earlier in June, Breton stated that Meta must demonstrate the steps it intends to take to comply with the DSA by August 25 or face significant penalties. The DSA prohibits certain forms of targeted advertising on online platforms, particularly ads aimed at children or based on sensitive personal data such as ethnicity, political beliefs, and sexual orientation.
Meta, formerly known as Facebook, has faced sustained scrutiny over its handling of objectionable content and the protection of children on its platforms. High-profile cases have underscored the need for stricter content moderation policies, especially where minors are concerned. Regulators and advocacy groups have raised concerns over instances in which harmful or inappropriate material slipped through Meta's moderation systems, potentially exposing children to it. These incidents have fueled public debate about the platform's responsibility to provide a safe online environment for its users, particularly vulnerable groups such as children.