Tech
Meta pushes back as EU probes Facebook, Instagram over child safety concerns
Meta (formerly known as Facebook) has responded to a European Union (EU) investigation into child safety concerns and scrutiny of its algorithmic systems.
The EU has launched formal investigations into Facebook and Instagram, claiming that the platforms' poor online child safety practices contravene the bloc's strict digital standards for social media companies.
Under the terms of the EU's Digital Services Act, the move marks the start of a new phase of investigation for Facebook and Instagram's parent company, Meta Platforms.
Reacting to the probe, Meta said one of its main goals is to ensure that young people are safe when using its services. The company said it has spent a decade developing more than 50 online safety tools and policies to protect children.
The tech company also stated that it is prepared to discuss its plans with the European Commission, emphasizing that child protection is a challenge the whole industry faces.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson told CNBC by email.
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”