Brussels has opened an in-depth probe into Meta over concerns it is failing to do enough to protect children from becoming addicted to social media platforms such as Instagram, the Financial Times reported.
The European Commission, the EU’s executive arm, announced on Thursday it would look into whether the Silicon Valley giant’s apps were reinforcing “rabbit hole” effects, where users are drawn ever deeper into online feeds and topics.
EU investigators will also look into whether Meta, which owns Facebook and Instagram, is complying with legal obligations to provide appropriate age-verification tools to prevent children from accessing inappropriate content.
The probe is the second into the company under the EU’s Digital Services Act. The landmark legislation is designed to police content online, with sweeping new rules on the protection of minors.
The European Commission wrote: Today, the Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.
The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects.’ In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.
The current proceedings address the following areas:
- Meta’s compliance with DSA obligations on the assessment and mitigation of risks caused by the design of Facebook’s and Instagram’s online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour and/or reinforce the so-called ‘rabbit-hole effect’. Such an assessment is required to counter potential risks to children’s physical and mental well-being and to the respect of their rights.
- Meta’s compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, appropriate, proportionate and effective.
- Meta’s compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.
The Guardian reported that a Meta spokesperson said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing the details of our work with the European Commission.”
If the Commission is not satisfied with Meta’s response, it can impose a fine of up to 6% of the company’s global turnover. More immediately, it can carry out on-site investigations and interview company executives; no deadline has been publicly fixed for completing the investigation.
In my opinion, parents of young children who want to use Instagram should sit down with them and act as a filter for content that is inappropriate for their kids. Clearly, Meta isn’t trying hard enough to keep children safe on its platforms.
