European Commission investigates impact of YouTube, TikTok and Snapchat on mental health

The European Commission has launched an investigation into social media networks’ practices in response to concerns that vulnerable people might be consuming fake news and content that encourages self-harm.

TikTok, YouTube, and Snapchat will have to answer European Commission inquiries about how their video recommendation algorithms work, the EU executive said, Euronews reported.

Under the EU’s Digital Services Act (DSA), the Commission is investigating why vulnerable users are being served fake news about elections or content that glorifies eating disorders, depression, and drug abuse, the report said.

The European Commission is interested in the impact of features such as autoplay and infinite scrolling, which may encourage excessive video viewing, as well as the safeguards the platforms have in place to prevent the spread of dangerous information.

Under the DSA, platforms with more than 45 million monthly active users must meet strict transparency requirements. The request also covers the measures companies take to protect minors and reduce the spread of harmful content.

The Commission is asking the tech giants how their algorithms work, which parameters shape their recommendation feeds, and whether users can restrict specific types of content, with particular attention to the safeguards against illegal or dangerous material.

If companies fail to provide the requested information or supply false data to the European Commission, they face fines under EU law.

However, this request is only an initial step in the investigation process, and the European Commission will decide on possible further actions only after analyzing the responses.

According to one EU official, the investigation should serve as a “wake-up call” for social media platforms, prompting them to reconsider their designs, for example by allowing users to hide certain types of videos or to limit their viewing time.

This request does not apply to Meta, the parent company of Facebook and Instagram.

However, Meta’s Facebook and Instagram are subject to two investigations under the Digital Services Act to check whether their measures to protect minors online are compliant with the rules. In May, the European Commission began an investigation into whether those platforms’ interfaces exploit children’s inexperience to incite addictive behavior. 

TikTok, YouTube, and Snapchat must provide the requested information by November 15, after which the European Commission will decide whether there are grounds for further formal procedures against the platforms.

Earlier this spring, the President of the European Commission hinted at the possibility of banning TikTok in the EU. 

In early July, the European Commission accused Meta of violating the Digital Markets Act with its “pay or consent” advertising model, a charge that could result in a multi-billion-dollar fine.
