Two new pieces of legislation will alter how major search engines and social networks operate in Europe, raising scrutiny of how they control information and interact with consumers.
When the two new acts published in the Official Journal of the European Union, the Digital Services Act and the Digital Markets Act, enter into force, online service providers, from hosting providers to search engines such as Google and social media platforms such as Meta and Twitter, will need to change how they operate in the European market.
The first, the Digital Services Act, will govern how providers control the information distributed through their networks. The second, the Digital Markets Act, will govern their market behavior and their relationships with competitors, consumers, and the businesses that operate through their platforms.
The regulations had been in development for several years, and the final versions were published on October 27, 2022. Implementation begins gradually this year and will phase in over 12 months, giving companies time to adjust to the new requirements.
The online services covered by these acts significantly shape our way of life. As more of daily life is digitalized, the ways we interact, work, shop, study, inform and entertain ourselves, and, most importantly, enter the public sphere to influence political and social processes have changed profoundly.
The legislation will primarily apply to enterprises that provide online services, such as search engines and social media platforms. Still, the rules will be stricter for the largest search engines and platforms. Under the acts, a search engine or platform is considered “very large” if it has more than 45 million monthly active users, roughly 10% of the EU’s 450 million residents.
The advantages of internet services are undeniable, given the opportunities they provide. Still, they can also harm economies and communities when their algorithms are maliciously manipulated for disinformation or illegal trading.
One of the new measures’ principal purposes is to improve online security and to curb the dissemination of so-called “illegal content,” such as harmful disinformation and hate speech.
All providers of digital services, regardless of category, will be required to meet minimum requirements such as being transparent in information dissemination, respecting their users’ fundamental rights, cooperating with relevant national institutions, and providing a point of contact and an official legal representative.
The European Commission will have direct oversight of the major internet platforms and search engines used by more than 10% of the EU’s total population. The major online platforms and search engines will be exposed to the most rigorous public scrutiny to ensure that the regulations are followed.
These platforms pose the greatest danger of spreading disinformation, hate speech, and other harmful or unlawful content, especially in light of Russian disinformation campaigns.
Platforms will not be required to examine content before publishing it, nor will they be held responsible for what their users share. Their primary obligation will be to respond swiftly when illegal content is reported.
The content regulations will apply to all platforms, including those that already perform pre-publication checks. Several platforms already moderate content actively and can refuse to publish certain material or inform users that it is not permitted and violates their rules.
The biggest online platforms and search engines will be required to produce content moderation reports and grant the European Commission access to any data needed to monitor compliance with the Digital Services Act.
Failure to comply could result in fines of up to 6% of a company’s global revenue. Corporations such as Meta and Google have reported annual revenues ranging from roughly 120 to 260 billion US dollars, so a 6% fine could amount to anywhere from about $7 billion to $16 billion.