EU asks big tech companies to help with security for upcoming elections

The European Union has asked major tech companies to help secure the upcoming elections amid concerns about disinformation and hacker attacks, Politico reports.

Under the new law on content moderation, the Digital Services Act (DSA), platforms like Facebook, X, YouTube, and TikTok will have to step up their efforts to combat disinformation and fake news before the European Parliament elections.

Companies could face a fine of up to 6% of their annual global revenue if they fail to comply with the law.

“We know that this electoral period, which opens in the European Union, will be subject to either hybrid attacks or foreign interference of all kinds,” said Internal Market Commissioner Thierry Breton.

Platforms such as Facebook, YouTube, and TikTok are required to ensure that political advertisements and AI-generated deepfakes are properly labeled. They must also establish dedicated, adequately resourced teams to monitor emerging narratives and potential threats across the 27 EU member states.

The Commission has announced new rules to reduce dangers to elections, such as hoaxes going viral and coordinated Russian bot campaigns or fake media outlets. It enforces the law against roughly two dozen very large online platforms and search engines.

On June 6–9, hundreds of millions of Europeans across the European Union will vote to elect new representatives to the European Parliament.

The European Commission advised social media platforms to prepare contingency plans in case a deepfake of a well-known European figure spreads widely, and to implement mechanisms such as pop-up alerts that discourage users from sharing posts containing debunked false material.

Platforms must also ensure that their algorithms promote a diverse range of content. Research indicates that certain platforms, such as YouTube, disproportionately promoted far-right, anti-immigration videos during Finland’s presidential election earlier this year.

Large social media platforms such as Facebook have come under fire for insufficient content moderation in lesser-spoken European languages such as Slovak. X has cut the size of its content moderation teams and is currently the subject of a formal investigation.

Under the DSA, major digital corporations must comply with an array of obligations to combat unlawful and harmful content, including thorough assessments and mitigation of significant societal risks, such as threats to elections.

The guidelines are the Commission’s recommendations for adhering to the DSA rulebook. Although businesses can apply them any way they see fit, those that choose not to follow the EU’s recommendations “must prove to the Commission that the measures undertaken are equally effective,” an EU official said in a press statement.

Additionally, the Commission announced that it will conduct “wargaming scenarios,” or “stress tests,” with several key platforms by the end of April. Companies including Meta, TikTok, and X have already asked the Commission’s enforcement team to supervise voluntary exercises of this kind to ensure that their operations comply with the DSA.
