The European Commission is expected to require Facebook, Google, and Twitter to change their algorithms — and demonstrate that they have done so — to stop the spread of lies online, according to three people briefed on the proposals, which are due to be published Wednesday.
Under the new rules, which will have to be negotiated with the world’s largest social media companies after their publication, companies must also disclose to Brussels how they respond to the spread of disinformation on their platforms, including what action they take to remove or downgrade content or accounts that promote falsehoods, and offer online users more transparency about how they are being targeted with digital ads.
The measures would be the furthest any country or region has gone in forcing tech companies to reveal the inner workings of the algorithms used to populate social media feeds. These machine learning tools have been criticized for promoting viral hateful or bogus content, including material related to the COVID-19 pandemic, over more mainstream sources. The companies deny wrongdoing.
The upcoming announcement, which the three people discussed on condition of anonymity because they were not authorized to speak publicly, is part of a revision of the Commission’s so-called code of practice on disinformation.
The voluntary pact was signed in 2018 between Brussels and the world’s largest social media players. The European Court of Auditors, a body that scrutinizes how EU funds are spent, is expected to publish an audit of the pact next week. The current deal does not hold the platforms accountable for their role in spreading disinformation.
The code of practice aims to provide more transparency on how companies tackle online untruths, first ahead of the 2019 European Parliament elections and now during the ongoing pandemic, by asking companies to post regular updates on how they counter misinformation.
These rules are now being rewritten ahead of the bloc’s Digital Services Act, a separate set of proposals that tackle harmful online content and the sale of illegal goods. Those proposals include fines of up to six percent of annual revenue for companies that fail to stop such material from being distributed or sold online.
As part of that framework, the largest social media companies will have to publicly assess vulnerabilities in their online systems, including their algorithms. Planned measures under the Digital Services Act include external reviews of how companies are trying to stop the spread of misinformation and an increased role for the Commission and national regulators in monitoring potentially bad behavior.
The revised code of practice, due to be released on Wednesday, will remain voluntary until the Digital Services Act becomes law, expected in two years’ time. However, it will include measures that will ultimately be used to comply with the Digital Services Act, including disclosures on how falsehoods spread online and how many accounts platforms have removed or downgraded.
If social media companies sign up to the code, according to two of the people, its standards will allow them to demonstrate that they are assessing and mitigating the risk of online falsehoods spreading on their platforms, thereby avoiding harsher penalties. If they fall short of these obligations, they could face fines of potentially tens of millions of euros once the Digital Services Act comes into force.
Under the proposals set to be released on Wednesday, social media companies will also face greater restrictions on how they allow advertisers to target people online through digital ads. This includes requirements to publish more data on how these paid messages target people online and to give advertisers a better understanding of which online content their ads appear alongside.
The Commission is expected to announce these far-reaching proposals on Wednesday, but has yet to discuss the details with the companies, many of which have resisted giving outside groups greater visibility into how their algorithms work or how falsehoods spread online.
The Commission and Twitter declined to comment. Google said it looked forward to discussing the new code with Brussels. A Facebook representative was not immediately available for comment.
UPDATED: This article has been updated to include information on the upcoming European Court of Auditors report.