The European Union yesterday reached an agreement on a new Digital Services Act, which will require online platforms to moderate content and make their algorithms more transparent, under threat of hefty fines.
Here is what has been agreed and what the tech giants will be required to do.
Reached in a final round of negotiations lasting more than 16 hours, the European agreement on this sweeping legislation, which complements the already approved rules on digital markets, comes nearly a year and a half after Brussels submitted its first proposal in December 2020, and imposes new obligations on internet service platforms used by hundreds of millions of people in the European Union.
From now on, thousands of companies will need a European representative in order to operate in the bloc's territory and will fall under this new legislative package, which is intended to set a new global standard against the spread of illegal content, disinformation, and the opacity of the algorithms that organize the content of social networks.
The tech giants, about 30 companies with more than 45 million monthly users in the European Union, will be under the direct supervision of the European Commission and will have to pay an annual fee of 0.05% of their global revenue to fund this monitoring, for which Brussels will hire new experts in the sector.
These tech giants will have to analyze their systemic risks annually and work to reduce them, particularly illegal content with negative impacts on fundamental rights, democratic processes, public safety, gender-based violence and minors, as well as content with serious consequences for users' physical or mental health.
The main tool for ensuring compliance by the digital giants is fines, which can reach up to 6% of the offending company's global turnover.
What are the main objectives of the Digital Services Act?
In December 2020, the European Commission proposed a new legislative framework to tackle challenges such as the sale of counterfeit products, the spread of hate speech, cyber threats, restrictions on competition and market dominance. The basic idea behind the proposal is that what is illegal offline should also be illegal online.
Digital companies will be required to moderate the content posted on their platforms with "sufficient resources" and to crack down on illegal content, something that until now has rested on a non-binding code of practice to which companies adhered voluntarily.
Users will have a clearer procedure for reporting illegal online content, and platforms will have to act quickly to remove it, as well as inform those who reported it of the actions they have taken.
The new law will also prohibit the use of data on race, religion, sexual orientation or other sensitive categories for ad targeting, as well as ads targeting minors and deceptive interface designs ("dark patterns") intended to trick users into allowing their data to be tracked.
In addition, major platforms such as Facebook or Twitter will have to give the Commission and member state authorities access to their algorithms, and digital services in general will have to be more transparent about how content is ranked and moderated, for example whether they use filters or automated content moderation.
Since before the start of the war in Ukraine, and during the Covid-19 pandemic, Brussels has warned of the danger of online disinformation and the manipulation of reality, a phenomenon it now intends to combat with the new Digital Services Act.
Now that an agreement has been reached between the negotiators of the Council and the Parliament, both institutions will have to review the final text and formally approve it. The law will apply 15 months after its publication in the Official Journal of the European Union, or from 1 January 2024, whichever is later.