EP plenary session 10/02/2021 – Democratic Scrutiny of Social Media and the Protection of Fundamental Rights
Most speakers focused on the need to provide legal certainty when removing content, and to ensure that such decisions lie with democratically accountable authorities.
In a debate with Secretary of State for European Affairs Ana Paula Zacarias from the Portuguese Presidency of the Council, and Commission Vice-President Věra Jourová, almost all speakers criticized the vast power of social media platforms and their worrying impact on politics and freedom of speech.
Citing various decisions taken by the platforms to censor content or accounts, a large majority in the chamber highlighted the lack of clear rules governing such decisions and the lack of transparency of big tech practices.
They urged the Commission to address the issue in the Digital Services Act and the Digital Markets Act, and as part of the Democracy Action Plan, stressing that decisions to remove content must rest with democratically accountable authorities rather than private companies, in order to safeguard freedom of speech.
Other topics raised included:
the need to defend democracy and EU values by tackling disinformation and countering efforts to subvert them or to incite violence;
ensuring that technology is used to enhance rather than limit political discourse, while addressing the proliferation of hate speech and discrimination online;
algorithm transparency, the use of personal data, and the restriction (or outright ban) of micro-targeting and profiling practices, so as to fundamentally alter the business models of tech giants;
the problems caused by the emergence of tech monopolies and their impact on media pluralism, as well as on pluralism in public discourse;
the false dichotomy between the online and offline spheres and the need for rules that cover all aspects of life; and
the systemic risks, as well as the societal and economic harm, that major platforms can cause or exacerbate.
In October 2020, in its recommendations on the Digital Services Act, Parliament stressed that the responsibility for enforcing the law must rest with public authorities, and that decisions should ultimately lie with an independent judiciary and not with a private commercial entity.
The 2019 European elections were protected from disinformation through an EU action plan and the European Commission's code of practice for platforms. However, in the context of the Democracy Action Plan, the Commission has confirmed that self-regulatory measures need to be replaced with a mix of mandatory and co-regulatory measures in order to adequately protect users' fundamental rights and regulate content moderation.
Parliament has also spoken out against the deterioration of fundamental rights, the worrying state of media freedom in the EU, and online disinformation campaigns by foreign and domestic actors.