What should the EU do to better protect democracy in the digital age?
Opportunities and risks
New technologies have created incredible opportunities, including for freedom of association and for claiming human rights. But they also expose us to new risks. Digitalization may distort our view of reality and reinforce negative emotions. Algorithms have been found to discriminate against minorities. And large tech companies set the rules for participation, which users must accept in order to access their platforms.
“Democracy means full and meaningful participation. Hate speech, voter suppression in elections, disinformation and harassment prevent people from participating and we must respond to them, while not undermining the essential role of the internet as a place for the free exchange of ideas,” said Iverna McGowan, Europe Director of the Center for Democracy and Technology.
EU legislation on the digital sphere should ensure the safety of users online and protect human rights, such as the right to privacy and freedom of expression. At the same time, it will create new responsibilities for large online platforms. Central to this discussion is how the new EU rules will help to better protect democracy in Europe, and how tech companies can be regulated further without restricting the right to information.
New rules for the digital sphere
European Commission Vice-President Věra Jourová explained how the EU is modernizing its toolbox to deal with these challenges: “The European Democracy Action Plan focuses on safeguarding the integrity of our elections, countering disinformation, and supporting free and independent media. Together with the Digital Services Act, which upgrades the liability and safety rules for digital platforms, services and products, it will create new rules for digital engagement, also for political purposes. We want to make sure that platforms are transparent about the way their algorithms work. We will make platforms more accountable for illegal content. We will propose rules to ensure greater transparency on how platforms moderate content, on political advertising and on algorithmic processes.”
Peggy Hicks, Director of the Thematic Engagement, Special Procedures and Right to Development Division at UN Human Rights, highlighted the importance of process: “We need to work towards minimum standards for the process of content moderation rather than focusing on the content of the speech. And we need to make those processes easier to understand, fairer, non-discriminatory and effectively challengeable. We need to make sure that the regulatory bodies that we put in place are independent as well.”
All eyes on Europe
Digital platforms have become an essential part of our lives, even more so during the COVID-19 pandemic. The Digital Services Act will have an impact beyond the EU, as countries around the world are legislating for the digital sphere: thirty-nine countries have already adopted new laws, and twenty-nine more are in the pipeline.
Therefore, it is essential that EU legislation is based on human rights standards and fosters a healthy digital ecosystem where the rights of users are prioritized. As Irene Khan, UN Special Rapporteur on freedom of opinion and expression, said: “The same rights that people have offline should also be protected online. You can see the dangers to democracy when human rights are disconnected from the digital space, because the digital space then becomes a weapon against democracy.”