Anchoring the EU’s Digital Services Act (DSA) and the Digital Markets Act (DMA) in human rights



Anchoring the EU’s Digital Services Act (DSA) and the Digital Markets Act (DMA) in human rights is vital to making our virtual public squares safer and more accessible to all:

  • Enhancing platform transparency is a pre-condition for improving accountability for individuals and the public at large.
    • The steps suggested for the DSA towards increased transparency requirements for platforms (such as mandatory transparency reports, clear rules on access to data for researchers, and explanations of recommender systems) would all constitute significant improvements. 
    • These obligations must also expand access to data for a larger group of researchers and civil society watchdogs.
  • Refrain from incentivizing over-policing of expression.
    • Avoid imposing extensive liability on platforms for content shared by their users or requiring deletion of content within short deadlines; platforms are likely to err on the side of caution, “over-censoring” and deleting large amounts of lawful expression.
    • Obligations for general monitoring, often automated, would likely lead to overbroad restrictions while not necessarily catching the most problematic content.
  • Encryption is critical for guaranteeing minimum levels of safety and freedom online.
    • Requiring communications services to scan encrypted communications would threaten privacy, which is particularly critical for the work of human rights defenders and journalists, and significantly reduce the security of communication systems.
    • Rather than seeking to undermine safe communications, the EU should actively promote access to and use of end-to-end encryption.
  • Adequate checks and balances must accompany enforcement powers vested with public authorities to prevent abuse.
    • Refrain from empowering public authorities to demand takedowns or other restrictions of expression without prior judicial approval. Proposals to empower authorities to notify platforms of “specific content” to be prioritized are highly problematic.
    • Private companies should not be mandated to make decisions on the legality of speech.
    • Oversight must be independent, impartial and free of any undue political or economic influence. 
  • Protecting online privacy is vital for rights-based regulation of online services.
    • Pervasive tracking of individuals across the web by some companies undermines the privacy and rights of all. 
    • Strong privacy protections, accompanied by robust remedy and transparency requirements, are critical to limiting platforms’ ability to track people’s behavior, draw inferences about them, and base decisions on those inferences.
    • Companies should be required to make profiling-based content recommendations contingent on users actively opting in, rather than enabled by default.
  • A pluralistic communications environment is vital for vibrant democracies and civic space.
    • Avoid costly or complex compliance measures that only large, well-funded companies can afford; such measures would entrench the dominant position of large multinational platforms.
  • Align due diligence requirements with the UN Guiding Principles on Business and Human Rights.
    • Due diligence should focus on the broad range of human rights impacts of companies’ products and services, and how those impacts can be prevented and mitigated, rather than on companies’ compliance with government-mandated takedowns.

We stand ready to work with the European Union to ensure that these fundamental human rights considerations inform the development and implementation of the EU’s digital framework.