In this episode, Sarah Diedro is in conversation with Jan Penfrat, Senior Policy Advisor at European Digital Rights (EDRi), who explains how the DSA helps to shift the online ecosystem away from ad-driven, centralised mega-platforms and towards a more decentralised version of the internet.
From pervasive hate speech to large-scale surveillance advertising, the internet is no longer a safe democratic space. Green politicians and NGOs are working hard to create regulation of the internet that is fit for the 21st century.
Today, much of our political debate takes place on privately regulated digital platforms, which play a key role in our democracies. Their policies dictate whose voices get amplified – or excluded – online.
Several scandals have already revealed how this impacts our societies. A Facebook leak showed that ‘high-profile users’ such as celebrities are ‘whitelisted’ and given special treatment. Facebook also turned a blind eye to the genocide in Myanmar by amplifying hate speech and failing to take down inflammatory posts. Notably, the Cambridge Analytica scandal exposed the extent to which personal data was used to manipulate public discourse during elections.
The EU is taking the lead on setting global rules for a more democratic internet. The proposed Digital Services Act (DSA) would act as a fundamental law for digital platforms in Europe. It is not the first piece of EU legislation on digital technologies, but the most recent EU law in this area – the Electronic Commerce Directive of 2000, which has regulated the internet so far – is more than 20 years old. The DSA is the most ambitious legislation of its kind in the world, and countries such as the United States and India are watching it with great interest.
However, the legislative process is not yet finalised. The European Parliament and the Council of the European Union have started negotiations to agree on a final version. To learn more about the state of the negotiations, you can follow Alexandra Geese, the shadow rapporteur for this legislative file in the Committee on the Internal Market and Consumer Protection of the European Parliament.
Clear rules for democracy in the digital age
The Digital Services Act sets an important precedent for dealing with illegal content online. Previously, a digital platform company could moderate its content arbitrarily: it decided which legal – and even illegal – content would be taken down or stay up on the site, without any accountability for its decisions. This has especially affected women and the LGBTQ community. The DSA also forces platforms to give access to their data, for example for research purposes.
Once it is adopted, digital platforms will have to abide by clear rules. If an authority issues an order against a piece of illegal content, the platform must comply. Public authorities can also issue orders for information, for example concerning a hate speech incident.
This process can help to prevent the damage done by online defamation, disinformation, or hate speech, which are often amplified because big social media platforms’ business models are based on maximising engagement. Alexandra Geese explains why the current internet needs regulation:
“The times where everybody could speak freely on the Internet are long over. Because today, the freedom of expression on the internet is basically the freedom of white men (…) A lot of women are pushed out of the public debate in the internet because of the enormous amounts of hate they receive.”
The Digital Services Act will hold platforms accountable and shape a fairer and more diverse internet. It is also set up to fix some of the pitfalls of the General Data Protection Regulation (GDPR) – the most crucial being that cookie banners have failed to protect users, because loopholes discourage users from setting their preferences. The DSA aims to give power back to users so that they can determine how they want to use online tools and what kind of content they want to be exposed to, without overburdening individuals with too many choices. The current proposal establishes that rejecting all cookies must take no more time than accepting them.
How surveillance advertising is dividing society
Surveillance advertising, also known as targeted advertising or behavioural advertising, refers to the practice of showing users advertisements based on their ‘profile’ – all the information that a company has gathered about them and sells to advertisers. This includes aggregated data about a person’s location, job and interests, but also more wide-ranging information, such as what the company assumes about them based on their preferences and characteristics. Basically, every detail about who you are can now be monetised by companies and used to influence you online.
This pervasive surveillance practice is based on a premise sold to us by big tech: that more data means better advertising. It has led to what some have called the ‘big data gold rush’ and has sustained marketing careers built on it for the last 20 years. But the ones benefitting from this system are the big data brokers with access to personal data – Meta and Google, for example.
It is nevertheless predicated on unethical practices that not only harm citizens, but democracy itself. Jan Penfrat elaborates on why this kind of targeted messaging shouldn’t exist:
“It’s much more than just commercial advertising. It’s not only about whether a shoe company can show you a specific shoe ad, or a car company can promote their latest model. It’s about anybody who has money being able to push any type of political or issue-based message to a pre-selected targeted audience. Using the vulnerabilities or specific sensitivities of that audience against them, basically. And that leads to more division in society. It leads to manipulation at a very large scale.”
Using this information, algorithms on digital platforms also contribute to an ever more polarised public sphere and, at the same time, push users towards radicalisation. This is because they are geared towards inciting reactions from users, usually by pushing content which generates fear and anger. This can lead users to follow incendiary or controversial content and end up in a ‘rabbit hole’.
To counter this trend, the Digital Services Act imposes a ban on targeting minors for advertising purposes, and a ban on targeting based on sensitive data as defined by the General Data Protection Regulation (GDPR). Alexandra Geese explains: “we will still see ads, but without the platforms having all these profiles and all this knowledge about us, which is extremely dangerous.”
Jan Penfrat talks about the potential outcomes of such legislation, hoping that it will mean a more ‘human-scale’ internet, with community-led initiatives: “We can have a whole different online environment that allows for people to move to and use tools in their interest, rather than in the interest of corporations. This is really something that we’re hoping for, for the future.”