The DSA’s biggest W
There’s been much talk in Europe in recent weeks of our declining competitiveness and of a stifling bureaucratic superstate trying to regulate every new innovation left, right and centre, while the US and China embrace risk and uncertainty and create the future. Amid such conversations it’s hard not to think about the DSA and jump on the bandwagon of Brussels-bashing.
Contrarian that I am, I thought we’d do a different take this week and talk about what in my humble opinion is the single greatest gift that the DSA bestows upon European citizens. It is not the mechanism for reporting illegal content mandated by Article 16, or the right to appeal on VLOPs enshrined in Article 20, or the obligation on platforms to conduct risk assessments.
It is the first paragraph of Article 17: Statement of reasons, which in DSA parlance refers to the notifications that uploaders receive whenever their content is moderated or their accounts are restricted. Article 17 imposes several obligations on what a statement of reasons must contain, redefining standard pre-DSA industry practice. We have already written about the pre- and post-DSA moderation world, but a case in point here is telling users whether their content was proactively detected or auto-moderated.
One can of course rightly wonder how much users actually care whether their content was actioned by a human or an algorithm. Most users whose content is removed care about the outcome and possible remediation, i.e. being able to appeal successfully.
The true win from Article 17 lies in the following lines:
[Platforms] shall provide a clear and specific statement of reasons to any [user] for any of the following restrictions [...]:
any restrictions of the visibility of specific items of information provided by the recipient of the service, including removal of content, disabling access to content, or demoting content;
suspension, termination or other restriction of monetary payments;
suspension or termination of the provision of the service in whole or in part;
suspension or termination of the recipient of the service's account.
Pay particular attention to the following words: any restriction, demoting, in part. Why is this important? Because in the pre-DSA world, platforms did not notify users of all restrictions: they often stayed silent when content was demoted or when the visibility of an account was restricted.
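To make the scope of the obligation concrete, here is a minimal illustrative sketch in Python. All names are hypothetical (this is not any platform's actual system or the DSA Transparency Database schema); it simply models the point that every Article 17 restriction category, including demotion and partial suspension, triggers a notification.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Restriction(Enum):
    """Restriction categories paraphrased from Article 17(1) DSA."""
    CONTENT_REMOVAL = auto()    # removal or disabling access to content
    CONTENT_DEMOTION = auto()   # demoting content (covert moderation pre-DSA)
    MONETARY = auto()           # restriction of monetary payments
    SERVICE_PARTIAL = auto()    # suspension of the service in part
    SERVICE_FULL = auto()       # suspension of the service in whole
    ACCOUNT = auto()            # suspension or termination of the account

@dataclass
class ModerationAction:
    user_id: str
    restriction: Restriction
    automated_detection: bool  # was the content flagged by automated means?
    automated_decision: bool   # was the decision itself taken automatically?
    policy_reference: str      # the T&C clause relied upon (cf. Article 14)

def requires_statement_of_reasons(action: ModerationAction) -> bool:
    # Under Article 17(1), *every* category above triggers a statement of
    # reasons -- there is no carve-out for demotion or partial restriction,
    # which platforms often applied silently before the DSA.
    return isinstance(action.restriction, Restriction)

# Example: a demotion decision, the kind pre-DSA platforms rarely disclosed.
demotion = ModerationAction(
    user_id="u123",
    restriction=Restriction.CONTENT_DEMOTION,
    automated_detection=True,
    automated_decision=True,
    policy_reference="Community Guidelines section 4.2 (hypothetical)",
)
print(requires_statement_of_reasons(demotion))  # True
```

The design point the sketch makes is that the notification trigger depends only on whether *any* restriction was applied, not on how severe it was or how it was detected.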
This practice, popularly referred to by the nebulous and ill-defined term ‘shadow-banning’ but which we at Pelidum prefer to term ‘covert moderation’, brought with it several undesirable consequences:
It gave (and arguably continues to give) rise to deep mistrust of platforms and fuels conspiratorial narratives that platforms are pushing or censoring partisan viewpoints
It gave platforms an easy option to remove or bury content that they were not prepared to publicly declare violative but did not want to see propagate on their sites
It left users confused, angry and helpless when their content was moderated in this manner; essentially, platforms could gaslight users and simply pretend that nothing was happening. Whether or not you believe the content deserved to be restricted, it’s hard to find a robust argument for restricting it covertly.
Article 14 adds further weight to this obligation when it states that:
Providers of intermediary services shall include information on any restrictions that they impose [...] in their terms and conditions
In other words, platforms must be transparent about their policies and should not impose restrictions based on terms and conditions that they have not previously made public to their users.
In combination, Articles 17 and 14 constitute a considerable win for European users, affording them greater rights and protections than users enjoy anywhere else in the world. They de facto outlaw shadow-banning, mitigating the risk of platforms employing covert moderation practices. It is a win for European democracy and corporate accountability, and an aspect of the DSA that I wholeheartedly endorse.
Are there parts of the DSA that I do not endorse, I hear you ask? That’s a topic for another day: follow our page to keep abreast of our updates.
And as ever, get in touch if your team needs a hand with Trust & Safety.
External links:
https://newsroom.tiktok.com/en-eu/fulfilling-commitments-dsa-update
https://blog.google/around-the-globe/google-europe/complying-with-the-digital-services-act/
Liam Melia is the Managing Director of Pelidum Trust & Safety
© Pelidum Trust & Safety 2024