The DSA’s biggest L
My favourite definition of stupidity has nothing to do with IQ tests, emotional intelligence, memory, abstract reasoning or other such measures; it is as follows:
‘A stupid person is somebody who harms other people without benefiting themselves.’
I like this definition because it implies the following:
Stupidity is not an inherent quality; all of us may act or behave stupidly, no matter how knowledgeable or rational we may consider ourselves to be.
Applying this mantra to regulation, I would word it as follows:
‘A bad regulation is one that increases costs to businesses without producing a proportionate benefit to users, customers or to wider society.’
The DSA is a vast piece of legislation and, if truth be told, we will not know for some time whether some of its obligations will be a net benefit to society. A case in point is Risk Assessments and Audits. The rationale for these obligations is clear and the burden on platforms is significant, but it will be years before we know whether they truly translate into safer and better online platforms for users. To complicate matters, even assuming platforms do become safer and better, it will always be difficult to attribute cause and effect, and we will have to fall back on post hoc ergo propter hoc reasoning in order to celebrate these measures.
I will therefore not speculate on these novelties and instead turn my attention to something far more banal: Article 20 and internal complaints mechanisms. In layman’s terms, this refers to the ability for uploaders and reporters to appeal content moderation decisions.
We’ve written before that these mechanisms existed to a large extent prior to the DSA. However, I am not aware of any platform that offered internal appeals for six months after a content moderation decision was made. My experience is that three months was the industry standard, somewhere between 80 and 90 days. The DSA effectively doubled the window of appeal available to users, bringing with it important consequences for data engineering and UX teams, among others.
In my opinion there is no compelling reason for users to be able to appeal moderation decisions for six months after a decision has been made. The vast majority of uploaders will appeal moderation actions within 24 to 48 hours; after that, the submission curve drops precipitously. I have not seen hard data on the number of uploaders appealing a deletion 179 days after it was made, but I would bet a significant amount of cash that it is well below 1%, even below 0.1%, and likely lower still in aggregate.
Extending these appeal deadlines involved significant uplift for teams, redefining industry practices for a region that accounts for less than 10% of the world’s population. The immediate benefit to European users is not evident to me, but the burden on platforms is palpable.
A 180-day window to submit appeals is a clear example of over-regulation: prescribing norms where industry standards were already sufficient. This is not necessarily representative of the DSA in its entirety, but if the EU is keen to strike a better balance between fostering innovation, encouraging economic risk-taking and reducing red tape, it must be more attentive to details like these, for they can needlessly tip the balance from creating a safer environment for users to creating an adversarial environment for innovation.
If I could wish for one thing in platform regulation, it is not for more regulation or for less of it. Businesses understand as much as anyone that new launches go through several iterations and that plans rarely survive first contact with reality. Regulators should build review and iteration cycles into their launches and be able to quickly pivot, scrap or add to regulations where particular obligations prove too burdensome or ineffective, or where the dreaded law of unintended consequences has come into play.
The EU should not strive towards perfect regulation; it should embrace iterative regulation.
Liam Melia is the Managing Director of Pelidum Trust & Safety
© Pelidum Trust & Safety 2024