The Digital Services Act in Action: Evaluating Platform Transparency and User Safety

*This article was written with the assistance of an LLM.*

Read the NEW Comparative Study in this additional article.

Back in 2021, I contributed to a research project and a newly formed coalition at the World Economic Forum. The WEF whitepaper I contributed to was titled “Advancing Digital Safety: A Framework to Align Global Action.” After the paper's release, a new coalition, dubbed the Global Coalition on Digital Safety, launched to support this mission. The whitepaper arrived at a moment when deliberations over the EU’s Digital Services Act were in full swing. Informally, the whitepaper aimed to sketch what a standardized format or set of metrics for “digital safety” might look like from a platform-agnostic point of view. As digital service hosts released new content moderation and other “online safety” transparency reports during the Covid years, a user or a democracy stakeholder might reasonably ask which reporting standards and figures would help illuminate how tech and social platforms are shaping not just online discourse but the “social” internet in general.

As a multi-stakeholder initiative, the coalition unites representatives from the public sector, private sector, and academia to develop global principles for digital safety, toolkits for safety design innovations, a digital safety risk assessment framework, and best practices in media and information literacy. The coalition has published several other reports, such as a typology (PDF) of digital harms and online threats and a digital safety risk assessment (PDF), bringing together a plethora of informed opinions and expertise from the industry.

In “Advancing Digital Safety: A Framework to Align Global Action,” the WEF team and contributors underscored the critical need for a user-centric, rights-based framework to ensure online safety, identifying a glaring absence of safety baselines and informed participation in both the transparency reports platforms were publishing and the online safety frameworks and regulations then advancing. During the research and writing phase of the paper, digital and social platforms were governed by a patchwork of national, regional, and international laws, and the moral and practical imperative was for private industry to step up and create supportive environments for user safety, civic integrity, and protection against fraud and scams.

The DSA is a monumental legislative achievement within the European Union to regulate digital platforms, ensuring a safer and more accountable online environment and, most importantly, bringing content and account moderation actions under one umbrella, one database: the new DSA Transparency Database. Since it came into effect in August 2023, the DSA has mandated a new standard of transparency, particularly around content moderation, requests for access to information, and the handling of user complaints and notices on the platforms. The DSA’s transparency reporting and disclosure requirements are poised to fundamentally alter the landscape by ensuring that platforms are not only more vigilant but also more forthcoming about their practices. Since September 2023, all designated Very Large Online Platforms (VLOPs) have submitted transparency reports covering actions taken within the Union’s borders.

The initial reporting under the Digital Services Act (DSA) is a stark first test of enhanced comparability and clarity in content moderation across the digital space (examples shown below), an argument advanced in the June 2021 WEF whitepaper and one the research community is still pushing forward. The EU may yet implement standardized metrics for all major online platforms, enabling direct comparison and trend analysis, especially as researchers turn to the transparency database to download data and conduct research. Harmonizing definitions and streamlining reporting, such as specifying the number of content moderators per EU language, would make the data more uniform and accessible.
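To make the comparability point concrete, below is a minimal sketch of the kind of cross-platform analysis the Transparency Database invites, assuming its “statements of reasons” can be exported as a daily CSV dump. The file name and column names (platform_name, automated_detection) are illustrative assumptions, not the database’s actual schema.

```python
# A minimal sketch, assuming a daily CSV dump of "statements of reasons".
# The file name and column names are illustrative, not the actual schema.
import pandas as pd

sor = pd.read_csv("sor-daily-dump.csv")

# Compare moderation volume and automation rates across platforms.
summary = (
    sor.groupby("platform_name")
       .agg(
           decisions=("platform_name", "size"),
           automated_share=("automated_detection", "mean"),  # assumed 0/1 flag
       )
       .sort_values("decisions", ascending=False)
)
print(summary)
```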

Tremau also maintains a database of the DSA-mandated transparency reports by the designated Very Large Online Platforms (VLOPs).

Insights from the first DSA transparency reports
One key aspect of these new DSA transparency reports, and of the legislation in general, is the emphasis on “user empowerment” and protecting access to these digital services. In this exploratory and comparative study, I will examine the transparency reports of only six platforms. My interest lies in the platforms that facilitate “engaged” speech, possibly what we have come to describe as the “digital commons.” These platforms capture both the malaise and discord in society, in text, images, videos, and creator content, and they also have the ability, and the social features/affordances, to create, nurture, and amplify organized groups, political discourse, and influence campaigns.

I am also including Wikipedia. Wikipedia is the only “encyclopedia” platform on the list, and it has played a robust, long-standing role in informing the public about political discourse, events, and especially upcoming and past elections. It also uniquely operates a hybrid platform-community consensus system for content management (e.g. content removal), complaints handling, and stymying influence campaigns that could transform otherwise benign, informational Wikipedia articles into political or ideological discourse or influence operations.

So, these platforms will be:
– Instagram
– Facebook
– X
– TikTok
– YouTube
– Wikipedia

So far, in examining every single DSA report, the emphasis consistently falls on a handful of specific articles that govern transparency reporting, though platforms often address them in different orders. These articles (from the DSA) are:

Authority orders:
– Article 9
– Article 10

These two articles refer to the number of orders the platform services receive from each EU member state (via its designated Digital Services Coordinator) either to act against a piece of content (e.g. removal or geoblocking; Article 9) or to provide information about the “recipient of the service,” i.e. the user (Article 10), in connection with a legal proceeding (e.g. IP infringement, privacy, defamation, etc.).

Content Moderation
– Article 15
– Article 23

Article 15 covers the extent (volume) and categories of removed content, as well as the mechanism by which the content was disabled, whether through an automated process or human/manual review. Article 23 mandates transparency on content moderation and user suspension actions taken against users who post “manifestly” illegal content (per EU and member-state law).
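As a toy illustration of an Article 15-style breakdown, the snippet below tallies removals by category and detection method. The categories and numbers are invented placeholders, not drawn from any platform’s actual report.

```python
# Hypothetical Article 15-style figures: removals by content category and
# detection method. All numbers are invented placeholders.
removals = {
    ("hate speech", "automated"): 120_000,
    ("hate speech", "manual"): 30_000,
    ("spam", "automated"): 900_000,
    ("spam", "manual"): 45_000,
}

total = sum(removals.values())
for (category, method), count in sorted(removals.items()):
    print(f"{category:>12} / {method:<9} {count:>9,} ({count / total:.1%})")
```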

“Notices” and complaint-handling systems
– Articles 16 & 17

Article 16 is the “notice” part: platforms must provide mechanisms through which users and other parties can notify them of the presence of illegal content. Under Article 17, platforms must inform users of actions taken with regard to the content they posted, their account, or their access to the service, and give the reason for the action, whether it rests on a contractual ground (i.e. their terms of service) or on the law. The article also establishes the right to an “appeal” mechanism. This “appeal” or complaint right has existed at several large companies almost since their inception and has been central to the “Santa Clara” principles, established in 2018.
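As a rough sketch, a statement of reasons sent to a user under Article 17 might carry fields along these lines. The structure and field names are my own illustration, not the DSA’s prescribed format or any platform’s actual payload.

```python
# A rough sketch of an Article 17 "statement of reasons". Field names and
# structure are illustrative, not any platform's actual format.
statement_of_reasons = {
    "decision": "content_removal",         # or a visibility restriction, suspension...
    "content_reference": "post/123456789",
    "ground": "terms_of_service",          # contractual ground vs. legal ground
    "ground_reference": "Community Guidelines, spam section",
    "facts_and_circumstances": "Repeated unsolicited commercial posts detected.",
    "automated_detection": True,           # whether automated means were used
    "redress": ["internal complaint system", "out-of-court dispute settlement"],
}
```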

In the above DSA reports, platforms posted the number of complaints they received, along with the number of moderation decisions that were upheld versus overturned. It is important to note that automated systems are not always accurate and cannot reliably capture linguistic, historical, or political nuance. The proportion of overturned moderation decisions relative to upheld decisions, or to the total number of complaints, may in fact indicate the accuracy of a platform’s moderation decisions, be they automated or manual.
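For illustration, such an overturn rate could be computed from a report’s complaint-handling figures as below; the numbers are invented placeholders, not taken from any report.

```python
# An "overturn rate" as a crude accuracy proxy for moderation decisions.
# All figures are invented placeholders.
complaints_received = 50_000
decisions_upheld = 38_000
decisions_overturned = 12_000

overturn_rate = decisions_overturned / (decisions_upheld + decisions_overturned)
print(f"Complaints received: {complaints_received:,}")
print(f"Overturn rate: {overturn_rate:.1%}")  # higher suggests shakier initial decisions
```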

Read the Comparative Study in this additional article.
