The Persistent Dilemma of Content Moderation in Conflict Reporting

Updated Feb 19, 2024

*This article was written with the assistance of an LLM, used specifically to synthesize long-winded thoughts and to tighten specific perspectives and arguments during the outline phase.


Social media networks have long been flooded with violent content from conflict zones, which, in stages, prompted the evolution of content moderation and community guidelines on today's most popular platforms. That process originally aimed to safeguard users while respecting digital rights and free speech. Yet content moderation remains riddled with flaws.

Why is this important? Any time civil unrest or interstate violence produces a resurgence of violent videos and other content, the same conversations resurface about online platforms' duty to moderate it. The debate always centres on whether to hide content, reduce its virality, or delete it outright. Yet human history belongs to everyone. Decisions to hide or delete such content must weigh content-policy trade-offs against the value of this digital footprint for humanity as it charts its path forward: its importance for peacebuilding and for every stage of securing transitional and future justice.

Reader’s note: The Evolution of Content Moderation Rules Throughout The Years

The online dilemma of war reporting

When I started writing this article, its main focus was the Israeli bombing campaign in Gaza, Palestine, against the Palestinian militant group Hamas, and the Palestinian civilians who fell victim to avoidable collateral damage. In the early days of the bombardment of Gaza, the availability of violent content from both sides of the conflict drew a specific kind of backlash from regulatory authorities in the EU. The EU asked the four largest platforms, X, YouTube, TikTok and Meta, to take down what it considered "illegal" content, and content verging on disinformation, under its jurisdiction. The ask was moot, as it fell outside the Commission's mandate under the Digital Services Act. In a swift response, multiple digital rights organizations, both global and from the Middle East, issued an open letter in defence of free speech.

But this criss-cross of positions raises two questions. Who exactly should, can or must decide whether these videos and this content belong on online platforms that hundreds of millions of internet users rely on for instant, day-to-day communication and even for access to online information? And should there be "universal" principles that platforms, along with archival services and organizations whose mission is to preserve internet content, abide by when moderating violent content emerging from scenes of war or conflict?

Now cut to the digital age, where every story can live in a shroud of permanent maybe. Digital archivists strive to preserve the historical record of wars and armed conflicts, both to inform current conflict-mediation policy and to inform a general public whose support for or opposition to war is heavily influencing electoral politics in several democracies in 2024.

The “tradeoffs”

The "tradeoffs" are the seminal questions around preserving objectionable content, which the largest tech platforms often end up carrying because of their global reach, and around the rules that should govern its preservation, which are complex and many. They span policy tradeoffs, product-design tradeoffs and legal tradeoffs. The policy tradeoffs depend on each platform's terms of service, its willingness to host such content, and the players pressuring it to take the content down or keep it up.

New innovations at the intersection of online safety and trust

The perspectives and recommendations that follow transcend questions of legal obligation and transparency. They are a bridge for commercial platforms, cultural institutions, public policy and new-technology advocates to work together to preserve the content that matters most for building peace, especially social peace online and between states.

Content authenticity credentials
To maintain the integrity of historical records from conflict zones, implementing a system of content credentials could be pivotal, ensuring that each piece of content is accompanied by metadata certifying its origin, context and veracity, much as saving an image to your iPhone attaches metadata recording when, where and how it was captured. Content authenticity measures such as Content Credentials, led by the Coalition for Content Provenance and Authenticity (C2PA), give digital media a transparent icon that lets users verify a piece of content's origin, creation details and editing history. As AI manipulation continues to encroach on existing web content, present and past, content credentials will act as a "digital nutrition label" informing users about the provenance of archived content.
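To make the idea concrete, here is a minimal sketch of the principle behind content credentials: binding a media file to tamper-evident provenance metadata so that later alterations are detectable. This is a simplified illustration, not the real C2PA manifest format; all field and function names here are hypothetical.

```python
# Simplified illustration of content-credential-style provenance:
# bind a media file's hash to origin metadata so edits are detectable.
# NOT the actual C2PA manifest format; field names are invented.
import hashlib
from datetime import datetime, timezone

def make_provenance_record(media_bytes: bytes, origin: str, captured_by: str) -> dict:
    """Attach origin metadata to a cryptographic hash of the content."""
    return {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "origin": origin,              # e.g. capture device or publisher
        "captured_by": captured_by,    # claimed author or source
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "edit_history": [],            # would be appended to on each edit
    }

def verify(media_bytes: bytes, record: dict) -> bool:
    """Content is intact only if its hash still matches the record."""
    return hashlib.sha256(media_bytes).hexdigest() == record["sha256"]

original = b"raw video bytes from a conflict zone"
record = make_provenance_record(original, "camera-app", "citizen-reporter")
print(verify(original, record))          # True: content untouched
print(verify(b"altered bytes", record))  # False: manipulation detected
```

A real implementation would also cryptographically sign the record so the metadata itself cannot be forged, which is the core of what the C2PA specification standardizes.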

Ethical Guidelines for Citizen War Reporters
Universal, ethical industry guidelines for war reporting, especially by the growing majority of citizen journalists, are critical not only for establishing trust on digital platforms but also for informing content moderation decisions about what these reporters post online. The problem is that while OSINT analysts and internet sleuths, a diverse cohort of personas and institutions, are gaining massive followings and producing tremendous digital output, mainstream and legacy media remain far behind the trend. New and better guidelines on war reporting will reinforce and inform the technologies and practices of all war reporters, and will advance trust in information and information integrity online.

Universal Curation Standards for Conflict Content
Curation is selective preservation: not just the protection of content but its contextual alignment within historical and ethical dimensions. It should take us from the data and metadata of digital content to a comprehensive understanding of historical context, the role museums have traditionally played. The fixation on controlling narratives through moderation, laws and public pressure has sparked a quest for "alternative facts" and "mirror" worlds in which many people now live. The non-uniform application of content moderation rules to political speech and war-related content has eroded the public's trust in the impartiality of digital platforms, especially given obscure shadow-banning practices and uneven enforcement.

Platforms like Syrianarchive.org already offer curation frameworks. Mnemonic's methodology for curating the Syrian Archive is detailed online and informs its important archival work, shedding light on its approach to collection and preservation and on the significance of such practices for research and legal contexts. More of these methodologies, or a consolidation of curatorial and archival practices, should exist universally, and their adoption should spread industry-wide, including among digital and social platforms.

Our collective endeavour must be to find a path, a common ground, that honours the sanctity of historical truth, protects the vulnerable, and ensures that the stories of war are preserved not just for the scrutiny of the present but as a step towards justice in the future. Preserving digital archives is an honourable mission, and creating new public fora for multistakeholder-informed approaches to content moderation is a must if digital platforms are to serve fact, equity and reconciliation even-handedly. This work should involve crafting policies that strike equitable trade-offs around content visibility without compromising digital safety, authenticity, context and consent.

Looking into the future: 2024-2025 plans
Drawing on my extensive experience organizing public fora for multistakeholder actors on questions of content moderation policy and citizen participation, I am centring my work, my bandwidth, my time, and the operational capacity and policy expertise I have built over the past six years on interrogating the questions above and on the role of new computational technologies in bringing people together. Over the next two years, I am working with several digital rights organizations, new tech platforms and institutional funders to launch a new series, the "Digital Summit of Peace," in 2024 and 2025 in several regions of the world.

If you would like to participate in these conversations or present at one of the upcoming summits, please reach out to me at info@axm.events. I look forward to hearing from you.
