On May 15, 2019, two months after the horrific Christchurch massacre, policymakers led by New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron came together to found the Christchurch Call Community, which now comprises more than 120 member countries, tech companies, and civil society organizations. Its laudable aim was to hold tech companies to account for extremist content uploaded onto their platforms.
However, almost four years to the day since its founding, the Call has fallen short of its goal of stopping the proliferation of extremist content online.
Research from the Counter Extremism Project (CEP) has identified significant cases in which extremist content has been uploaded and re-uploaded across major platforms.
For example, as recently as April 2023, CEP investigators unearthed Twitter accounts that not only glorified the very attack that led to the founding of the Christchurch Call but also posted clips from the attack video itself. Less than three weeks earlier, a separate CEP investigation had exposed extremist accounts on Telegram and Instagram celebrating the fourth anniversary of the attack. Researchers also located the full version of the live-streamed attack video on a library download site.
The problem extends well beyond the Christchurch attack. Separate CEP reports found that the bomb-making video used by the terrorist behind the 2017 Manchester Arena bombing was still circulating online three years later, and that white supremacists shared a video of the gruesome Buffalo attack on Telegram and AnonFiles; the footage also spread to Facebook, Twitter, Vimeo, BitChute, and Streamable.
These staggering examples of the worst terrorist propaganda and violent content remaining available online underscore why the appointment of Jacinda Ardern as Special Envoy to the Christchurch Call is a unique opportunity for a course correction.
Most importantly, it is time for a wholesale change to the regulatory approach adopted when Ardern was serving as Prime Minister and Christchurch Call co-chair. The tech industry enjoys a uniquely lenient oversight regime granted to no other sector: in no other industry are companies merely asked to comply voluntarily with rules established for public safety, with minimal legal or financial consequences if they fail to do so. The launch of the Christchurch Call should have meant the end of this permissive ‘self-regulatory’ environment enjoyed by Big Tech.
In her new role, Ardern should push for clear, decisive steps to hold tech companies accountable for terrorist content uploaded onto their platforms.
Pushing for legislative action will be key. A concise set of regulatory measures must be enacted to curb the spread of violent extremist content online. The laissez-faire approach to date has shown that self-regulation and voluntary commitments are not viable options. We can no longer simply hope Big Tech changes its ways: there must be consequences for inconsistent and apathetic content removal. Only standardized, enforceable rules now seem capable of compelling tech companies to eliminate extremist and terrorist content online.
The Christchurch Call is a commendable initiative that can still fulfill its initial promise. We cannot continue to sit back and hope the problem of online extremist media and propaganda solves itself, and the Call remains an important tool for ensuring responsible content removal policies and accountability for Big Tech. One hopes Ardern’s critical attitude towards the industry carries over to her new role.