Buffalo shooting should force a reckoning for tech platforms

Aaron Salter Jr., Heyward “Tenny” Patterson, Pearl “Pearly” Young, Celestine Chaney, Ruth Whitfield, Andre Mackniel, Margus Morrison, Geraldine Talley, Katherine “Kat” Massey, Roberta “Robbie” Drury.

These were the reported victims of the racially motivated shooting in Buffalo, New York, on Saturday, committed by someone who sought glory on social media and who will get no name check in this article.

Years after similar attacks, tech companies still play a role in spreading extremist ideas, from hosting the forums that radicalize young men to helping those ideas go viral as part of a burgeoning white supremacist movement. New laws from Europe that will force tech companies to make their platforms safer may not be coming fast enough, and several stumbling blocks stand in their way.

The Buffalo shooter used online platforms at every stage of his attack. He planned it on Discord Inc., posted his manifesto on Alphabet Inc.’s Google Docs and livestreamed the shooting on Amazon.com Inc.’s Twitch. A community of supporters then shared clips of his video on Twitter Inc. and on Facebook and Instagram, both owned by Meta Platforms Inc.

This is how the feedback loop between mainstream social media and underground forums works, driving people toward extremist ideas like the racist “Great Replacement” theory.

In his manifesto, the attacker credits the imageboard 4chan, where over the course of two years he “learned through infographics, shitposts and memes that the white race is dying out.” With around 20 million active users, 4chan isn’t big enough to fall under upcoming European rules on dangerous content, but similar sites have been taken offline when their infrastructure providers came under enough political pressure.

More than a decade ago, 4chan was best known for creating trends in internet culture and sparking the rise of the hacktivist network Anonymous, which launched cyberattacks on organizations in protest. Today it has morphed into something more menacing: a breeding ground for far-right propaganda and bigotry, with regular discussions of real-world violence.

It would be easy for the big social media companies to pinpoint 4chan as the real problem, the place where most of the radicalization happens. But clips from the livestream of Saturday’s shooting went viral on Facebook, Twitter and Instagram and were seen by millions. A link to the video was shared on Facebook more than 46,000 times, according to The Washington Post, and the company didn’t remove it for more than 10 hours. Facebook said people had tried to circumvent its rules to share the video, while Twitch said it removed the stream less than two minutes after the shooting began. Discord said it removed the shooter’s private server as soon as it learned about it.

The attacker himself was a copycat, following a formula set by others whose attacks went viral on social media. His decision to livestream himself on Twitch echoed previous attacks, such as the 2019 mass shooting that killed 51 people at two mosques in Christchurch, New Zealand, which was livestreamed on Facebook. About a quarter of his manifesto was plagiarized, with the main source being the Christchurch terrorist’s manifesto.

After 2019, technology companies scrambled to stop the distribution of such videos. One strategy is to assign a video a mathematical fingerprint, or “hash,” and then use matching algorithms to track down copies and clips. But these efforts have only partially worked.
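Roughly how that matching works, in a minimal sketch: the Python below checks an uploaded file against a hypothetical blocklist of known hashes. It uses an exact cryptographic hash (SHA-256) purely for illustration; real moderation systems rely on perceptual hashes that still match after a clip is re-encoded, cropped or overlaid with other imagery, and even those can be evaded, which is partly why edited copies keep slipping through.

```python
# Minimal sketch of hash-based matching against a blocklist of known videos.
# Assumptions: KNOWN_BAD_HASHES is a hypothetical, locally maintained set;
# production systems use perceptual hashes shared across companies rather
# than exact SHA-256 digests, which only catch byte-identical copies.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES = {
    "placeholder-digest-of-a-known-terrorist-video",  # not a real hash
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_extremist_video(path: Path) -> bool:
    """Flag an upload whose hash matches an entry in the blocklist."""
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```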

Videos of the Buffalo shooting with hundreds of thousands of views could still be found on Facebook, Instagram and Twitter over the weekend, according to Ciaran O’Connor, an online-extremism researcher at the Institute for Strategic Dialogue in London. O’Connor said the videos he found with the most views were on Twitter, with one reaching nearly 460,000 views. Elon Musk has yet to say anything about the shooter’s video or how he would handle its spread on Twitter if he buys the platform.

“The danger of the video staying online longer is that it allows people to download the content and put it on other sites,” O’Connor said.

For example, in the month after the Christchurch video went viral, internet users cut a short clip of the gunman pointing his assault rifle at a man standing outside the mosque, ending just as the gun was fired, and turned it into a looping GIF that became popular on several forums. Others have superimposed video game imagery over the original video, O’Connor said, while some extremists have built rudimentary versions of the mosque in the online games Minecraft and Roblox to reenact the attack.

It’s possible the Buffalo terrorist won’t get the same kind of adoration as others before him. But his online legacy will feed the same contagion that reached him in the first place.

A big part of the solution to extremist content online is regulation, but it isn’t coming from the US, where politicians and technologists appear to be moving in the opposite direction on curbing harmful content. Elon Musk, for example, has proposed relaxing content moderation rules on Twitter to allow more “free speech” on the platform if he buys it.

First Amendment doctrine also prevents US lawmakers from banning most speech, including racist and homophobic comments, paranoid conspiracy theories and gunmen’s manifestos. Pushing further in that direction, Texas passed a new law, HB 20, that exposes social media companies to lawsuits for removing content, an attempt by lawmakers to curb what they see as censorship. The legislation is being challenged by tech industry groups, which say it could allow hate speech to spread more widely on social media; it could even prevent platforms like Twitch from shutting down a livestream of a mass shooting in the future. The Supreme Court is preparing to decide whether the law can stand.

The world’s best hope for pushing extremism off mainstream social media comes from Europe, in the form of two new laws: the UK’s Online Safety Bill and the European Union’s Digital Services Act. The rules, which take effect over the next few years, will force tech companies to conduct regular risk assessments of their algorithms and take down harmful content faster, or face fines of up to 6% of their global revenue. Facebook whistleblower Frances Haugen has said the DSA could set the global “gold standard” for tech regulation.

4chan, the imageboard that radicalized the Buffalo shooter, has a loophole here: the EU’s DSA applies its strictest rules to online platforms with 45 million or more regular users, and 4chan is about half that size. But the site relies on mainstream web businesses to stay online; without that support, it would have a harder time attracting as many visitors as it does.

For example, after two different racially motivated mass shooters posted manifestos on the 8chan(1) imageboard, web security company Cloudflare suspended its protection services for the site, knocking it offline. Several other web infrastructure companies also discontinued their services, and 8chan no longer shows up in Google searches. The neo-Nazi website The Daily Stormer was similarly dropped by mainstream services in 2017.

It may be impossible to remove such sites from the internet entirely, but making them harder to find can keep more young men out of extremist rabbit holes. That means their mainstream technology providers bear responsibility too. They should act accordingly.

More from Bloomberg Opinion:

• Omicron turns out to be a weak vaccine: Lisa Jarvis

• Wartime Brexit threats are doubly wrong: Lionel Laurent

• Elon Musk misses the big picture on lithium mining: Anjani Trivedi

(1) The Christchurch, New Zealand, shooter published his manifesto on 8chan in March 2019. A mass shooter who targeted a Walmart in El Paso, Texas, posted a similar white supremacist manifesto on the site the following August.

This column does not necessarily represent the opinion of the editors or of Bloomberg LP and its owners.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for The Wall Street Journal and Forbes, she is the author of We Are Anonymous.

For more stories like this, visit bloomberg.com/opinion
