Toxic Content Has a Simple Solution: Kill the Algorithm
By Mark Hurst • Apr 18, 2019

This article from New Zealand wonders what can be done to counteract the toxic Facebook and YouTube algorithms. These services make money by spreading hateful, violent content, since that's what keeps people "engaged": staring at the screen in fear, anxiety, or outrage.

As for what to do about it, the article lists various regulatory ideas:

• maybe social media companies should "publish transparency reports about harmful content on their services, and the measures they take to combat it" (British government)

• or perhaps Facebook should "add religion as a protected category in New Zealand, so that hate speech against religions is covered as a reportable offence" (New Zealand tech critic)

• or launch a "public education programme about the techniques of misinformation and radicalisation" (again, British government)

• or maybe companies should "withdraw advertising budgets from social media" (article author).

All are good ideas worth trying. But they treat the problem at the edges rather than at its root.

To put it as simply as possible: The problem is the algorithm. Thus the solution is to get rid of the algorithm.

Here's what I mean:

• YouTube should get rid of Recommended Videos.

• YouTube videos should no longer autoplay.

• Facebook should kill off the News Feed.

• Facebook should only show posts from people and brands the user has subscribed to, and only in a straight-chronological list. No algorithmic filtering or "bubbling up."

The only way for users to get to content should be if they specifically request it, search for it, or subscribe to it.
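To see how simple the proposed alternative is, here's a minimal sketch of a subscription-only, reverse-chronological feed. All names here are hypothetical, and it assumes each post carries an author and a timestamp; the point is that there is no ranking model anywhere, just a filter and a sort:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str       # the person or brand that published the post
    timestamp: float  # Unix time when the post was published
    text: str

def build_feed(posts, subscriptions):
    """Return only posts from subscribed sources, newest first.

    No recommendations, no engagement scoring, no "bubbling up" --
    just the user's explicit subscriptions in time order.
    """
    subscribed = [p for p in posts if p.author in subscriptions]
    return sorted(subscribed, key=lambda p: p.timestamp, reverse=True)
```

That's the whole "algorithm" a feed would need under this proposal: content appears only because the user subscribed to its source, never because a model predicted it would keep them staring at the screen.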

Facebook and Google would no doubt fight these improvements with every resource they have. Such strong resistance would be a good sign that we're onto the right solution.

The problems with Facebook and Google/YouTube aren't primarily "the bad people out there" who are exploiting the algorithm. The problem is that the companies themselves are built on toxic algorithms. As Siva Vaidhyanathan put it: "The problem with Facebook is Facebook." The problem is the algorithm, and the solution is to get rid of that algorithm.

It's not just Facebook and Google. In a similar conversation, CNN and others have wondered how to counteract the surveillance in an Amazon Alexa or Google Home device. (What settings should we use?) But that misses the underlying business model: these devices are designed for one thing, surveillance. Asking Amazon really nicely not to surveil us is not going to solve anything.

Similarly, telling Facebook to hire thousands more moderators to take down the hateful content (that their employer is monetizing) is not the solution. It may be a patch, but it doesn't address the underlying problem.

If the algorithm is the problem, the solution is to kill the algorithm.

Update Apr 22: In the wake of the Easter bombings in Sri Lanka yesterday, the Sri Lankan government has shut down Facebook and other social media services across the country. In the NYT, Kara Swisher discusses why it makes sense, at least in the abstract, to shut down toxic services - though at this point it may be "too late." And on Twitter, Buzzfeed's Megha Rajagopalan writes that it's "super problematic" to support any government shutting down Facebook.

- - -