Throwing away the robot
By Mark Hurst • November 12, 2020
I had a dream last night that I threw away a robot. I was cleaning house before leaving on a trip, and there was a humanoid robot that needed to go. Synthetic flesh, computer chips for a brain, camera lenses for eyes. It felt urgent to deal with the robot before leaving home, because some sort of upgrade was imminent - a transformation that would make the robot even more lifelike, and much harder to distinguish from a real person. And so, over the protestations of a friend, I deactivated the robot and placed it in the outgoing trash for pickup.
I'll leave the dream interpretation to any psychiatrists reading this, though I have some ideas: I know, for example, of Jung's suggestion that every character in a dream is an aspect of the dreamer himself. Probing that point, however interesting or disturbing, is not my aim today.
Instead, I woke up wondering whether it is possible, at this point, to abandon a technology before it becomes too difficult to distinguish the illusion from reality.
I think my dream was inspired by a couple of Adam Curtis documentaries I watched this week, All Watched Over by Machines of Loving Grace (2011) and HyperNormalisation (2016). Each film asks whether we have any control over the development of our technology, especially as we continue to cede more and more of our daily decisions to the machine. HyperNormalisation opens with Curtis, as narrator, explaining that the movie
is about how, over the past 40 years, politicians, financiers and technological utopians, rather than face up to the real complexities of the world, retreated. Instead, they constructed a simpler version of the world in order to hang on to power. And as this fake world grew, all of us went along with it, because the simplicity was reassuring. Even those who thought they were attacking the system - the radicals, the artists, the musicians, and our whole counterculture - actually became part of the trickery, because they, too, had retreated into the make-believe world, which is why their opposition has no effect and nothing ever changes.
In other words, the merging of Big Tech, Wall Street, and captured politicians has created a system where they "hang on to power" while the rest of us either capitulate, or complain, or tweet, or speak out - it doesn't matter, because "opposition has no effect and nothing ever changes."
I'm reminded of the Facebook boycott organized by Color of Change this past summer, when over 1,000 advertisers pulled all Facebook advertising for the month of July. That's over a thousand companies, places like Starbucks, Pfizer, and Coca-Cola, sending a message to Mark Zuckerberg to clean up his "hate for profit" business model. Zuckerberg's response was, essentially, "they'll be back."
And they did come back. Here's the Wall Street Journal story from October 29: Facebook Posts Record Revenue Despite Ad Boycott: "Revenue jumped 22% to $21.47 billion in the three months through September. . . . The company projects that revenue will grow even faster during the fourth quarter, as the holiday season bolsters ad spending."
Nothing seems to make any difference with Facebook. Public censure blows its horn, then disappears. Financial pressure doesn't even register a blip, as revenue keeps growing. Congresspeople from both parties have scolded Zuckerberg, most recently in the Senate hearings on October 28 (which I covered during my radio show this week), to no effect. After repeated visits to Congress, Zuck is totally at ease under questioning. Nothing makes any difference to him now.
One conclusion of Curtis's HyperNormalisation is that we, the people, have no effective control over the tech systems that increasingly run our society. Our elected representatives have no political leverage over the system, either, as we saw on October 28. But that's not the worst part. The most chilling conclusion is that digital systems give us the illusion of control, so that we feel we're doing something important when we speak out online. Instead, our complaints go nowhere, as the system can amplify or bury speech at will. The only thing that gets measured is "engagement." No matter what we say, it makes the system stronger.
Realizing how this all works can be demoralizing. Evan Selinger posted yesterday: "As a writer, it's devastating to realize that years of criticizing Big Tech has made almost no impact." I responded that there have been some victories against unethical industries - the public-health campaign against cigarette companies, for example - but that it took decades of sustained effort. I know how Selinger feels.
Some have suggested that the recent election portends a new, brighter era. Where Big Tech is concerned, I'm skeptical: Protocol reports that the presidential transition team is packed with people from Amazon, Sidewalk Labs/Google, LinkedIn/Microsoft, Uber, Lyft, Stripe, Dropbox, and Airbnb, as I posted here. This is along the same lines as what I pointed out in my controversial column in August, which was backed up by this New York Times story a few days later. And then last week, of course, I wrote The winner of the election is... Big Tech.
If Big Tech really is in charge, what does that mean for the rest of us?
One guiding light comes from, of all places, Tolstoy's War and Peace. It's one of my favorite books. While the famously long narrative of over 1,000 pages encompasses a huge sweep of human experience, I think the book mainly asks one question: Who's in charge here? Is it really the emperors, titans, and "big men" who decide how the world works? Tolstoy says no.
The major plot arc focuses on Napoleon - the Zuck of the early 1800s - and his leadership of the French army in its seemingly unstoppable drive toward Moscow. Tolstoy argues forcefully that Napoleon wasn't actually in charge. The battles along the way weren't directed by Napoleon - after all, he hardly had any contact with his generals during the fighting, and they in turn had very little influence on what the soldiers did in the trenches. And when (spoiler alert) Napoleon turns back toward Paris, with the Russian army in pursuit, it's not the Russian generals who are in charge of what happens then, either. Practically the only two people who grasp the situation are the Russian general Kutuzov, who recognizes his ultimate inability to control outcomes, and the peasant Platon, who accepts and delights in his own weakness.
Things are no less confusing today. Zuck's army seems unstoppable, and we're not yet to the battle of Moscow. But although the fate of Facebook is already preordained - someday Zuck's empire will fail - there's a larger point I need to make. The fallacy we're being asked to accept today is not the ultimate power of a Napoleon; Zuck and his PR minions never boast of his omniscient power. Instead, the lie is something new. We're being asked to believe in the ultimate wisdom and power of the machine. With enough data, with fast enough processors, with sufficiently advanced AI, we're told, the right decisions will just... emerge. And if we sense that we have no control, not to worry: the AI sees all, understands all, will guide all.
I don't have any answers, except that building a system to "hang on to power" - to use Adam Curtis's phrase - is unethical, destructive, and ultimately unsustainable. Our path through this has to involve something other than faith in big men, or in machines that increasingly give the flesh-and-blood illusion of making human decisions.
Meantime, I'll keep writing this newsletter, and I'll keep adding to Good Reports, spotlighting the rare teams out there actually trying to create good technology. As always, thanks for your support.
Good Reports updates:
I'm happy to announce several updates to Good Reports, my new review site for online tools. (If you missed it, the launch announcement tells the back story.) Here are the latest updates:
• Best web browser (updated two days ago with more on Vivaldi and Firefox)
• Best mobile device (added Librem and PureOS)
• Best videoconference platform (added kMeet)
• Non-toxic social network (added 'why to avoid Twitter' and a warning about MeWe)
Finally, here's a tribute to Alex Trebek (by Nathan Navarro) that I played at the end of my radio show this week. RIP, Alex.
Until next time,
- Mark Hurst
Subscribe to my email newsletter
Sign up for my to-do list with privacy built in, Good Todo
Email: mark@creativegood.com
Twitter: @markhurst
Podcast/radio show: techtonic.fm
- - -