Don’t throw technology at it
By Mark Hurst • September 8, 2023
On the Forum this week we’ve been discussing a story out of Louisville, Kentucky, where there’s a shortage of school bus drivers. With thousands of schoolkids needing a bus ride to school, and not enough drivers, the school district came up with the following solution: throw some technology at it. The district spent $265,000 (source) to hire AlphaRoute, a Boston-based company that touts its MIT connections and its “award winning algorithms [that] are designed to increase efficiencies and provide great customer experience.”
The results, as reported by this AP story (Aug 11, 2023), were less than stellar. The first day of school was a “logistical meltdown” (AP’s words) as the bus routes were horribly inefficient, causing some kids to get home hours late:
[An] appalled parent, Beau Kilpatrick, said one of his young daughters was covered in urine when she finally arrived home at 9:15 p.m. He called it a “complete failure” by the district.
AlphaRoute denies any responsibility for the problem, posting on its homepage that the Louisville schools superintendent “has stated on numerous occasions that this crisis was not caused by our product or our team.” Then it gets even more defensive: “We are not an AI company, and we do not use AI to create bus routes” – as though the “algorithms” mentioned throughout the AlphaRoute website have nothing whatsoever to do with AI. Nowhere in the message does the company suggest what, or who, might have been the actual cause of the problem.
I have a pretty good guess, though. There’s a shortage of bus drivers. This is really important to grasp: in the middle of a bus driver shortage, the school district paid hundreds of thousands of dollars for . . . more technology. But the shortage wasn’t of technology; it was – if I may just state it again – of bus drivers.
This is a teachable moment, so let’s imagine a skit:
Bus manager: Hey, boss, we have a bus driver shortage.
Superintendent: OK, throw technology at it.
BZZZ. Wrong answer.
Let’s try again.
Bus manager: Hey, boss, we have a bus driver shortage.
Superintendent: OK, let’s hire some more bus drivers.
DING DING DING! This is the correct answer. When you have a shortage of X, the right thing to do is get more of X.
Sometimes it seems like Silicon Valley has caused us to collectively lose our minds. To think that in a bus driver shortage we would take $265,000 and hand it to a tech company (though not an AI company, heavens no, an algorithm company) – as if that would solve the problem.
Why not take the $265,000 and – just hear me out – hire more bus drivers? If not enough candidates apply for the $15-or-so-an-hour wage, raise the hourly rate. Add benefits. Make it a living wage. I have to imagine that $265,000 would put more than a dent in the problem. Yes, someone in-house might have to design the bus routes, but . . . didn’t schools figure out bus routes without high-tech algorithms for, I dunno, most of the past century?
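If you want to check my math, here is a minimal back-of-envelope sketch in Python. The $265,000 contract figure and the roughly $15/hour wage come from the story above; the driver counts, school days, and paid hours per day are purely illustrative assumptions, not district data.

```python
# Back-of-envelope sketch: how far could $265,000 go toward raising driver pay?
# The contract cost and ~$15/hour wage come from the article above;
# the driver counts and hours below are illustrative assumptions only.

contract_cost = 265_000        # what the district paid AlphaRoute (from the article)
current_wage = 15.00           # rough hourly wage cited above (approximate)
hours_per_year = 180 * 6       # assumption: 180 school days, ~6 paid hours per day

for drivers in (50, 100, 200):  # hypothetical sizes of the driver pool to fund
    raise_per_hour = contract_cost / (drivers * hours_per_year)
    print(f"{drivers} drivers: ${raise_per_hour:.2f}/hour raise "
          f"(~{raise_per_hour / current_wage:.0%} bump)")
```

Under those made-up staffing numbers, the same money funds somewhere between a one-dollar and five-dollar hourly raise for an entire school year, which is exactly the kind of dent I mean.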
I actually agree with AlphaRoute that this was not primarily the company’s fault. The cause of the problem, the reason for the “logistical meltdown,” is that our institutions have decided to prioritize technology over everything else – even when the right answer is boneheadedly obvious.
It’s not just school districts. The same thing is happening in remote places like Montana, where it’s really important to detect wildfires early. From a well-written New York Times article by Raymond Zhong (Sep 6, 2023):
The chief of the U.S. Forest Service, Randy Moore, told lawmakers in March that the agency was moving away from humans in watchtowers. The future of fire detection, he said, is cameras. “We need to lean much further into the technology arena,” he said.
Fire detection in the backcountry has always been done by “humans in watchtowers” – people who look for smoke and, when it appears, work to coordinate the fire crews.
Now the Forest Service chief says to throw technology at it. That means, of course, get rid of the people. In their place, install some cameras, some AI (or excuse me, some algorithms) that might or might not detect smoke, and some contractors to watch the footage from hundreds or thousands of miles away.
BZZZ. Wrong answer.
This is not to say that technology shouldn’t be used in fire detection. Those humans in watchtowers need plenty of tech, after all, to coordinate fire response. But the humans should be in charge. Not the technology. People, not AI-enabled cameras, are our protection against fires. I mean people on the ground, who have experience in the job, not glassy-eyed contractors looking at a dashboard screen in a cubicle somewhere. Fire watching should be done by trained people, treated well. Not AI. It’s sort of like how bus drivers, not algorithmic bus routes, are uniquely able to get kids to school on time.
The false promise of optimization
I spoke this week on Techtonic with Coco Krumme, author of the new book Optimal Illusions: The False Promise of Optimization. Krumme’s premise is that as we get more fixated on optimization – chasing things like efficiency, speed, and “scale” – we can lose sight of what we were trying to do in the first place. We can suffer from “the oppression of a singular way of seeing,” as Krumme puts it. If you listen to the interview, keep in mind that this is happening everywhere – from bus routes to firewatch towers, and many other places as well.
• Listen, but jump straight to the interview
• See episode links and comments
On the Forum
If the idea of “people, not AI” resonates with you, you should join like-minded people in the Creative Good community. (Join here.) For over two years now we’ve been discussing much, much more on our members-only Creative Good Forum than I can fit into this newsletter.
For example, here are some of the stories that we’re discussing on the Forum this week:
• Elon Musk secretly manipulating the Ukraine war
• How to opt out of data brokers
• A new iPhone alert about Pegasus malware
• The surveillance tech installed in new-model cars
• Google’s antitrust trial
• New (false) claims of AI sentience
• . . . and much more.
That’s all just from the past few days. Join Creative Good and you can begin reading and posting!
(If cost is an issue, drop me a line and we can work something out.)
Until next time,
-mark
Mark Hurst, founder, Creative Good – see official announcement and join as a member
Email: mark@creativegood.com
Listen to my podcast/radio show: techtonic.fm
Subscribe to my email newsletter
Sign up for my to-do list with privacy built in, Good Todo
On Mastodon: @markhurst@mastodon.social