A simple tech ethic
By Mark Hurst • July 23, 2020
In response to my recent columns about Big Tech's misbehavior, a few readers have asked: what are teams supposed to do? Is there some new ethical framework that we need to follow, to ensure that our technology has good ethics baked in?
I do have a suggestion, and it's not to go off and enroll in a long course in ethics, or launch a new initiative to "become ethical" - as though one could add ethics, like a pinch of salt, to a product. Besides, talking about ethics apart from the rest of the company's activities almost makes it look as though "doing right" is something optional that may, or may not, be added in later.
I don't think ethics is an add-on option. Instead, if you want to create products that are good, ethical, whatever your goal is - you need to have internalized some sort of approach, as part of your fundamental outlook, that imbues everything you do.
Maybe you have your own ethical code for building products. My own is pretty simple to describe, and it goes something like this:
Build something that acts in people's long-term best interest. That means, create something good (and yes, I'm making reference to my company name, as I named it with this very concept in mind). Don't overcomplicate the philosophical implications of the word "good." It simply means something that's useful or helpful, as determined by the outcomes it has on the lives of users and their communities.
Don't cheat. Don't use deceptive "dark patterns" to nudge someone into acting in a way that harms their long-term interest. Don't bury hidden "gotchas" in the Terms and Conditions. Don't promise one thing but deliver another: that's cheating and lying. Don't take something (users' personal data, say) and do something with it that they didn't expect, and would be horrified to find out that you're doing with it (selling to data brokers, say). That counts as cheating, lying, and stealing, and incidentally is also the profit engine of Facebook and Google.
Build a product that you would recommend to your own family and friends. This isn't the full set of people to be concerned with (see next point), but it's a good baseline as an empathy check. If you're building something that you'd be embarrassed to show the people in your own life, that's a red flag.
Widen your scope beyond the features themselves. Consider how the product acts in the world, what effects it has on people's lives, what secondary effects or consequences it might bring about. Widen your scope...
- in time (beyond the immediate future to the long term)
- in causality (beyond first-order effects to other consequences)
- in geography (beyond your local region, and beyond the urban environment you might be accustomed to)
- in the social graph (beyond people or types of people you already know - that is, beyond your own family, neighbors, and social cohort)
- in the organization (beyond your individual product team or business unit to consider the whole enterprise, which on a global level may be dependent on a toxic business model - see Facebook and Google above)
Listen to users to find out what they want BEFORE you build it. This suggestion flies in the face of the Lean Startup model. Lean says to build it first (as per the wishes of investors, who prefer to play roulette by launching a million things at once) and then use the launch itself as the first opportunity for real customer input. I sincerely hope that we are now moving past that phase, as the pandemic forces teams, organizations, really everyone to use their dwindling resources more wisely. (I wrote an entire book - Customers Included - with lots of facepalm examples of companies that "built first, asked later" and wasted hundreds of millions of dollars.) I can't emphasize this enough: you should be building what will benefit users in the long term, and you can find that out by listening to users before you build it.
If your company doesn't listen, quit your job - if you can. Former Amazon distinguished engineer Tim Bray quit his job at Amazon and was just profiled yesterday in the New York Times. Stacy Mitchell quit her job today from a group at Yale after the director revealed having been paid (off) by Big Tech. I did a whole show on whistleblowers in tech last November, featuring people like Jack Poulson, who quit his job at Google when he saw unethical behavior at the highest levels of the company. Many people can't afford to quit their jobs - and others prefer to organize within the company - but whatever you choose to do, at least consider your options.
Overall, the approach is simple: Work for the long-term benefit of users and communities. Consider the outcomes. And don't cheat. If we had a tech industry that followed that ethical approach, just imagine how different the world would be.
Until next time,
- - -