Facial recognition is already here
By Mark Hurst • July 11, 2019

An old Twilight Zone episode begins with a button, affixed to a mysterious box, showing up on a couple's doorstep. The deal is simple: if they press the button, they'll get a huge sum of cash - with the catch that, as a result, someone they've never met will die.

That show came to mind as I spoke with scholar Chris Gilliard this week about the effects of surveillance on marginalized communities. It's worth paying attention to how the surveillance state is being installed in vulnerable communities first. Sooner or later, it will come to your community as well.

Click here to listen to my interview with Chris Gilliard from my radio show Techtonic this week.

"Luxury surveillance" is how Chris describes Amazon Echos, "smart" watches, FitBits, and other Internet of Things devices. The result, he says, is a growing surveillance state in which the well-off are slowly habituated to the same technology that oppresses the less fortunate. As Chris puts it in this Fast Company essay:

As one group pays to be watched, other groups continue to pay the price for being watched.

A good example of this disparity is described in Digital Jail: How Electronic Monitoring Drives Defendants Into Debt (NYT, July 3) by past Techtonic guest Ava Kofman:

Ankle bracelets are promoted as a humane alternative to jail. But private companies charge defendants hundreds of dollars a month to wear the surveillance devices . . . that sometimes cost more than their bail. And unlike bail, they don't get the payment back, even if they're found innocent.

Amazing, if you think about it: Part of our society pays to wear surveillance devices as a luxury. Another part of our society pays to wear surveillance devices as punishment - even if they're innocent. Both are essentially the same technology. And either way, Big Tech companies get paid.

In the midst of all this are the product managers, UX designers, and engineers hired to design and deliver surveillance technology. No one wants to take responsibility, but - let's state the uncomfortable truth - all of them are complicit: the company employees who get paid to design the tech, the investors who profit from the companies, and the wealthy customers who pay luxury prices to fund the growth of the surveillance state. The only people not complicit are the vulnerable communities who have these things forced upon them.

(Really, listen to the interview. Read more on Chris Gilliard: @hypervisible, hypervisible.com)

And if you think you're somehow protected from all this surveillance...

Your face has already been surveilled

You may not know it, and you certainly didn't agree to it, but if you live in the US, your face is most likely already in a facial-recognition database. From a Washington Post article (July 7) that has gotten a lot of attention:

Agents with the Federal Bureau of Investigation and Immigration and Customs Enforcement have turned state driver's license databases into a facial-recognition gold mine, scanning through millions of Americans' photos without their knowledge or consent.

This came from research at the Georgetown Law Center on Privacy & Technology, which in May published AmericaUnderWatch.com, a special report on facial recognition in the US.

The upshot: Facial recognition is already here. And your face is already being surveilled, analyzed, and used in ways you didn't agree to.

Are you OK with that precedent?

I ask about precedent because this is growing, all around us, right now. The ICE/FBI use of driver's license images is just one of many, many examples of surveillance powered by the tech industry.

As Sidney Fussell put it in the Atlantic last month, "these may be the last days of privately owning our own faces." (From The Strange Politics of Facial Recognition, June 28.)

And that's just the beginning. Read on.

Recent facial-recognition news

Police departments across the US are using video doorbells from Amazon-owned Ring to create an unofficial surveillance network (Business Insider, June 7): "Police departments across the US have partnered with Amazon and its subsidiary, Ring, to offer programs for free or discounted Ring smart doorbell devices to their residents. Some police departments added their own conditions to the programs that allow them to obtain recorded footage from a Ring device upon request." See also the CNet story.

Amazon Is Watching (Will Oremus, June 27): "Rekognition [is] a platform that uses machine learning to analyze images and video footage. Among other features, Rekognition offers the ability to match faces found in video recordings to a collection of faces in a database, as well as facial analysis technology that can pick out facial features and expressions." See also: IBM Used NYPD Surveillance Footage To Develop Technology That Lets Police Search By Skin Color (The Intercept, Sept 6, 2018)
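It's worth pausing on how little engineering this kind of face matching actually requires. Here's a minimal sketch, in Python with Amazon's boto3 library, of what a search against a Rekognition face collection looks like. The collection name, image file, and threshold below are hypothetical - this is my illustration of the API, not Amazon's own code, and it assumes an AWS account with a face collection already built:

    import boto3

    # Hypothetical region; any region where Rekognition is available works.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read a single frame captured from a camera (hypothetical file name).
    with open("street_camera_frame.jpg", "rb") as f:
        frame = f.read()

    # Search an existing face collection for matches to the largest face
    # in the frame. "example-watchlist" is a hypothetical collection ID.
    response = rekognition.search_faces_by_image(
        CollectionId="example-watchlist",
        Image={"Bytes": frame},
        FaceMatchThreshold=80,  # minimum similarity score, 0-100
        MaxFaces=5,
    )

    for match in response["FaceMatches"]:
        face = match["Face"]
        print(f"Matched face {face['FaceId']} at {match['Similarity']:.1f}% similarity")

That's the whole loop: a camera frame goes in, candidate identities and similarity scores come out.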

Amazon's next big thing may redefine big (BBC, June 15): "[Amazon's chief technology officer] Werner Vogels doesn't feel it's Amazon's responsibility to make sure Rekognition is used accurately or ethically. 'That's not my decision to make.'" Notice: Big Tech companies profiting from surveillance tech but refusing to take any responsibility for it.

The Racist History Behind Facial Recognition (Sahil Chinoy in the NYT, July 10): "The temptation to think we can read something deeper from visual stereotypes is misguided - but persistent."

Why airport face scans are a privacy trap (Geoffrey Fowler in the Washington Post, June 10): "What's face recognition at the airport really about? Immigration policy and efficiency." By the way, the airport facial-rec system was promptly breached.

Facial Recognition Coming To Delta Gates At MSP (CBS, June 20): "Delta Air Lines announced it will give passengers who fly out of Minneapolis-St. Paul International Airport the option to use facial recognition to board their flight instead of a standard boarding pass."

The First Public Schools In The US Will Start Using Facial Recognition Next Week (Davey Alba in Buzzfeed News, May 29): "Testing of the Aegis [facial recognition] system begins in the Lockport City School District [in New York State] next week." This was followed by "Legislation to suspend facial recognition in schools passes New York State Assembly" (June 20).

San Francisco Bans Facial Recognition Technology (NYT, May 14): Note that companies like Google, Facebook, and Amazon are exempted from the ban.

Why You Can No Longer Get Lost in the Crowd (Woodrow Hartzog and Evan Selinger in a NYT op-ed, April 17): "Facial recognition technology poses an immense danger to society because it can be used to overcome biological constraints on how many individuals anyone can recognize in real time. If its use continues to grow and the right regulations aren't instituted, we might lose the ability to go out in public without being recognized by the police, our neighbors and corporations."

Duke MTMC is a dataset of surveillance camera footage of students on Duke University campus: "Duke MTMC (Multi-Target, Multi-Camera) is a dataset of surveillance video footage taken on Duke University's campus in 2014 and is used for research and development of video tracking systems, person re-identification, and low-resolution facial recognition." See also this story on the project. And see Chris Gilliard's comment.

Brainwash is a dataset of webcam images taken from the Brainwash Cafe in San Francisco: "The Brainwash dataset is unique because it uses images from a publicly available webcam that records people inside a privately owned business without their consent. No ordinary cafe customer could ever suspect that their image would end up in a dataset used for surveillance research and development, but that is exactly what happened to customers at Brainwash Cafe in San Francisco. Although Brainwash appears to be a less popular dataset, it was notably used in 2016 and 2017 by researchers affiliated with the National University of Defense Technology in China for two research projects on advancing the capabilities of object detection to more accurately isolate the target region in an image."

Microsoft deletes massive 'MS Celeb' facial recognition data set (June 8): "Many of the faces included in the data were not those of public figures or celebrities. Indeed, security journalists and privacy advocates were among those included, such as Shoshana Zuboff, author of [The Age of] Surveillance Capitalism."

It's not just facial recognition

Surveillance is, of course, spreading much more widely than just facial recognition.

Google uses 3rd-party contractors to listen in on your conversations via the Google Home surveillance device (VRT NWS, July 10). Watch the video at the top of the article.

An American organization founded by Google & IBM is working with a company that is helping China's authoritarian government secretly monitor the phone & internet activity of 200 million people, reports Ryan Gallagher (The Intercept, today, July 11).

A WSJ article (June 20) covers biometric technology in the home - what Chris & I discuss in the interview above as "luxury surveillance." The article states the tech is "expanding to every corner of the home, using body identifiers to open the door, say hello, unlock the wine cellar and reveal the screening room." Yes: everyone needs facial recognition for their wine cellar!

When Machine Learning is Facially Invalid (Frank Pasquale - a past guest - in the Communications of the ACM, Sept 2018)

Facebook lawyer argues you should have 'no expectation of privacy' (Graham Cluley, June 3): Facebook counsel Orin Snyder argued, "There is no invasion of privacy at all, because there is no privacy." See also the NYT story.

About the Google Nest surveillance device: "The always-on connected home Internet camera you paid money to have installed in your thermostat may also be used to film you!" (Maciej Ceglowski, July 10)

Thread by @halhod (June 28): "Technology is eroding one of the great levees of human society - the ability to move around the physical world anonymously."

Techtonic radio show guests on surveillance

My two most recent Techtonic radio guests have spoken about surveillance:

Chris Gilliard, as described above, talked about the effects of surveillance on marginalized communities - and all of us. (See show notes, listen to entire show, or jump to interview.)

Scott Urban creates "privacy eyewear," called Reflectacles, which block the infrared beams from facial-recognition cameras. (See show notes, listen to entire show, or jump to interview.)

• Podcast: Subscribe here to the Techtonic podcast - same audio as the radio show, but in a convenient podcast format for downloading.

If you read all the way to the bottom (thanks!): Hire me to help your product team create actually good, non-evil, non-exploitative experiences for your users. I can also give keynote presentations at events.

-Mark Hurst

- - -

To share this column, retweet this, or copy and paste this:

Facial recognition is already here, writes @markhurst:
https://creativegood.com/blog/19/facial-recognition-is-here.html

- - -