“OMG, you guys, it’s Burpy Snorkleface!” your friend exclaims (at least that’s what it sounds like). We’ve all been there: Someone’s excited about a famous-person sighting or the latest morsel of celebrity gossip … and you have no idea who it is.
Who cares, right? It’s easy to remedy and has virtually no consequences. It’s not like you’re going to lose your job or something. Unless you’re Lauren Twisselman, an employee at a Los Angeles boutique who didn’t recognize actress Mindy Kaling when she popped into the store. Twisselman wasn’t fired, but she sure could’ve used celebrity facial recognition software, suggested NPR recently. British company NEC IT Solutions has created “VIP-identification software” to help employees such as Twisselman spot and serve celeb shoppers. The software is modeled after the company’s existing criminal and terrorist facial recognition tools. (The technology is already being tested in at least 10 hotels and high-end stores.) But using facial recognition technology on an unsuspecting public when reducing danger isn’t the main concern raises some ethical hackles.
First, how’s it work? According to NPR:
If a face is a match, the program sends an alert to staff via computer, iPad or smartphone, providing details like dress size, favorite buys or shopping history.
The software works even when people are wearing sunglasses, hats and scarves. Recent tests have found that facial hair, aging or changes in weight or hair color do not affect the accuracy of the system.
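The pipeline NPR describes — compare a captured face against a stored database, and on a match push shopper details to staff devices — can be sketched in miniature. Everything below (names, vectors, the similarity threshold) is a hypothetical illustration of that flow, not NEC's actual software:

```python
# Toy sketch of a match-then-alert flow. All identifiers, vectors,
# and thresholds are hypothetical illustrations, not NEC's system.
import math

# Hypothetical "face embeddings": each known VIP is a numeric feature
# vector stored alongside the retail details used in the staff alert.
VIP_DATABASE = {
    "vip_001": {
        "embedding": [0.9, 0.1, 0.4],
        "details": {"dress_size": "4", "favorite_buys": "scarves"},
    },
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_and_alert(embedding, threshold=0.95):
    """Return an alert payload if the face matches a known VIP, else None."""
    for vip_id, record in VIP_DATABASE.items():
        if cosine_similarity(embedding, record["embedding"]) >= threshold:
            # A real system would push this to staff via computer,
            # iPad, or smartphone, per the NPR description.
            return {"vip": vip_id, **record["details"]}
    return None  # no match: no alert is sent
```

The robustness claims in the quote (sunglasses, weight change, aging) amount to saying the embedding step maps varied photos of one person to nearby vectors — the matching step itself stays this simple.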
So be afraid, Real Housewives of Lithuania — gaining weight, hiding behind cantaloupe-sized sunglasses, or pulling an Amanda Bynes can’t stop retail associates from chasing you down with this season’s Prada handbag. The idea sounds useful and almost cute at first. After all, no one with a life can know every celebrity, and retailers shouldn’t be expected to rush home from a day of refolding wadded-up clothes and plop down with a stack of flashcards and a DVR full of all the latest shows.
A culture of fame
Plus, celebs profit handsomely from their fame. People like Paris Hilton and Kim Kardashian seem to have no discernible talent, getting paid instead for party appearances and maintaining empires built on being recognized. The recent media orgasm over the royal baby — as seen in entire articles devoted to his name, his star sign and whether he’ll be circumcised — is wearying proof that some people don’t even have privacy as a fetus. Why should people whose success, in part, is measured by how much airtime and how many magazine covers they land get some illusion of privacy?
And technically, such video surveillance is legal. According to New Media Rights, “You can legally record video without consent, unless it violates the privacy rights of others,” such as in bathrooms, dressing rooms, bedrooms and other places where you expect more privacy than usual.
Useful for us plebeians
Facial recognition software in a threat-free context can even be useful for us non-famous consumers, not just businesses. Finnish startup Uniqul is toying with a “pay by face” system that would connect your credit card and PIN to your facial features, and security company Diebold designed an ATM that authenticates users based on their face. Any seconds spared punching in your PIN are seconds you could spend, hypothetically, sleeping or watching a “Veronica Mars” marathon.
At first glance, Americans seem to overwhelmingly support public facial recognition software: “79 percent of Americans are in favor of using facial recognition at various locations and public events, and 81 percent support expanded camera surveillance on streets and in public places,” Biometric Update reports. However, the original poll question stipulated that the facial recognition was to detect suspected terrorists — not, say, help a business pick you out as a former child star and foist your favorite color scarf onto you.
Hackers and other concerns
Facial recognition is also far from hacker-proof. Apps like FastAccess Anywhere that let you unlock your smartphone or computer with your face can be rendered useless if someone has a photo of you and a little time to guess your “secret shape.” And in violation of Google’s policy, a hacker recently managed to install facial recognition software on Google Glass. Such powerful technology in the hands of ill-intentioned people is the stuff dystopian action movies are made of.
Ultimately, analyzing shoppers’ faces without their knowledge or consent, then matching them with a database, seems creepy and Big Brother-esque. Privacy is an increasingly elusive commodity. As The Atlantic wrote of facial recognition software in 2011:
No matter what you choose to do or not do, your life exists in the cloud, indexed by Google, in the background of a photo album on Facebook, and across thousands of spammy directories that somehow know where you live and where you went to high school … You may be able to change your name, … but you can’t change your face. And the cloud never forgets a face.
These concerns are echoed slightly less ominously by Jennifer Lynch of the Electronic Frontier Foundation. Although she said facial recognition can help solve crimes, it “can also perpetuate racial and ethnic profiling, social stigma, and inaccuracies throughout all systems and can allow for government tracking and surveillance on a level not before possible,” Lynch testified before the Senate. Human rights nonprofit Witness suggests in a blog post that facial recognition could be a tool for supporting human rights, but that its lack of regulation, and technology companies’ silence on the issue, are extremely worrisome. “We can — and must — all do better to make the best of this powerful new technology — and to protect us all from the worst,” Witness urges.
Helen Nissenbaum, in particular, took a strong stance on privacy in public back in 1998. P. Brey writes in a 2004 piece in the Journal of Information, Communication & Ethics in Society, “Helen Nissenbaum has argued that even if the expectation of privacy is diminished in public places, people still have justifiable privacy expectations even when they are in public. She argues that surveillance in public places that involves the electronic collection, storage, and analysis of information on a large scale often amounts to a violation of personal privacy.” As Brey writes, simply going out in public is a far cry from volunteering to be part of a police lineup — but facial recognition in public gives people no choice.
Ethical guidelines going forward
Disclosing that your business is using facial recognition software, particularly when it’s not to make anyone safer, seems essential. In a piece on the ethics of face recognition in retail, Carl Gohringer of Allevate Limited suggests it’s only the first step. “As part of the process of informing, organizations should also be direct and open in disclosing not only the existence of the systems, but the scope, intent, and purpose of the solutions,” Gohringer writes. “Why are you utilizing an individual’s biometric data? What benefit does it serve? What is the scope of the use of this data?”
Thankfully, the Federal Trade Commission, too, has proposed some ethical guidelines for companies employing facial recognition:
First, companies should maintain reasonable data security protections for consumers’ images. … Companies that store such images should consider putting protections in place that would prevent unauthorized scraping, which can lead to unintended secondary uses …
[C]ompanies using digital signs capable of demographic detection — which often look no different than digital signs that do not contain cameras — should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs …
[Further,] companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative express consent.
Oh, interesting. Because in that case, Lauren Twisselman will have to learn to recognize Mindy Kaling the way the rest of us do: by watching “The Mindy Project.”