5 Unexpected Places You Can Be Tracked With Facial Recognition Technology
Once people realized that Facebook was basically harvesting biometric data, the usual uproar over the site’s relentless corrosion of privacy ensued. Germany even threatened to sue Facebook for violating German and EU data protection laws, and a few other countries are investigating. But facial recognition technology is hardly confined to Facebook — and unlike the social networking site, there’s no “opt-out” of leaving your house.
Post-9/11, many airports and a few cities rushed to install cameras hooked to facial recognition technology, a futuristic apparatus that promised to pick out terrorists and criminals from milling crowds by matching their faces to biometric data in large databases.
Many programs were abandoned a few years later, when it became clear they accomplished little beyond creeping people out. Boston’s Logan Airport scrapped face recognition surveillance after two separate tests showed only a 61.4 percent success rate. When the city of Tampa tried to keep tabs on revelers in the city’s night-club district, the sophisticated technology was bested by people wearing masks and flipping off the cameras.
Human ingenuity aside, most facial recognition software could also be foiled by eyewear, a bad angle or somebody making a weird face. But nothing drives innovation like the promise of government contracts! In the past few years, face recognition technology has advanced substantially, moving from 2D to 3D scanning that can capture identifying information about faces even in profile. Another great leap forward, courtesy of Identix (now L-1 Identity Solutions, Inc.), combines geometric face scanning and “skinprint” technology that maps pores, skin texture, scars and other identifying facial marks captured in high-resolution photos.
As face recognition and other biometrics advance, the technology has begun to proliferate in two predictable realms: law enforcement and commerce. Here are 5 places besides Facebook you might encounter face recognition and other biometric technology — not that, for the most part, you would know it if you did.
1. The streets of America
In the fall, police officers from 40 departments will hit the streets armed with the Mobile Offender Recognition and Information System (MORIS) device. The gadget, which attaches to an iPhone, can take an iris scan from 6 inches away, a measure of a person’s face from 5 feet away, or electronic fingerprints, according to Computer Vision Central. This biometric information can be matched to any database of pictures, including, potentially, one of the largest repositories of tagged photos in existence: Facebook. The process is almost instantaneous, leaving a suspect no time to opt out of supplying law enforcement with a record of their biometric data.
Lee Tien of the Electronic Frontier Foundation told AlterNet that while it’s unclear how individual departments will use the technology, there are two obvious ways it tempts abuse. Since officers don’t have to haul in an unidentified suspect to get their fingerprints, they have more incentive to pull people over, increasing the likelihood of racial profiling. The second danger lurks in the creation and growth of personal information databases. Biometric information is basically worthless to law enforcement unless, for example, the pattern of someone’s iris can be run against a big database full of many people’s irises.
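The one-to-many matching Tien describes, running a single person's biometric template against a database holding many people's, can be sketched in a few lines. This is an illustrative toy, not the MORIS device's actual algorithm; the feature vectors, similarity threshold, and database entries here are all invented.

```python
# Toy sketch of one-to-many biometric identification: compare a probe
# template against every enrolled template and return the closest match,
# but only if it clears a similarity threshold. All values are invented.

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.95):
    """Return the best-matching identity, or None if nothing clears the bar."""
    best_id, best_score = None, threshold
    for identity, template in database.items():
        score = similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# A pretend enrollment database of biometric templates.
database = {
    "subject_001": [0.9, 0.1, 0.3, 0.7],
    "subject_002": [0.2, 0.8, 0.5, 0.1],
}

probe = [0.88, 0.12, 0.31, 0.69]  # nearly identical to subject_001
print(identify(probe, database))  # prints "subject_001"
```

The point of the sketch is Tien's: the `identify` step is worthless without a large `database` to search, which is exactly why these systems create pressure to enroll ever more people.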
In an extensive report on the MORIS device, Al-Jazeera’s D. Parvaz asked the president of a company that develops facial recognition software how he feels about equipping the government and law enforcement with the technology. He replied (chillingly), “I’m counting on our government being honest, whether it’s law enforcement or the military, trying to find people who threaten our lives.”
But the article highlights an inherent legal problem in the MORIS device, regardless of the no doubt uniformly angelic intentions of law enforcement officials. The Fourth Amendment guards against unreasonable searches, a category that includes fingerprinting. Like a fingerprint, an iris scan reveals identifying information that can’t be gleaned from mere observation. Parvaz’s interview with a member of the Plymouth County Sheriff’s office seems to show that addressing the civil liberties hazards of MORIS is not at the top of law enforcement’s priorities:
John Birtwell, the director of public information and technology at the Plymouth County Sheriff’s Department told Al Jazeera that the county will get “more than a handful … at least three” of the devices.
But that’s just about all the certainty Birtwell had to offer on the topic, as he seemed unclear as to whether officers would inform suspects of their Fourth Amendment rights to refuse to undergo impromptu fingerprinting and iris scanning.
He also seemed unsure as to what the protocol would be in the event that a suspect declined to be processed in such a manner.
“I’m dancing on the head of a pin here because I’m not a constitutional scholar,” said Birtwell.
Other law enforcement officials have more clearly articulated ambitions for the technology — like hunting down undocumented immigrants.
In a June “Fox and Friends” segment on the MORIS device, Sheriff Paul Babeu of Pinal County, Arizona explained his enthusiasm for the new technology. “In Arizona, the illegal immigration issue — we have people from foreign countries, hundreds and hundreds of thousands of them that deliberately have very good documents that are fake, fraudulent, and we need to find out who they are, not only for the safety of my deputies but for the protection of our citizens all across America.”
(“We’ve all heard of racial profiling. Now get ready for what some are calling ‘facial profiling,'” deadpanned “Fox and Friends” host Steve Doocy at the start of the show, completely inadvertently making a very good point.)
It’s important to note that the military has used similar technology in Afghanistan and Iraq for years. One in 20 people in Afghanistan is registered in biometric databases (one in six men of fighting age), according to recent reporting by the New York Times. It’s one in 14 in Iraq (and one in four men of fighting age).
The technology is also being put to use in the aftermath of the London riots, both by law enforcement and an online group assembled to hunt down people involved in the riots by using social networking sites. (London is one of the most heavily surveilled cities in the world.)
2. The DMV
Slightly fewer than half of the DMVs in the US have the capacity to run your picture through biometric databases. Ostensibly, these searches are intended to catch people trying to collect multiple IDs from different states. Fair enough. But as EFF’s Lee Tien told AlterNet, the DMV can also run a person’s face against any government database, including ones that hold criminal records. Last August, former New York Gov. David Paterson and DMV commissioner David Swarts held a triumphant news conference where they announced that more than 100 felony arrests had been made through the DMV’s facial recognition program.
In the past, the FBI has applied facial recognition technology to the DMV’s vast database of photo images in pursuit of suspects, according to the AP.
When the California DMV tried to acquire facial recognition technology in 2009, privacy and consumer advocates fought the agency on the grounds that such a massive shift in private data handling required public debate (the DMV had been trying to stealthily strike a deal with the vendor). As SecurityInfoWatch reported at the time, privacy advocates argued that there was no way to ensure the technology would not also be used to track and monitor anyone:
“…. the five-year contract, which is being fast-tracked and could be approved as early as next month, is drawing objections from privacy advocates who fear state and local authorities could use the biometric technology to monitor the movements of ‘innocent people’ — for instance, spectators at a sporting event or an anti-war rally.
‘We see this as sort of creeping Big Brother government, an invasion of people’s privacy,’ said Richard Holober, executive director of the San Mateo-based Consumer Federation of California.”
If facial recognition technology in the hands of the DMV sounds like the makings of someone’s Kafkaesque mistaken-identity nightmare, it is. The unlucky John H. Gass of Massachusetts had to spend 10 days proving to the Massachusetts DMV that he had not committed ID fraud after facial recognition technology mistakenly flagged his photo because he resembled another man.
3. Las Vegas casinos, and Kraft and Adidas stores
For years Las Vegas casinos have used various forms of facial recognition to identify card-counters. Now, Vegas is at the forefront of efforts to adapt facial recognition to more efficiently suck money out of visitors.
The LA Times reported last week that the Venetian hotel and casino has installed basic facial recognition software in advertisements. A camera captures an image of a person passing by and an algorithm determines their gender and rough age. The advertisement can then present them with products most likely to appeal to their demographic.
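The Venetian setup boils down to a lookup: estimate a demographic bucket from the camera frame, then serve whatever ad is mapped to that bucket. A minimal sketch of that logic follows, with the actual gender/age classifier stubbed out and every bucket name and ad title invented for illustration.

```python
# Sketch of demographic ad targeting as described for the Venetian's
# signage. In a real kiosk, gender and age would come from a trained
# classifier running on a camera frame; here they arrive as arguments.
# All bucket names and ad titles are made up.

AD_INVENTORY = {
    ("female", "18-34"): "fashion outlet promo",
    ("female", "35+"): "spa package promo",
    ("male", "18-34"): "nightclub promo",
    ("male", "35+"): "golf resort promo",
}
DEFAULT_AD = "restaurant promo"  # fallback when no bucket matches

def age_bucket(age):
    """Collapse an estimated age into a coarse marketing bucket."""
    return "18-34" if age < 35 else "35+"

def pick_ad(gender, age):
    """Map an estimated (gender, age) pair to the ad most likely to appeal."""
    return AD_INVENTORY.get((gender, age_bucket(age)), DEFAULT_AD)

print(pick_ad("female", 27))  # prints "fashion outlet promo"
```

Note how little the system needs to know: a rough demographic guess is enough to drive the whole pipeline, which is why such coarse classifiers are commercially viable.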
Targeted ads are the holy grail of marketing. If you’re an advertiser, you don’t want to waste the priceless real estate of a teen boy’s brain with an ad for, say, tampons, so advertisers are constantly trying to figure out new ways to deliver the right ads to the right people. Thanks to tools that let companies track web surfing history and the detailed personal information featured on certain giant social networking sites, the digital world provides the best venue for targeted ads.
LA Times reporters Shan Li and David Sarno also got Kraft and Adidas to go on the record about their future plans to install the technology in ads and store kiosks:
“If a retailer can offer the right products quickly, people are more likely to buy something,” said Chris Aubrey, vice president of global retail marketing for Adidas.
Kraft said it’s in talks with a supermarket chain, which it would not identify, to test face-scanning kiosks.
“If it recognizes that there is a female between 25 to 29 standing there, it may surmise that you are more likely to have minor children at home and give suggestions on how to spice up Kraft Macaroni & Cheese for the kids,” said Donald King, the company’s vice president of retail experience.
While these tools extract only very basic personal information, their potential seems limitless. Really, how tough would it be for more sophisticated technology to match a photo to someone’s public Facebook profile, and determine in the process their marital status, sexuality, hometown, politics, religious beliefs and any number of personality signifiers compiled online, thrusting their digital lives into physical space?
4. Bars
Inevitably, facial recognition software is also being deployed for the purpose of getting people laid. SceneTap, an app developed by a Chicago company, uses information from facial recognition cameras planted in bars to determine the ratio of women to men and the average age of customers. As of June, 200 bars across the country had signed up to take part, according to Forbes.
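The numbers SceneTap publishes, the women-to-men ratio and the average customer age, are simple aggregates once faces have been detected and classified. A toy sketch of that aggregation, with the per-face detections invented (SceneTap's actual pipeline is not public):

```python
# Toy aggregation of per-face (gender, estimated_age) guesses into the
# crowd statistics a SceneTap-style app would display. The detections
# below are invented for illustration.

def crowd_stats(detections):
    """detections: list of (gender, estimated_age) tuples from the cameras."""
    women = sum(1 for gender, _ in detections if gender == "female")
    men = sum(1 for gender, _ in detections if gender == "male")
    avg_age = sum(age for _, age in detections) / len(detections)
    ratio = women / men if men else float("inf")
    return ratio, avg_age

detections = [("female", 24), ("male", 27), ("female", 31), ("male", 26)]
ratio, avg = crowd_stats(detections)
print(f"women:men = {ratio:.1f}, average age = {avg:.1f}")
# prints "women:men = 1.0, average age = 27.0"
```

The privacy question is less about these coarse aggregates than about what happens to the underlying per-face images, which is exactly the assurance the developers address below.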
SceneTap developers assured reporters that the cameras they’re installing in bars do not capture high-enough-quality images to match them up to databases or Facebook profiles.
Meanwhile, the East Bay Express reported that bars throughout the Bay Area were actually streaming video to an app called “BarSpace” that lets people check out the bar in real time — so presumably anyone with an iPhone could easily check where you are and who you’re drunkenly flirting with without you knowing it. The investigation found that most bar patrons are not aware they’re being filmed. Is an app wedding SceneTap’s face recognition technology to BarSpace coming down the pike?
This is not the first time biometric tools have invaded bars. In 2006, a program called BioBouncer let bouncers take pictures of incoming patrons and scan them against a database to pick out troublemakers. Bar owners shared a large database of information. According to the company behind the technology, information about law-abiding bar patrons would get dumped at the end of the night, reported Wired. Of course, there was no way to guarantee that would remain true indefinitely. Or guarantee that bar owners wouldn’t share the info with the police, or with private investigators, or with data collection companies, as security expert Bruce Schneier pointed out at the time.
5. All of Japan
As far as commercial uses of facial recognition technology go, Japan is way ahead of the curve. So here are some things we may be looking forward to:
a) Vending machines: Japanese vending machines suggest soft drinks based on stereotypes about your gender and age (and the weather).
b) Billboards: Japanese billboards contain technology that figures out a person’s sex and age to within 10 years, and presents them with the appropriate advertising.
c) Truck stops: A truck stop uses facial recognition to gauge the alertness of drivers.
d) Hotels and restaurants: NTDtv reports that Omron, a Japanese technology company, equips hotels and restaurants with technology that lets them flag VIP guests.
e) Service work: According to Reuters, Omron also offers a “smile-scan” system that lets service companies ensure their employees evince the appropriate level of enthusiasm on the job.
While there’s nothing inherently wrong with advances in biometrics, there are also no inherent limits on their use and abuse, as EFF’s Tien points out. So it’s important to always ask who’s controlling the cameras and the databases, and for what purpose.