Our government is moving toward facial recognition technology, but where will it take us?
Recently, U.S. Customs and Border Protection unveiled the latest tool in its evolving toolkit to identify and stop criminals and terrorists at the border. Its new facial comparison biometric system compares the traveler standing at the checkpoint to his or her passport photo. In this case, a 26-year-old man pretending to be French and traveling from Brazil didn't make it into the country. Instead, his actual Republic of Congo identification was found hidden in his shoe.
Score one for the good guys, right? Sort of. The initial premise is promising: It's comparison, not recognition. That distinction avoids many of the privacy pitfalls of any kind of national facial recognition database. Except you have to read to the very end of the CBP press release to see what is buried at the bottom.
“CBP is assessing the use of biometric technology as part of a future end-to-end process, from check-in to departure, in which travelers use biometrics instead of their boarding pass or ID throughout the security and boarding processes.”
If this sounds familiar, it has its roots in modern smartphone technology and the Apple iPhone X. Face ID is the feature that lets you forget about your password (which many of us just plain forget) and use your face to log in. Except the phone, and Apple, have been accused of racism by numerous Chinese consumers.
One report describes how a husband in Shanghai bought his wife the new iPhone X, only to discover it could be unlocked by their son. In another case, a Chinese woman from Nanjing returned one phone after she found out her female colleague could also unlock it. Even after she got her money back and a second phone, the result was the same.
Is it possible for machines, inanimate objects, to discriminate? Numerous college courses in computer science and machine learning are attempting to answer that question. At UC Berkeley, a fall 2017 course was entitled “Fairness in Machine Learning.” Topics included “Hand Chapter 4, failure points of measurement” and “Critiques of algorithmic decision making.”
Algorithms are only as impartial as the developers who write them. This is not to say developers are inherently racist or biased, but sometimes you don’t know what you don’t know.
Sometimes the algorithms and technology are simply ineffective. A recent report from the South Wales Police in the United Kingdom showed an astounding 92 percent failure rate for facial recognition deployed at the UEFA Champions League Final in June 2017.
Out of a database of more than 500,000 images and a crowd of 170,000 attendees, the system flagged 2,470 potential matches. It sounds like a lot of hooligans were attending until you see how many were correctly identified: 173. The other 2,297 alerts were false positives. Even AAA baseball players tend to have a better batting average.
The UK isn’t the only Commonwealth country with troubles. In the run-up to the Commonwealth Games on the Australian Gold Coast in April 2018, Queensland’s privacy commissioner took issue with the proposed use of facial recognition technology to “pick out suspected terrorists.” He compared the level of intrusion to how China and Russia routinely ignore privacy concerns.
Games Chairman Peter Beattie said he “totally supported temporary police search powers that in another life 30 years ago, I would have been out protesting against.” Why? Beattie replied “The reality is Osama bin Laden changed the world about privacy. As a lawyer, I can tell you I value personal freedoms, people’s civil liberties, but they’re better off alive than dead. So if you want to get on [with] a backpack on a train in Roma Street, you are very likely to be searched before you get to Helensvale, and I make no apology for that.”
That concern isn’t stopping China from assembling a facial recognition database of its 1.3 billion citizens. The goal is to identify any citizen within seconds at a 90 percent accuracy rate. The sheer magnitude of the problems to be solved will keep that goal from becoming a reality for years; some of the researchers said the development was “encountering many difficulties due to the technical limits of facial recognition technology and the large population base.”
China says the system is being developed for security and government uses, like tracking wanted suspects and for public administration. But “public administration” isn’t exactly what it appears to be: it also covers monitoring your use of toilet paper.
“Some public lavatories in Beijing also use facial recognition so that the automatic dispensing machines will deny toilet paper to people who ask for it more than once within a given period.” I’m not sure I want to live in a country that watches me in the lavatory, much less the bedroom.
I have no doubt the CBP isn’t interested in monitoring the bathrooms, but the underlying issues are the same. While there are 1.3 billion people in China, a record 965 million passengers flew on U.S. and foreign airlines serving the United States in 2017, according to the U.S. Department of Transportation. That brings me back to another paragraph buried at the end of the CBP press release.
“CBP is committed to its privacy obligations and has taken steps to safeguard the privacy of all travelers. CBP published several Privacy Impact Assessments, employs strong technical security safeguards, and limited the amount of personally identifiable information used in the new biometric process.”
Facial recognition is not yet a proven technology. It’s still replete with false positives, racial and cultural blind spots, and fragmented privacy policies. The question isn’t whether the technology will get there; it will. The question is: do we want to be like China?
Morgan Wright is an expert on cybersecurity strategy, cyberterrorism, identity theft and privacy. He previously worked as a senior advisor in the U.S. State Department Antiterrorism Assistance Program and as senior law enforcement advisor for the 2012 Republican National Convention. Follow him on Twitter @morganwright_us.