Why facial recognition technology is racist, invasive, dangerous
Friday, June 28, 2019 @ 1:11 PM | By Jill Presser
First: facial recognition tools are not very accurate. They misidentify at a very high rate. The most accurate facial recognition tool in 2017, a Chinese tool called Tencent YouTu Lab, correctly identified faces in a test challenge only 83.29 per cent of the time. This means that across a large sample, even the most accurate facial recognition tool would be wrong about one time in six.
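The "one time in six" figure follows directly from the reported accuracy rate. A quick sketch of the arithmetic (illustrative only, not part of the original test):

```python
# Arithmetic behind the "wrong about one time in six" claim.
accuracy = 83.29 / 100            # best reported accuracy in the 2017 test
error_rate = 1 - accuracy         # about 0.167, i.e. roughly 16.7 per cent
one_in_n = round(1 / error_rate)  # about one misidentification per 6 attempts

print(f"error rate: {error_rate:.4f}, roughly 1 in {one_in_n}")
```

In other words, even the best-performing tool of that year would be expected to misidentify roughly one face in every six it examined.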
Even as the technology improves, the risk of misidentification remains high because recognizing faces is not a hard empirical science. Unlike identifications based on DNA, which rest on a scientifically tested and validated statistical foundation, we do not know how facial characteristics vary across the human population. Making identifications based on facial characteristics has not been validated using scientific methods. Facial recognition is more art than science.
This means we risk wrongful arrests, detentions and even potential wrongful convictions when facial recognition tools are used by law enforcement. However, given the privacy-destroying and other concerns arising from this technology, the answer is not to improve its accuracy or develop better science about the incidence of facial characteristics. As privacy scholar Professor Woodrow Hartzog has noted, facial recognition is harmful when it is inaccurate, and incredibly oppressive the more accurate it gets. The answer is to ban the use of this technology in law enforcement.
The second problem with facial recognition tools is that they are deeply biased. A number of studies have found that most facial recognition tools are more accurate for male faces and for white faces (see, for example, MIT's Gender Shades study and the ACLU's Amazon's Face Recognition Falsely Matched 28 Members of Congress With Mugshots).
The darker one's skin, the more likely one is to be misidentified by facial recognition tools. This means that law enforcement use of facial recognition tools is likely to reinforce, incorrectly, the over-representation of non-white individuals in the justice system.
A third very serious problem is that facial recognition tools in policing depend on the state maintaining massive databases of biometric data about us, including images of our faces, connected to our names and identities.
The state should not be maintaining databases of our biometric data linked to our identities, at least not for law enforcement purposes. Our images are uniquely our own. As the Supreme Court recently recognized in R. v. Jarvis, 2019 SCC 10, people can have a reasonable expectation of privacy in their own image. The state should not be able to collect and store our faces to potentially use them against us.
The fourth and perhaps most troubling problem arising from law enforcement use of facial recognition tools is rights-destructive overreach. These tools make possible constant state surveillance of private citizens, everywhere we go in public places, all the time. Facial recognition implies an Orwellian level of surveillance. It would destroy any concept of privacy as anonymity in public places. Anonymity was recognized by the Supreme Court of Canada as attracting constitutional protection under s. 8 of the Charter in R. v. Spencer, 2014 SCC 43.
Anonymity is an extremely important kind of privacy in a democracy. Professors Hartzog and Evan Selinger call it obscurity: the kind of privacy people enjoy in public places. They explain that it is essential for freedom and democracy to flourish (see Why You Can No Longer Get Lost in the Crowd).
The right to be obscure, or anonymous in public, is a precursor to the exercise of other freedoms and rights. We only exercise freedom of association, freedom of expression and freedom of religion when we know we can do so without fear of state surveillance, oppression or persecution.
Facial recognition tools in the hands of law enforcement threaten to destroy the anonymity or obscurity that is essential to the enjoyment of fundamental rights and freedoms. They threaten the very fabric of civil society and democracy itself.
We must reclaim our right to be just another face in the crowd.
This is part two of a two-part series. Part one: Perils of facial recognition algorithmic tools in policing
Jill R. Presser is a Toronto lawyer practising criminal defence, digital rights and AI law. Contact her at firstname.lastname@example.org.