Facial recognition technologies and racial bias

The use of facial recognition by law enforcement to identify suspects raises concerns because of high error rates and striking racial disparities in how accurately the algorithms identify people. The darker the skin, the higher the incidence of errors: up to nearly 35 percent for images of darker-skinned women, according to a 2018 study by Joy Buolamwini, a researcher at the M.I.T. Media Lab. When the person in the photo is a white man, the software is right 99 percent of the time.

Lunchtime Conversations:

  • According to the Washington Post, the FBI and ICE are using driver’s license databases from 21 states for facial recognition searches, the foundation of a massive surveillance infrastructure. Photos of millions of Americans are being used without their knowledge or consent. Why is this happening without the approval of state legislators or individual license holders?
  • In cities such as Baltimore, police have used facial recognition software to identify and arrest individuals at protests. How can regulators and legislators ensure that the use of facial recognition technologies balances public safety with residents’ privacy rights?
  • For biased AI systems, should we be asking: Who has the power? Who is harmed? Who benefits? And ultimately, who gets to decide how these tools are built and for what purpose?

Noteworthy:

  • The National Institute of Standards and Technology (NIST) reported in 2018 that the best algorithms got 25 times better at finding a person in a large database between 2010 and 2018. That improvement has helped drive widespread use of facial recognition in government, commerce, and devices like the iPhone.
  • Despite the growing debate, facial recognition is already embedded in many federal, state, and local government agencies, and it’s spreading. The US government uses facial recognition for tasks like border checks and finding undocumented immigrants.
  • Some civil liberties advocates, lawmakers, and policy experts want government use of the technology to be restricted or banned, as it recently was in San Francisco, Somerville, MA, and Oakland, CA. Their concerns include privacy risks, the balance of power between citizens and the state, and racial disparities in results.

Storyline:

Recent studies show how real-world biases can seep into artificial intelligence (AI), the computer systems that power facial recognition. Data defines modern AI: the software is only as smart as the data used to train it. If the training data contains many more white men than black women, the system will be worse at identifying black women.
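One place this shows up concretely is in the composition of the training data itself, which can be audited before any model is trained. The sketch below is a minimal, hypothetical example in Python: the records and field names (skin_tone, gender) are invented for illustration and do not come from any real dataset.

```python
from collections import Counter

# Hypothetical training-set metadata; records and field names are invented.
training_metadata = [
    {"skin_tone": "lighter", "gender": "male"},
    {"skin_tone": "lighter", "gender": "male"},
    {"skin_tone": "lighter", "gender": "female"},
    {"skin_tone": "darker",  "gender": "male"},
    {"skin_tone": "darker",  "gender": "female"},
    # ... a real audit would load millions of records from the dataset
]

counts = Counter((r["skin_tone"], r["gender"]) for r in training_metadata)
total = sum(counts.values())

# Groups that are barely represented here will tend to be the groups
# the trained model recognizes least reliably.
for (skin, gender), n in counts.most_common():
    print(f"{skin} {gender}: {n} images ({100 * n / total:.0f}% of training data)")
```

This kind of composition check is essentially the logic behind audits like Buolamwini’s: if one demographic group dominates the training and benchmark data, error rates tend to skew the same way.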

French company Idemia’s facial recognition software serves police in the US, Australia, and France, scanning faces by the millions. In 2017, a senior FBI official told Congress that a facial recognition system that uses Idemia technology to analyze 30 million faces helps “safeguard the American people.”

However, July test results from NIST indicated that two of Idemia’s latest algorithms were significantly more likely to mix up black women’s faces than those of white women, or black or white men.

The NIST test required algorithms to verify that two photos showed the same face, as a border agent would do when checking passports. At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, they falsely matched black women’s faces about once in 1,000, roughly 10 times more often.
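To make that threshold arithmetic concrete, here is a minimal simulation sketch in Python. Everything in it is assumed for illustration: the impostor-pair score distributions are invented Gaussians whose parameters were chosen only so that a threshold calibrated to a one-in-10,000 false match rate on one group produces roughly a one-in-1,000 rate on the other. Real similarity-score distributions are not Gaussian, and these are not Idemia’s numbers.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical similarity scores for impostor pairs (photos of two different
# people) in two demographic groups. Parameters are invented for illustration.
scores_a = rng.normal(loc=0.30, scale=0.10, size=1_000_000)  # e.g., white women
scores_b = rng.normal(loc=0.36, scale=0.10, size=1_000_000)  # e.g., black women

# Calibrate the match threshold so group A's false match rate (FMR)
# is one in 10,000, mirroring the sensitivity setting described above.
threshold = np.quantile(scores_a, 1 - 1e-4)

fmr_a = (scores_a >= threshold).mean()  # ~1e-4 by construction
fmr_b = (scores_b >= threshold).mean()  # higher: same threshold, shifted scores

print(f"group A FMR: {fmr_a:.1e}")
print(f"group B FMR: {fmr_b:.1e}")
print(f"gap: {fmr_b / fmr_a:.0f}x")    # roughly a 10-fold difference
```

The mechanism is the point of the sketch: a single global threshold is calibrated against one group’s score distribution, so any group whose impostor scores sit even slightly higher crosses that threshold far more often out in the tail.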

Despite advances in the technology, NIST’s tests and other studies have repeatedly found that the algorithms have a harder time recognizing people with darker skin. The agency’s July report covered tests on code from more than 50 companies. Many top performers in that report show performance gaps similar to Idemia’s 10-fold difference in error rates between black and white women.

Sources: The New York Times, The Wall Street Journal, The Washington Post, Wired, Vox/Recode, Vox

Photo by Rob Sarmiento on Unsplash