Short Wave: Tech Companies Are Limiting Police Use of Facial Recognition. Here's Why

Earlier this month, IBM said it was getting out of the facial recognition business.

Then Amazon and Microsoft announced that they would bar law enforcement from using their facial recognition technology. Nationwide protests have opened the door to a conversation about how police should use these systems, amid growing evidence of gender and racial bias baked into the algorithms.

Today on the show, Short Wave host Maddie Sofia and reporter Emily Kwong speak with AI policy analyst Mutale Nkonde about algorithmic bias — how facial recognition software can discriminate and reflect the biases of society.

Nkonde is the CEO of AI For the People, a fellow at the Berkman Klein Center for Internet & Society at Harvard University, and a fellow at the Digital Civil Society Lab at Stanford University.

Articles mentioned in this episode:

NPR reporter Bobby Allyn's coverage of IBM and Amazon halting police use of facial recognition technology

Joy Buolamwini and Timnit Gebru's 2018 MIT research project, "Gender Shades"

The National Institute of Standards and Technology's (NIST) 2019 study, "Face Recognition Vendor Test, Part 3: Demographic Effects"

Email the show at shortwave@npr.org.

This episode was produced by Brit Hanson, fact-checked by Berly McCoy, and edited by Viet Le.

 

Short Wave Podcast

It’s science for everyone, using a lot of creativity and a little humor. Join host Maddie Sofia for science on a different wavelength.


Technology Topics: Facial Recognition System
Social Studies Topics: Ethics, Government and Political Science
Grade Levels: Middle School, High School (7th Grade, 8th Grade, 9th Grade, 10th Grade, 11th Grade, 12th Grade, Adults)

Organization: Short Wave (NPR)

Type of Resource: Podcast

Assigned Categories