Researchers say Amazon face-detection technology shows bias

By Associated Press
Jan. 25, 2019

NEW YORK — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits.

The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none.

Artificial-intelligence systems can mimic the biases of their human creators as they make their way into everyday life. The new study, released late Thursday, warns of the potential for abuse and the threats to privacy and civil liberties posed by facial-detection technology.

Matt Wood, general manager of artificial intelligence with Amazon’s cloud-computing unit, said Amazon has updated its technology since the study and done its own analysis with “zero false positive matches.”
