
Rite Aid Faces AI Facial Recognition Ban After Falsely Tagging POC

Rite Aid will be banned from using AI-powered facial recognition technology for five years because its use disproportionately impacted people of color.

The US pharmacy chain deployed AI-based facial recognition technology from 2012 to 2020 to identify shoplifters.

However, the Federal Trade Commission (FTC) filed a complaint against the company.

Rite Aid’s Facial Recognition Tech

According to the FTC, Rite Aid used facial recognition technology in hundreds of its retail pharmacy locations to identify patrons it had previously deemed likely to engage in shoplifting or other criminal behavior.

The technology captured images of consumers in Rite Aid's drugstores, and the company compiled a database of those it identified as engaging in suspicious behavior.

The database included accompanying information, such as names, birth years, and criminal or dishonest behavior details.

When the technology flagged a shopper as a match for an entry in this watchlist database, it sent an alert to store employees.

The FTC alleged that, in numerous instances, these match alerts were false positives that led to increased surveillance of shoppers and public accusations against them.
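To make the alleged workflow concrete, here is a minimal, hypothetical sketch of how a watchlist match-alert pipeline of this general kind can work. It is not Rite Aid's actual system; the embeddings, similarity measure, threshold, and data structures are assumptions for illustration only.

from dataclasses import dataclass

# Hypothetical illustration only -- NOT Rite Aid's actual system.
# Pattern described in the complaint: a watchlist of enrolled faces,
# a similarity (confidence) score for each captured shopper image,
# and an alert to staff when a score clears a threshold.

@dataclass
class WatchlistEntry:
    person_id: str
    embedding: list[float]   # stored face embedding
    notes: str               # e.g., name, birth year, alleged behavior

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def match_alerts(shopper_embedding: list[float],
                 watchlist: list[WatchlistEntry],
                 threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return (person_id, confidence) pairs that clear the alert threshold."""
    alerts = []
    for entry in watchlist:
        confidence = cosine_similarity(shopper_embedding, entry.embedding)
        if confidence >= threshold:
            alerts.append((entry.person_id, confidence))
    return alerts

Any alert that clears the threshold can still be a false positive; lowering the threshold produces more alerts, and a larger share of them are wrong, which is the dynamic at issue in the complaint.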

The Effects On People Of Color

“Rite Aid failed to take reasonable measures to prevent harm to consumers from its use of facial recognition technology,” the complaint stated.

“Rite Aid’s failures caused and were likely to cause substantial injury to consumers, and especially to Black, Asian, Latino and women consumers.”

The complaint stated that although 80% of Rite Aid stores are in plurality-white areas, about 60% of the stores that used facial recognition technology were in plurality non-white areas.

As a result, store patrons in Black, Asian, and Latino areas were more likely to be subjected to surveillance by the technology.

Compounding this, many currently available facial recognition technologies produce more false-positive matches for Black and Asian subjects than for white subjects.

Match alerts generated in stores located in areas where the plurality of the population was Black or Asian were significantly more likely to have low confidence scores than alerts generated in stores located in plurality-white areas.
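As a rough illustration of the comparison the FTC describes, the following sketch tallies the share of low-confidence alerts by store area. The records, area labels, and the 0.7 cutoff are entirely invented for illustration and do not come from the complaint.

from collections import defaultdict

# Hypothetical illustration only: made-up alert records used to show how
# one could compare the share of low-confidence match alerts across
# store areas, the kind of disparity the FTC complaint describes.

LOW_CONFIDENCE_CUTOFF = 0.7  # assumed cutoff for "low confidence"

# Each record: (store_area, alert_confidence). All values are invented.
alerts = [
    ("plurality_black_or_asian", 0.62),
    ("plurality_black_or_asian", 0.91),
    ("plurality_black_or_asian", 0.66),
    ("plurality_white", 0.88),
    ("plurality_white", 0.93),
    ("plurality_white", 0.69),
]

totals = defaultdict(int)
low = defaultdict(int)
for area, confidence in alerts:
    totals[area] += 1
    if confidence < LOW_CONFIDENCE_CUTOFF:
        low[area] += 1

for area in totals:
    rate = low[area] / totals[area]
    print(f"{area}: {rate:.0%} of alerts were low confidence")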

What Is The Solution?

The FTC’s proposed order would require Rite Aid to implement comprehensive safeguards to prevent future customer harm.

It would require Rite Aid to stop using such technology, delete any images or photos it has collected, and direct third parties to remove them as well.

Rite Aid is going through bankruptcy proceedings, so the order will take effect once it is approved by the courts.

“We are pleased to reach an agreement with the FTC and put this matter behind us,” Rite Aid said in a statement.

“However, we fundamentally disagree with the facial recognition allegations in the agency’s complaint. Rite Aid stopped using the technology in this small group of stores more than three years ago.”


Feature Image Credit: Reuters

