Busted: Rite Aid Deployed Facial Recognition Technology Disproportionately In Black Neighborhoods

Written by Ann Brown
Busted: Pharmacy chain Rite Aid deployed facial recognition technology disproportionately in Black and poor neighborhoods. Image: Mike MacKenzie/Flickr/Creative Commons

Tristan Jackson-Stankunas has experienced firsthand the racial bias of facial recognition technology.

A customer at Rite Aid, Jackson-Stankunas was accused of being a shoplifter in a Los Angeles store based on someone else’s photo. While shopping for air freshener in September 2016, he was ordered by a manager to leave the store. The manager told him he had received a security image of Jackson-Stankunas, taken in 2013 at another Rite Aid from which he had allegedly stolen goods.

Jackson-Stankunas saw the photo on the manager’s phone and told Reuters he had nothing in common with the person except their race: Both are Black.

“The guy looks nothing like me,” Jackson-Stankunas said. He was ultimately allowed to make his purchase and leave the store. Rite Aid “only identified me because I was a person of color. That’s it,” he said.

Jackson-Stankunas, 34, filed a complaint with the California Department of Consumer Affairs after the incident.

If you have ever shopped at a Rite Aid and felt you were being watched, you might just have been. And it wasn’t by a physical security guard. It could have been by a facial recognition system the pharmacy chain quietly set up in hundreds of U.S. stores, largely those located in lower-income, non-white neighborhoods.

The retailer deployed DeepCam, a facial recognition system. After Reuters launched an investigation into the technology, the company said it had ended the surveillance program.

In its New York and metro Los Angeles stores, Rite Aid installed facial recognition technology, including a state-of-the-art system from a company with links to China, Reuters reported.

This wasn’t something new. The company had used the technology for more than eight years in 200 stores across the U.S., according to a Reuters investigation launched in February. “And for more than a year, the retailer used state-of-the-art facial recognition technology from a company with links to China and its authoritarian government,” Reuters reported.

Rite Aid confirmed to Reuters the existence and breadth of its facial recognition program. It also defended the use of the technology, saying it was intended to deter theft, protect staff and customers from violence, and wasn’t about race.

When Reuters sent its findings to the retailer, Rite Aid said that it had stopped using its facial recognition software and that all the cameras had been turned off.

“This decision was in part based on a larger industry conversation,” the company told Reuters in a statement, adding that “other large technology companies seem to be scaling back or rethinking their efforts around facial recognition given increasing uncertainty around the technology’s utility.”

Rite Aid declined to disclose which locations used the technology. In its investigation, Reuters found facial recognition cameras at 33 of the 75 Rite Aid stores in Manhattan and the central Los Angeles metropolitan area during one or more visits from October through July. These cameras were placed mainly in Black and poorer neighborhoods.

The cameras, which were easily seen hanging from the ceiling on poles near store entrances and in cosmetics aisles, matched facial images of customers entering a store against those of people Rite Aid had previously observed engaging in potential criminal activity. If there was a match, an alert was sent to security agents’ smartphones.

In a statement to Reuters, Rite Aid said that customers had been apprised of the technology through “signage” at the shops, as well as in a written policy posted this year on its website. However, Reuters reporters found no notice of the surveillance in more than a third of the stores they visited with facial recognition cameras.

Among the 75 stores Reuters visited, those in areas that were poorer or less white were much more likely to have the equipment, the news agency’s statistical analysis found.

“Seventeen of 25 stores in poorer areas had the systems. In wealthier areas, it was 10 of 40. (Ten of the stores were in areas whose wealth status was not clear. Six of those stores had the equipment.),” Reuters reported.

Reuters’ investigation illustrates “the dire need for a national conversation about privacy, consumer education, transparency, and the need to safeguard the Constitutional rights of Americans,” said Carolyn Maloney, the Democratic chairwoman of the House oversight committee, which has held hearings on the use of facial recognition technology.

Rite Aid isn’t the only retailer using facial recognition technology.

“There are a handful of retailers that have made the decision, ‘Look, we need to leverage tech to sell more and lose less,’” said Read Hayes, director of the Loss Prevention Research Council, a retailer-supplier coalition.

Home Depot admitted it had been testing facial recognition to reduce shoplifting in at least one of its stores but stopped the trial this year. Walmart tried out facial recognition in a handful of stores. Independent 7-Eleven franchise owners in Virginia conducted trials of the software starting in 2018 and later dropped it. 


The main problem with facial recognition technology is that it is far less accurate at identifying Black people, as several studies have shown.

Facial recognition technology is racially biased. It has misidentified Black people and people of color more often than white people. A landmark federal study by the National Institute of Standards and Technology found that Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search.

“The faces of African-American women were falsely identified more often in the kinds of searches used by police investigators where an image is compared to thousands or millions of others in hopes of identifying a suspect,” The Washington Post reported.

Because of this, there has been a pushback against facial recognition technology. “Concerns over the unregulated use of facial recognition in the U.S., both by law enforcement and private companies, has been steadily growing over the last few years, fueled by studies that show the tech in its current form to be inherently flawed and more likely to misclassify the gender and identity of Black individuals,” The Verge reported.