
Police Rely Increasingly On Facial Recognition Software With Catastrophic Results

Willie Lynch says he was unfairly tried because of facial recognition software that law enforcement agencies increasingly rely on to charge suspects, despite a lack of transparent standards for its use.

The technology notoriously fails Black people. It is part of a bigger problem, technology's diversity problem: algorithms written mostly by white men are biased and perform poorly across different skin colors, skin shades and genders.

The Jacksonville Sheriff’s Office used facial recognition software in 2016 to try to identify a drug dealer from whom undercover officers had bought crack. Based on smartphone photos the officers took of the suspect, Lynch was arrested, tried and sentenced to eight years.

The technology’s flaws make Lynch’s conviction questionable. That his accusers used it as the basis for identifying and arresting him makes the conviction more questionable still, according to a report in Logic Mag.

Depositions taken in Lynch’s appeal detailed how the sheriff’s office crime analyst did not understand the algorithms driving the software’s matching process, the Florida Times-Union reported in May 2017.

The software scans Americans with no criminal histories via driver’s license and ID photo databases. At a Congressional hearing in March 2017, lawmakers raised concerns about the technology’s accuracy, invasion of privacy and the potential for abuse of an unregulated system.

In the U.S., where 50 percent of us have our faces in databases that may be available to law enforcement, algorithmic bias has had tragic consequences for Black people and people of color, said Brian Brackeen, founder of Miami facial recognition firm Kairos.

There is insufficient training data to teach the algorithms to distinguish women from men, and people with darker skin tones from those with lighter ones, according to Brackeen, who spoke on a panel at SXSW 2018.

Brackeen told Moguldom he plans to produce a data set of people who have traditionally been marginalized and make it available to the world “so that all of us can create better algorithms that reflect all humanity.”

An absence of diverse data sets is one of the challenges that developers face, said Clare Garvie, a senior associate at the Center on Privacy & Technology at Georgetown Law, in a Moguldom interview.

A 2016 study by the Center on Privacy & Technology at Georgetown Law found that facial recognition software was most likely to be incorrect when used on Black people, a finding corroborated by FBI research. The study also found that Black people were the most likely to be scrutinized by facial recognition software, Logic Mag reported.

Right now, most data sets, like the ones Microsoft and Google use, come from university campuses, so the algorithms are trained on university populations, Brackeen said. “So if a campus like USC has more white males than other groups, the algorithm will know better how to identify white males than black female faces.”
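To make that imbalance concrete, the short Python sketch below shows how an audit might compare a face-matching system’s error rates across demographic groups. The group labels, sample records and the error_rate_by_group function are hypothetical illustrations for this article, not code or data from any vendor’s product.

```python
# Minimal sketch: measuring how a face-matching model's error rate can differ by
# demographic group when its training data is imbalanced. The records below are
# hypothetical placeholders, not results from any real system.
from collections import defaultdict

# Each record is one evaluation attempt: a group label assigned for auditing
# purposes and whether the model matched the face to the correct identity.
evaluation_results = [
    {"group": "white_male", "is_correct": True},
    {"group": "white_male", "is_correct": True},
    {"group": "black_female", "is_correct": False},
    {"group": "black_female", "is_correct": True},
    # ... in practice, thousands of attempts per group
]

def error_rate_by_group(results):
    """Return the fraction of incorrect matches for each demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for record in results:
        totals[record["group"]] += 1
        if not record["is_correct"]:
            errors[record["group"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

if __name__ == "__main__":
    for group, rate in error_rate_by_group(evaluation_results).items():
        print(f"{group}: {rate:.1%} misidentification rate")
```

A gap between the per-group rates in an audit like this is the kind of disparity Brackeen describes: a model trained mostly on one population tends to misidentify members of underrepresented groups more often.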

Lynch’s case may set a precedent.

“So far it’s the case that has dealt most directly and most broadly with facial recognition,” Garvie told Jacksonville.com. Garvie has used court records from the case to educate public defenders on how the technology could apply to criminal cases. “This is the first time that we actually have a case where the court is considering what to do with it.”

FILE – In this March 12, 2015, file photo, Seattle police officer Debra Pelich, right, wears a video camera on her eyeglasses as she talks with Alex Legesse before a small community gathering in Seattle. While the Seattle Police Department bars officers from using real-time facial recognition in body camera video, privacy activists are concerned that a proliferation of the technology could turn the cameras into tools of mass surveillance. The ACLU and other organizations on Tuesday, May 22, 2018, asked Amazon to stop selling its facial-recognition tool, called Rekognition, to law enforcement agencies. (AP Photo/Elaine Thompson, File)