Can Algorithms Be Racist? Trump’s Housing Department Says No
This story was produced by Reveal from The Center for Investigative Reporting, a nonprofit news organization.
The U.S. Department of Housing and Urban Development is circulating new rules that would make it nearly impossible for banks – or landlords or homeowners insurance companies – to be sued when their algorithms result in people of color being disproportionately denied housing.
The rule would overturn 50 years of precedent, upheld by the Supreme Court in 2015, that permits the use of statistical analysis to identify patterns of discrimination.
Under the Trump administration’s proposed regulations, a company accused of discrimination would be able to “defeat” that claim if an algorithm is involved. A hypothetical bank that rejected every loan application filed by African Americans and approved every one filed by white people, for example, would need to prove only that race or a proxy for it was not used directly in constructing its computer model.
The rule introduces other loopholes for businesses to knock down discrimination claims as well. A business could “defeat” a claim by saying it had vetted its algorithm with a neutral third party. If an algorithm that resulted in discrimination were developed by an outside firm, such as a credit bureau or tech company, a bank that used it could not be held accountable for the result.
“People who use their positions and power to discriminate against others don’t always broadcast their intent,” said Vanita Gupta, president and CEO of The Leadership Conference on Civil and Human Rights, who led the civil rights division of the Justice Department during the Obama administration.
“The Trump administration is trying to make it impossible for us to challenge algorithmic bias and technological bias within the housing market,” added Lisa Rice, president of the National Fair Housing Alliance.
For decades, government lawyers and civil rights groups have been able to hold businesses responsible for treating people equally, whatever their intent. This is called the disparate impact standard. Under this standard, discrimination doesn’t have to be overt. If a business has a pattern of denying services to people of color while providing those same services to socioeconomically similar white customers, it can be held accountable under the Fair Housing Act of 1968.
At the Department of Housing and Urban Development, spokesman Brian Sullivan declined to comment on the proposed rules, saying only that they had been sent to Congress for a “prepublication review.”
Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University, said unequal treatment is inevitable and the new rules take into account the complexities of technology.
“We should be holding every business accountable to eliminate discrimination, but we also know that it’s super hard to do that for even well-meaning businesses,” Goldman said. “We’re just not going to eliminate that.”
The homeownership gap between black and white people is now wider than it was during the Jim Crow era, when segregation was legal and encouraged by the government.
And as an increasing amount of financial decision-making has moved from humans to computers, concern about systemic discrimination caused by algorithms has grown.
Last year, Reveal from The Center for Investigative Reporting used a statistical analysis in an award-winning series on modern-day redlining. We found that in 61 cities, people of color were far more likely to be turned down for a home loan than their white counterparts – even when they made the same amount of money and tried to take out the same size loan and buy in the same neighborhood. The series led major banks, including JPMorgan Chase and TD Bank, to open new branches in communities of color. It also sparked investigations by six state attorneys general.
At the time, the American Bankers Association criticized our analysis, saying it didn’t show discrimination. A spokesman for the industry group said it was withholding comment until the rules on algorithms were finalized.
Reveal has hardly been alone in its findings. A recent study by researchers at UC Berkeley’s Haas School of Business found that while Latino and African American borrowers faced less discrimination from financial technology companies than brick-and-mortar lenders, businesses that relied on computers instead of face-to-face interaction nonetheless habitually charged borrowers of color higher interest rates than white borrowers with similar credit profiles – costing black and brown customers an additional $765 million annually.
In Connecticut, the real estate firm CoreLogic is facing a fair housing lawsuit over the algorithm embedded in its CrimSAFE tenant screening service, which critics say “disproportionately disqualifies African Americans and Latinos.” The plaintiff, the Connecticut Fair Housing Center, argued that because the state’s criminal justice system disproportionately arrests African Americans and Latinos, relying on it as a standard for renting apartments is inherently biased.
The center is representing a Latino man with a disability that left him unable to walk, talk or care for himself, who nonetheless was barred from moving in with his mother because of a “disqualifying criminal record.” The offense CrimSAFE had identified turned out to be an old shoplifting infraction – below the level of a misdemeanor.
In March, U.S. District Judge Vanessa Bryant allowed the suit to move forward. Yet if HUD’s proposed rules take effect, discrimination in cases like this will be difficult to prove.
“And my fear is that it will chill those cases,” said Chris Peterson of the advocacy group Consumer Federation of America, “and embolden businesses to be less careful about unintentional discrimination that could occur.”
It could be the first step down a slippery slope, said Rice of the National Fair Housing Alliance. “Once they strike it down in housing, they’re going to try and strike it down in education and they’ll try and strike it down in transportation and health care and so forth and so on, and we just cannot let that happen.”
This story was edited by Matt Thompson and copy edited by Nikki Frick. Aaron Glantz can be reached at email@example.com, and Emmanuel Martinez can be reached at firstname.lastname@example.org. Follow Glantz on Twitter: @Aaron_Glantz.