Algorithms Allegedly Penalized Black Renters. The US Government Is Watching
Two years ago, Mary Louis submitted an application to rent an apartment at Granada Highlands in Malden, Massachusetts. She liked that the unit had two full bathrooms and that there was a pool on the premises. But the landlord denied her the apartment, allegedly due to a score assigned to her by a tenant-screening algorithm made by SafeRent.

Louis responded with references to prove 16 years of punctual rent payments, to no avail. Instead she took a different apartment that cost $200 more a month in an area with a higher crime rate. But a class-action lawsuit filed by Louis and others last May argues that SafeRent scores, based in part on information from credit reports, amounted to discrimination against Black and Hispanic renters in violation of the Fair Housing Act. That groundbreaking legislation, which prohibits discrimination on the basis of race, disability, religion, or national origin, was passed by Congress in 1968, a week after the assassination of Martin Luther King Jr.

That case is still pending, but the US Department of Justice last week used a brief filed with the court to send a warning to landlords and the makers of tenant-screening algorithms. SafeRent had argued that algorithms used to screen tenants aren’t subject to the Fair Housing Act, because its scores only advise landlords and don’t make decisions. The DOJ’s brief, filed jointly with the Department of Housing and Urban Development, dismisses that claim, saying the act and associated case law leave no ambiguity.

“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities,” Department of Justice civil rights division leader Kristen Clarke said in a statement.

As in many areas of business and government, algorithms that assign scores to people have become more common in the housing industry. But although they are claimed to improve efficiency or identify “better tenants,” as SafeRent marketing material suggests, tenant-screening algorithms could be contributing to historically persistent housing discrimination, despite decades of civil rights law. A 2021 study by the US National Bureau of Economic Research, which used bots with names associated with different racial groups to apply to more than 8,000 landlords, found significant discrimination against renters of color, particularly African Americans.

“It’s a relief that this is being taken seriously—there’s an understanding that algorithms aren’t inherently neutral or objective and deserve the same level of scrutiny as human decisionmakers,” says Michele Gilman, a law professor at the University of Baltimore and former civil rights lawyer at the Department of Justice. “Just the fact that the DOJ is in on this I think is a big move.”

A 2020 investigation by The Markup and ProPublica found that tenant-screening algorithms often encounter obstacles like mistaken identity, especially for people of color with common last names. A ProPublica assessment of algorithms made by the Texas-based company RealPage last year suggested its software can drive up rents.

Article originally published by www.wired.com.