U.S. renters fall foul of algorithms in search of a home

A man looks at advertisements for luxury apartments and homes in the window of a real estate sales business in Manhattan's Upper East Side neighborhood in New York City, New York, U.S., October 19, 2021. REUTERS/Mike Segar

What’s the context?

Washington D.C. bill targeting algorithmic discrimination hailed as potential model for wider crackdown on tenant-screening tools

  • Hundreds of automated tenant-screening programs in use
  • Applicants report vague reasons for housing rejection
  • Discrimination fears spur D.C. drive to rein in algorithms

WASHINGTON - Candice's latest rejection email for a housing rental in Washington D.C. looked much like all her others - a generic response with no proper explanation of the decision.

"It will just say, 'unfortunately, right now we didn't accept your application,'" said Candice, 36, who has lived in the U.S. capital for most of her life and is now looking for a larger home for herself and her three children.

"I felt it was computer-generated. And of course, computers – they're faulty," she told Context, asking to be identified by only her first name.


Candice, who said she had received about 10 such rejections in recent months, is currently unemployed but benefits from welfare assistance that would cover her rent in full.

She and other would-be tenants think their applications for rental housing are falling foul not of landlords, but of automated screening programs that scan credit scores, eviction or criminal histories and even social media activity to determine if an applicant is a rental risk.

The widely used programs are facing increased scrutiny from lawmakers in Washington D.C. and beyond amid broader concern about the potential of algorithms to lock in bias and perpetuate inequality.

Susie McClannahan, who manages the fair housing rights program at the Equal Rights Center civil rights group and has worked with Candice, calls the phenomenon the "black box of algorithmic discrimination."

Rental applicants are "being denied at properties for reasons they don't know, and that the provider might not even know," McClannahan said, adding that some third-party screening systems used data they were banned from using, such as old criminal convictions.

"For renters with housing vouchers and low-income renters ... it's making it harder for them to find housing in a city that's already in the midst of a housing crisis," she said.

A "For Rent, For Sale" sign is seen outside of a home in Washington, U.S., July 7, 2022. REUTERS/Sarah Silbiger

City lawmakers are taking note. In September, they debated legislation to ban "discrimination by algorithms," including in housing – one of several efforts nationwide.

And last month, the White House released a "Blueprint for an AI Bill of Rights," warning that "discrimination by algorithms" is unacceptable.

Regulatory action on the issue is likely in the coming year, said Ben Winters, counsel at the Electronic Privacy Information Center (EPIC) watchdog group.

"We're at a transition point," he said.

Growth sector

The tenant-screening industry, worth around $1 billion, is drawing interest from tech startups and venture capital, according to the Tech Equity Collaborative, a watchdog group.

There are hundreds of tenant-screening tools available in the United States, supplanting a process traditionally undertaken by landlords, said Cynthia Khoo, a senior associate with Georgetown University's Center on Privacy & Technology.

While that process was also open to discrimination, she said, today's automated tools operate far more efficiently, at greater scale and speed, and with access to far more data.

"These are new technological tools being used to carry out the same age-old discrimination we're familiar with," she said, adding that they were even less transparent.

As regulators in California and Colorado, and at the Federal Trade Commission, work on the issue, many are watching the capital's Stop Discrimination by Algorithms Act (SDAA) as a potential blueprint.

"This is the most robust legislation in the U.S.," Khoo said of the bill.

The current draft would bar algorithms from discriminating against any group already protected under local law, Winters said, while applicants would have to be told when these systems are used and given an explanation if rejected.

Most firms using these tools would have to audit their algorithms to make sure they knew what the programs were doing, he said, and applicants would be able to sue over potential infractions.

In response to a request for comment, the Consumer Data Industry Association, a trade group, referred to testimony it gave in opposition to the SDAA, as well as a letter sent to the D.C. Council in October by nine financial services groups.

The letter noted that companies were already prohibited from discriminating in credit and other financial services, and argued that the D.C. bill would increase the potential for fraud and curb access to credit.

"Algorithms make credit decisions more accurate, fair, faster and more affordable by judging applicants on their credit worthiness," the groups said.

"Algorithms also eliminate some of the risk of the biases that can be found in human interactions and can help identify products and services designed to benefit communities, including historically underserved populations, helping close the racial wealth gap."

A person types on a laptop computer in Manhattan, New York City, U.S., September 11, 2020. REUTERS/Andrew Kelly

'Tainted' data

Yet some experts question whether algorithms drawing on public data can be objective when the data itself is tainted, among them Catherine D'Ignazio, an associate professor of urban science and planning at the Massachusetts Institute of Technology (MIT).

Data that seems objective, such as credit scores, is often the product of decades of racism and marginalization – baking bias into the math, she said.

The idea of algorithmic fairness suggests that "everyone starts equally and is treated equally. But history hasn't treated people equally."

Still, recognizing this disconnect offers an opportunity for change for the better, D'Ignazio said.

"Tainted" historical data can also skew home valuations, said John Liss, founder of True Footage, whose company launched last year with an eye to addressing appraisal gaps between white and minority homeowners by using a combination of automation and human oversight.

For years, home appraisals often did not seem tied to data, Liss said – to the particular detriment of Black and Hispanic homeowners.

While bringing automation into the appraisal process helps to address this in part, he said, "automated valuation models are extremely dangerous because they're tainted" by historical data.

For True Footage, he said, the key is to have human appraisers, increasingly drawn from historically marginalized communities, involved in interpreting the data.

"There's a place for technology," Liss said. "(But) having a human at the wheel to interpret the data is much more accurate."

This article was corrected on Nov. 16 to amend the description of the Equal Rights Center.

(Reporting by Carey L. Biron; Editing by Helen Popper.)

