U.S. renters fall foul of algorithms in search for a home
A man looks at advertisements for luxury apartments and homes in the window of a real estate sales business in Manhattan's Upper East Side neighborhood in New York City, New York, U.S. October 19, 2021. REUTERS/Mike Segar
What’s the context?
Washington D.C. bill targeting algorithmic discrimination hailed as potential model for wider crackdown on tenant-screening tools
- Hundreds of automated tenant-screening programs in use
- Applicants report vague reasons for housing rejection
- Discrimination fears spur D.C. drive to rein in algorithms
WASHINGTON - Candice's latest rejection email for a housing rental in Washington D.C. looked much like all her others - a generic response with no proper explanation of the decision.
"It will just say, 'unfortunately, right now we didn't accept your application,'" said Candice, 36, who has lived in the U.S. capital for most of her life and is now looking for a larger home for herself and her three children.
"I felt it was computer-generated. And of course, computers – they're faulty," she told Context, asking to be identified by only her first name.
Candice, who said she had received about 10 such rejections in recent months, is currently unemployed but benefits from welfare assistance that would cover her rent in full.
She and other would-be tenants think their applications for rental housing are falling foul not of landlords, but of automated screening programs that scan credit scores, eviction or criminal histories and even social media activity to determine if an applicant is a rental risk.
The widely used programs are facing increased scrutiny from lawmakers in Washington D.C. and beyond amid broader concern about the potential of algorithms to lock in bias and perpetuate inequality.
Susie McClannahan, who manages the fair housing rights program at the Equal Rights Center civil rights group and has worked with Candice, calls it the "black box of algorithmic discrimination."
Rental applicants are "being denied at properties for reasons they don't know, and that the provider might not even know," McClannahan said, adding that some third-party screening systems used data they were banned from using, such as old criminal convictions.
"For renters with housing vouchers and low-income renters ... it's making it harder for them to find housing in a city that's already in the midst of a housing crisis," she said.
City lawmakers are taking note. In September, they debated legislation to ban "discrimination by algorithms," including in housing – one of several efforts nationwide.
And last month, the White House released a "Blueprint for an AI Bill of Rights," warning that "discrimination by algorithms" is unacceptable.
Regulatory action on the issue is likely in the coming year, said Ben Winters, counsel at the Electronic Privacy Information Center (EPIC) watchdog group.
"We're at a transition point," he said.
The tenant-screening industry, worth around $1 billion, is drawing interest from tech startups and venture capital, according to the Tech Equity Collaborative, a watchdog group.
There are hundreds of tenant-screening tools available in the United States, supplanting a process traditionally undertaken by landlords, said Cynthia Khoo, a senior associate with Georgetown University's Center on Privacy & Technology.
While that process was also open to discrimination, she said today's automated tools operate far more efficiently, at greater scale and greater speed, and with access to far more data.
"These are new technological tools being used to carry out the same age-old discrimination we're familiar with," she said, adding that they were even less transparent.
As regulators in California and Colorado, and at the Federal Trade Commission, work on the issue, many are watching the capital's Stop Discrimination by Algorithms Act (SDAA) as a potential blueprint.
"This is the most robust legislation in the U.S.," Khoo said of the bill.
The current draft states that algorithms cannot discriminate against any groups already protected under local law, said Winters, while applicants would have to be alerted to the use of these systems and given explanations if rejected.
Most firms using these tools would have to audit their algorithms to make sure they knew what the programs were doing, he said, and applicants would be able to sue over potential infractions.
In response to a request for comment, the Consumer Data Industry Association, a trade group, referred to testimony it gave in opposition to the SDAA, as well as a letter sent to the D.C. Council in October by nine financial services groups.
The letter noted that companies were already prohibited from discrimination in credit or other financial services, and that the D.C. bill would increase the potential for fraud and hit credit access.
"Algorithms make credit decisions more accurate, fair, faster and more affordable by judging applicants on their credit worthiness," the groups said.
"Algorithms also eliminate some of the risk of the biases that can be found in human interactions and can help identify products and services designed to benefit communities, including historically underserved populations, helping close the racial wealth gap."
Yet some question whether algorithms drawing on public data can be objective when the data itself is tainted, said Catherine D'Ignazio, an associate professor of urban science and planning at the Massachusetts Institute of Technology (MIT).
Data such as credit scores may appear objective but is often the product of decades of racism and marginalization – baking bias into the math, she said.
The idea of algorithmic fairness suggests that "everyone starts equally and is treated equally. But history hasn't treated people equally."
Still, recognizing this disconnect offers an opportunity for change for the better, D'Ignazio said.
"Tainted" historical data can also skew home valuations, said John Liss, founder of True Footage, whose company launched last year with an eye to addressing appraisal gaps between white and minority homeowners by using a combination of automation and human oversight.
For years, home appraisals often did not seem tied to data, Liss said – to the particular detriment of Black and Hispanic homeowners.
While bringing automation into the appraisal process helps to address this in part, he said, "automated valuation models are extremely dangerous because they're tainted" by historical data.
For True Footage, he said, the key is to have human appraisers, increasingly drawn from historically marginalized communities, involved in interpreting the data.
"There's a place for technology," Liss said. "(But) having a human at the wheel to interpret the data is much more accurate."
This article was corrected on 16 Nov. to amend the description of the Equal Rights Center.
(Reporting by Carey L. Biron; Editing by Helen Popper.)