HUD offers Fair Housing Act guidance on AI applications
HUD issued new guidance to housing industry stakeholders on the use of AI in advertising to potential tenants and in applicant screening.
On May 2, 2024, the U.S. Department of Housing and Urban Development (HUD) issued Fair Housing Act (the Act) guidance to housing industry stakeholders on the use of artificial intelligence (AI) both in advertising to potential tenants and in applicant screening.
The guidance, which provides insight into how HUD views the potential impact of AI on fair housing, helps owners of rental housing and their management agents (collectively, housing providers) take proactive measures to address common risks associated with the use of AI.
How does AI impact compliance with HUD Fair Housing Act requirements?
In many instances, the advertising and the initial screening are performed by third parties hired by the housing providers. These third parties increasingly use automated algorithms and tools on advertising platforms to refine targeted advertising to potential tenants and to screen applicants for rental housing.
While the use of AI in this scenario is generally well-intentioned, housing providers are encouraged to be mindful of its use, as automated algorithms can unintentionally reinforce existing biases. In such cases, an owner or management agent may unknowingly run afoul of the Fair Housing Act.
The Fair Housing Act prohibits discrimination in the sale, rental, and financing of dwellings and in other housing-related services because of race, color, religion, sex, national origin, familial status, or disability.
What does this look like in practice, and what guidance does HUD offer to help housing providers address this risk?
Advertising
While the intent of targeted advertising is to put the right advertising in front of the people statistically most likely to engage, it may violate the Act if it ultimately deters some potential tenants of a protected class from, or steers others of a protected class toward, specific housing.
Advertising algorithms could end up excluding families with children, people with service animals, or those of specific language groups or religions, or could use census block information to exclude residents of predominantly Black and Hispanic neighborhoods from seeing the advertisements. This could occur when the algorithms rely on the current mix of tenants or on historical information about who has lived at the complex. Such exclusions are considered to violate the Act whether or not they are intentional.
Recommendations from HUD for housing providers include:
- Obtain necessary information and disclosures from the advertising platform regarding how the platform mitigates the risk of discriminatory delivery of housing-related advertising.
- Follow advertising platform instructions to ensure that advertisements related to housing are identified as such to the advertising platform, enabling the appropriate treatment by the vendor.
- Carefully consider the source of, and analyze the composition of, the audience datasets used by algorithms that customize who receives the advertising, to ensure they do not create discriminatory target audiences.
- Monitor the outcomes of housing-related advertising campaigns, to the extent possible, to identify and mitigate discriminatory delivery (a simple monitoring sketch follows this list).
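To make that monitoring step concrete, here is a minimal Python sketch of one way to compare each group's share of delivered impressions against its share of the eligible audience. The group labels, counts, and the 20% review threshold are illustrative assumptions, not figures from HUD's guidance; in practice, the groupings would come from whatever aggregate reporting the advertising platform can lawfully provide.

```python
# Hypothetical sketch: compare each group's share of delivered ad impressions
# against its share of the eligible audience, and flag large gaps for review.
# Group labels, counts, and the 20% review threshold are illustrative
# assumptions, not figures from HUD's guidance.

audience = {"group_a": 40_000, "group_b": 35_000, "group_c": 25_000}  # eligible audience
impressions = {"group_a": 4_200, "group_b": 3_500, "group_c": 800}    # delivered impressions

total_audience = sum(audience.values())
total_impressions = sum(impressions.values())

for group in audience:
    audience_share = audience[group] / total_audience
    impression_share = impressions[group] / total_impressions
    gap = impression_share - audience_share
    # Flag any group whose impression share drifts more than 20% (relative)
    # from its audience share; the cutoff is arbitrary and for illustration.
    flag = "REVIEW" if abs(gap) > 0.20 * audience_share else "ok"
    print(f"{group}: audience {audience_share:.1%}, impressions {impression_share:.1%}, "
          f"gap {gap:+.1%} -> {flag}")
```

The specific threshold matters less than having a repeatable check that surfaces unexplained delivery gaps for human review.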
Tenant screening
Tenant screening reports are a tool used to screen out potentially problematic tenants. Third-party screening companies used by housing providers offer a wide variety of features, such as screening for credit, eviction history, criminal records, medical debt, foreclosures, and student loans. Some may include a recommendation in the form of a numerical grade or a “pass/fail” conclusion.
These reports are products of automated software driven by algorithms. As with advertising, the intent is not malicious: the reports are designed to help owners and management agents select tenants who will pay their rent and comply with the terms of their lease agreements.
Screening algorithms may pull all of an applicant’s criminal records without differentiating among the types of offenses. Before making a final decision, housing providers should take into account information such as actual convictions, the types of crimes, when they occurred (for example, was the most recent one 20 years ago?), and whether the applicant is likely to commit such a crime in the future (for example, they may have become disabled with limited mobility, or received treatment for a mental health disability).
The use of credit scores on their own may benefit applicants of certain backgrounds more than others. AI could treat a credit score as a reliable indicator of a good tenant when it actually creates a bias toward those who already have access to credit and eliminates opportunities for those with no, limited, or poor credit. If an applicant was previously evicted, was it because they could not pay rent for a market-rate unit, while they are now receiving rental assistance? Were there extenuating circumstances that are unlikely to occur again?
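One hedged way to test whether a credit-score cutoff is quietly skewing results is an adverse impact ratio check. The four-fifths (0.8) threshold below is a common disparate-impact heuristic borrowed from employment-screening practice, not a test prescribed by HUD, and all scores and group labels are invented for illustration.

```python
# Hypothetical sketch: check whether a fixed credit-score cutoff passes
# applicants from different groups at very different rates. The four-fifths
# (0.8) threshold is a common disparate-impact heuristic from employment
# screening, not a test prescribed by HUD; scores and groups are made up.

CUTOFF = 650  # illustrative minimum credit score

applicants = {
    "group_a": [720, 680, 655, 640, 700, 610],
    "group_b": [630, 645, 600, 660, 590, 625],
}

# Pass rate per group under the cutoff
pass_rates = {
    group: sum(score >= CUTOFF for score in scores) / len(scores)
    for group, scores in applicants.items()
}

highest = max(pass_rates.values())
for group, rate in pass_rates.items():
    ratio = rate / highest if highest else 0.0
    flag = "review cutoff" if ratio < 0.8 else "ok"
    print(f"{group}: pass rate {rate:.0%}, ratio vs. highest {ratio:.2f} -> {flag}")
```

A low ratio does not prove discrimination, but it is a signal that the criterion deserves a closer look before it drives denials.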
Recommendations for housing providers include:
- Understand the limits of the screening reports.
- Use only relevant screening criteria. For example, for subsidized housing, income-related information would be less relevant. Similarly, records with no negative outcome would not be relevant; this includes court records that do not provide the outcome of the proceedings.
- AI may try to fill in missing information and, in doing so, introduce errors, so make sure the screening process uses only complete information (see the filtering sketch below).
- Design a process that allows applicants to provide additional context for negative reports, so that denied applicants can dispute incorrect information that may have been used in the decision-making process. HUD recommends that the denial notification include all records relied upon, including the screening reports.
Gathering additional data to make a more informed decision about accepting an applicant mitigates some of the “noise” in tenant screening reports and reduces the risk of incorrectly rejecting an applicant.
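As a sketch of the “relevant and complete” recommendations above, the following Python fragment filters screening records before anyone relies on them. The record layout, field names, and seven-year lookback window are assumptions made for illustration, not HUD requirements.

```python
# Hypothetical sketch: keep only screening records that are relevant and
# complete before they feed a decision. Field names, the record layout, and
# the seven-year lookback are illustrative assumptions, not HUD requirements.

from datetime import date

LOOKBACK_YEARS = 7  # illustrative recency window

records = [
    {"type": "criminal", "disposition": "conviction", "year": 2005},
    {"type": "criminal", "disposition": "arrest only", "year": 2022},  # no conviction
    {"type": "eviction", "disposition": None, "year": 2021},           # outcome unknown
    {"type": "criminal", "disposition": "conviction", "year": 2023},
]

def is_usable(record: dict) -> bool:
    """Exclude records with no documented negative outcome, and stale records."""
    if record["type"] == "criminal" and record["disposition"] != "conviction":
        return False  # arrests and unresolved charges are not convictions
    if record["disposition"] is None:
        return False  # a court record without an outcome is incomplete
    if date.today().year - record["year"] > LOOKBACK_YEARS:
        return False  # too old to be predictive under this illustrative window
    return True

usable = [r for r in records if is_usable(r)]
print(f"{len(usable)} of {len(records)} records retained for review")
```

The filtered records would still be reviewed by a person; the point is that incomplete or irrelevant entries never reach the decision in the first place.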
Housing providers are ultimately responsible for ensuring their rental practices are in line with the Act, including those tasks outsourced to third parties. While overreliance on AI-generated information can pose risks, implementing thoughtful processes will help stakeholders protect themselves and their future tenants from unintentional AI-generated bias.
Beth Mullen