Algorithmic Bias and the New Face of Housing Discrimination
- Fannie Mae, Desktop Underwriter Version 11.0 Release Notes (2022). Automated underwriting systems are trained on historical loan performance data. Because historical lending patterns reflect decades of discriminatory underwriting, the training data itself embeds the outcomes of prior discrimination. The system learns which borrower profiles historically produced defaults — profiles shaped in part by exclusion from refinancing, from prime products, and from wealth-building opportunities.
- U.S. Census Bureau, “Quarterly Residential Vacancies and Homeownership,” Q4 2024. The Black homeownership rate of 44.7 percent and white homeownership rate of 73.8 percent produce a racial gap of 29.1 percentage points — wider than the gap recorded in 1968 at the time of the Fair Housing Act’s passage.
- National Fair Housing Alliance v. Facebook, Inc., No. 1:18-cv-02689 (S.D.N.Y. 2018). The complaint documented that Facebook’s ad delivery algorithm steered housing advertisements away from users based on race, national origin, religion, sex, familial status, and disability — protected classes under the Fair Housing Act — without any explicit targeting instruction from advertisers.
- U.S. Department of Housing and Urban Development, Conciliation Agreement between HUD and Meta Platforms, Inc., June 21, 2022. Meta agreed to overhaul its ad delivery system for housing, employment, and credit categories and to submit to regular audits. The settlement resolved HUD’s charge that Facebook’s use of its Special Ad Audience tool violated the Fair Housing Act by using algorithmic proxies for protected characteristics to limit ad distribution.
- Faber, Jacob W. “Fortifying the Walls: How Digital Advertising Platforms Facilitate Housing Segregation.” Housing Policy Debate 33, no. 1 (2023): 1–21. The study documents how digital platforms’ use of lookalike audiences and behavioral targeting in housing advertising produces geographic and demographic segregation in who receives homeownership information — independent of any explicit discriminatory intent by advertisers.
- Bartlett, Robert, Adair Morse, Richard Stanton, and Nancy Wallace. “Consumer-Lending Discrimination in the FinTech Era.” Journal of Financial Economics 143, no. 1 (January 2022): 30–56. The study analyzed 9 million mortgage records and found that FinTech lenders charged Black and Hispanic borrowers approximately 8 basis points more than similarly qualified white borrowers, generating approximately $765 million in excess interest payments annually.
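To make the scale of an 8 basis point premium concrete, the arithmetic can be sketched as follows. The loan balance below is a hypothetical illustration, not a figure from the study; only the 8 bp differential comes from Bartlett et al.

```python
# One basis point is one hundredth of a percentage point.
BASIS_POINT = 0.0001

def annual_excess_interest(balance: float, extra_bps: float) -> float:
    """Extra interest paid per year on `balance` due to a rate
    premium of `extra_bps` basis points."""
    return balance * extra_bps * BASIS_POINT

# Hypothetical example: a $300,000 mortgage balance carrying
# the roughly 8 bp premium the study found for Black and
# Hispanic borrowers comes to about $240 in excess interest
# per year, compounding into thousands of dollars over the
# life of a 30-year loan.
print(round(annual_excess_interest(300_000, 8), 2))
```

The per-borrower figure looks small, which is part of why algorithmic pricing disparities evade casual detection; the study's $765 million annual estimate reflects that premium aggregated across millions of loans.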
- National Fair Housing Alliance, 2023 Fair Housing Trends Report (Washington, D.C.: NFHA, 2023). The report documents the systematic testing methodology used to identify algorithmic discrimination in online housing platforms and advertising networks, and catalogs the enforcement actions and settlements that have resulted from that testing.