Meta Platforms Inc. settled a lawsuit with the Department of Justice today over allegations that its algorithm for targeting people with housing ads discriminated against users.
The company agreed to pay a $115,054 penalty to resolve allegations that Facebook’s “Special Ad Audience” tool delivered housing ads to certain users based on their race, color, national origin, ZIP code, disability, familial status, and gender. Meta also agreed that, going forward, it will use such targeted ad systems only with Justice Department approval and court oversight.
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said U.S. Attorney Damian Williams in a statement. “Because of this ground-breaking lawsuit, Meta will—for the first time—change its ad delivery system to address algorithmic discrimination.”
He added that if Meta again falls short of the Fair Housing Act’s standards, it could find itself back in court. For its part, Meta said in a press release that it is now developing “a novel use of machine learning technology that will work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.”
The company added that advertisers already face limits on how they can target housing ads, but it acknowledged that more needs to be done. Meta said it will not only put an end to discrimination in housing ads but will also ensure it doesn’t happen in employment ads and credit ads. In the past, a woman might not have been shown ads for certain jobs, and credit card ads might not have appeared on the timelines of people deemed less financially secure.
Meta called this approach “unprecedented in the advertising industry.” Special Ad Audiences, which replaced “Lookalike Audiences” to address fairness concerns, will be retired, and the machine learning system will take over. The new system will take time to build, but once it is running, Meta will continue checking it for discriminatory outcomes. The system should be able to learn from its mistakes and reach a more balanced spectrum of people.