SAN FRANCISCO — Meta agreed to alter its ad-targeting technology and pay a $115,054 fine Tuesday in a settlement with the Justice Department over allegations that the company discriminated in housing by allowing advertisers to restrict who could see ads on the platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether the audiences that are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, which is known as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
Meta also said it will no longer use a feature called “special ad audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The company said the tool was an initial effort to fight bias and its new methods would be more effective.
“Occasionally, we’ll take a snapshot of marketers’ audiences, see who they’re targeting, and remove as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advance in how machine learning is used to deliver personalized ads.”
Facebook, which became a commercial juggernaut by collecting data from its users and allowing advertisers to target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads using thousands of different features, which has also allowed those advertisers to exclude people who fall into a number of protected categories.
While Tuesday’s deal refers to housing ads, Meta said it also plans to apply its new system to verify the targeting of job- and credit-related ads. The company has previously faced criticism for allowing bias against women in job ads and excluding certain groups of people from seeing credit card ads.
“Because of this groundbreaking lawsuit, Meta will, for the first time, change its ad delivery system to address algorithmic discrimination,” US Attorney Damian Williams said in a statement. “But if Meta fails to show that it has changed its delivery system enough to protect against algorithmic bias, this office will proceed with litigation.”
Biased ad targeting has drawn particular scrutiny in housing ads. In 2018, Ben Carson, then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “illegally discriminated” based on categories such as race, religion and disability. Facebook’s potential for advertising discrimination was also revealed in a 2016 investigation by ProPublica, which showed the company made it easy for marketers to exclude specific ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems didn’t deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen widely.
“Facebook discriminates against people based on who they are and where they live,” Carson said at the time. “Using a computer to limit someone’s housing options can be as discriminatory as slamming a door in someone’s face.”
HUD’s lawsuit came amid a broader push by civil rights groups who say the vast and complicated advertising systems that underpin some of the biggest Internet platforms have inherent bias built in, and that tech companies like Meta, Google and others should do more to reverse those biases.
The area of study, known as “algorithmic fairness,” has been a topic of great interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm about such bias for years.
In the years since then, Facebook has clamped down on the types of categories marketers can choose from when buying housing ads, reducing the number to hundreds and removing options to target based on race, age and ZIP code.
Meta’s new system, which is still in development, will occasionally check who is being shown housing, job and credit ads, and make sure those audiences match the people marketers want to target. If the ads that run start to skew heavily toward white males in their 20s, for example, the new system will theoretically recognize this and change the ads to run more equitably across broader and more varied audiences.
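The check described above can be illustrated in miniature: compare the demographic makeup of the people actually shown an ad against the makeup of the eligible audience, and flag delivery that has drifted too far. This is a hypothetical sketch for illustration only; the function names, group labels and threshold are assumptions, not details of Meta’s actual system.

```python
def delivery_skew(eligible: dict, shown: dict) -> float:
    """Return the largest per-group gap between a group's share of the
    eligible audience and its share of the audience shown the ad."""
    total_eligible = sum(eligible.values())
    total_shown = sum(shown.values())
    return max(
        abs(eligible[g] / total_eligible - shown.get(g, 0) / total_shown)
        for g in eligible
    )

def needs_rebalancing(eligible: dict, shown: dict, threshold: float = 0.10) -> bool:
    """Flag the ad for adjusted delivery if any group's share of actual
    viewers drifts more than `threshold` from its eligible share."""
    return delivery_skew(eligible, shown) > threshold

# Eligible audience is evenly split, but delivery skews toward one group.
eligible = {"group_a": 5000, "group_b": 5000}
shown = {"group_a": 800, "group_b": 200}
print(needs_rebalancing(eligible, shown))  # skew is 0.30, so True
```

In a real system the response to a flag would be to adjust future delivery toward the under-served groups rather than merely report the imbalance, which is the "variance reduction" the article describes.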
Meta said it will work with HUD in the coming months to incorporate the technology into Meta’s ad targeting systems and has agreed to a third-party audit of the new system’s effectiveness.
The fine Meta is paying in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.