Facebook has said it is looking into further limitations on how job, housing and credit ads are targeted in markets outside North America, after a campaign group said the platform had failed to prevent discriminatory targeting in UK job ads appearing on the network.
Global Witness, a London-headquartered NGO, submitted two ads to Facebook. It asked for one not to be shown to women and the other not to be shown to people over the age of 55, the BBC reported.
Despite Facebook requiring Global Witness to declare that it would not discriminate against these groups, the ads were approved, although they were subsequently pulled before going live.
In addition, Global Witness created four ads for real vacancies on recruitment platform indeed.com, specifying only that they should be seen by UK adults, in order to test the outcome of Facebook’s targeting algorithm.
The test revealed an overwhelming gender skew in two cases: those who saw an ad for mechanics were 96% male, while those seeing an ad for nursery nurses were 95% female. The other two ads also showed a significant skew: one for airline pilots (75% male) and one for psychologists (77% female).
Global Witness has made a submission to the UK Equality and Human Rights Commission, in which its barrister, Schona Jolly QC, said: "Facebook's system itself may, and does appear to, lead to discriminatory outcomes."
Facebook has faced questions around allegedly discriminatory ad targeting for several years. In 2017, it updated its policies to ban targeting on a series of personal characteristics, including race. Two years later, though, it was charged by the US Department of Housing and Urban Development with allowing discrimination in housing ads.
In 2019 it introduced new limitations on targeting options for housing, employment and credit ads in the US and rolled these out in Canada last year, but they have not yet been extended to other markets, including the UK.
Referring to the apparent ability to run job ads with discriminatory targeting, Naomi Hirst, who led Global Witness's investigation, said: "The fact that it is possible to do this on Facebook in the UK is particularly shocking."
A Facebook spokesperson commented: “Our system takes into account different kinds of information to try and serve people ads they will be most interested in, and we are reviewing the findings within this report.
“We’ve been exploring expanding limitations on targeting options for job, housing and credit ads to other regions beyond the US and Canada, and plan to have an update in the coming weeks.”
Facebook’s policies state that “Ads must not discriminate or encourage discrimination against people based on personal attributes such as race, ethnicity, colour, national origin, religion, age, sex, sexual orientation, gender identity, family status, disability, medical or genetic condition.”
Its spokesperson said it used a “persistent” prompt to remind advertisers that they are required to comply with this policy, and added that it had invested in advertiser education.
Facebook also pointed to Fairness Flow, its toolkit for examining the social impact of its use of artificial intelligence, such as instances of statistical bias in algorithms – including the breakdown of users who received the four job ads in Global Witness's test. However, Fairness Flow has been criticised as “insufficient” by some academic experts in AI.