Is Facebook’s job algorithm sexist?

A Global Witness investigation shows big differences between the job adverts that might appear in men’s and women’s feeds

Facebook’s algorithm is perpetuating gender bias when it comes to job adverts, new research suggests, with big differences between the vacancies that might appear in men’s and women’s newsfeeds.

Global Witness, an international human rights group, published an investigation this month into Facebook’s algorithm, for which it paid the social media company to run job adverts for a range of professions. In the UK, 94% of the people who were shown its advert for mechanic roles were male, according to the reach data for these adverts.

The reach data also show that 79% of the people shown an advert for pilot jobs were male. Meanwhile, 97% of those shown an advert for nursery staff roles, and 79% of those shown an advert for psychologist roles, were female.

Global Witness carried out the research in March and April across six countries – the UK, the Netherlands, France, India, Ireland, and South Africa – and found similar trends in each. It has shared a breakdown of the UK data with Workingmums.co.uk.

Bar chart showing results of Global Witness research

“What we’ve got here is a major social media platform making decisions that shape your life, without you knowing about it, based on characteristics that are protected in law [from discrimination],” said Naomi Hirst, campaign strategy lead at Global Witness. 

AI’s potential biases are a growing area of concern, as algorithms and automation oversee ever more of our lives. There are fears that, because these systems are “trained” using existing data and other content, they are likely to replicate historical biases against women and minorities. 

In many fields, such as health, there is also a lack of women-specific data to train AI systems properly. AI models built to predict liver disease using blood tests were twice as likely to miss the disease in women as in men, a University College London study found last year.

Global Witness specified only that Facebook should show the adverts to adults who lived in, or had recently been in, the country in question. But Facebook’s internal rules require that advertising campaigns be shown to the users it deems most likely to click through to the destination website – a process that relies on its algorithms.
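This is consistent with a well-known dynamic in automated ad delivery: a system that optimises purely for predicted clicks will reproduce whatever skew exists in its training data. The toy simulation below is a minimal sketch of that dynamic only – the job category, click rates and the predicted_ctr model are invented for illustration, and none of it reflects Facebook’s actual system. It shows how a 50/50 eligible audience can end up seeing an ad almost entirely along gender lines without any explicit gender targeting.

```python
# Toy sketch: how optimising for clicks can produce a gendered audience.
# All numbers and the "model" are invented for illustration; this is not
# a description of Facebook's actual delivery system.

import random

random.seed(42)

# Hypothetical historical click-through rates (CTRs) in which men clicked
# mechanic adverts more often than women.
HISTORICAL_CTR = {("mechanic", "male"): 0.08, ("mechanic", "female"): 0.01}

def predicted_ctr(ad_category: str, user_gender: str) -> float:
    """Stand-in for a learned CTR model: it simply echoes historical rates."""
    return HISTORICAL_CTR[(ad_category, user_gender)]

# A 50/50 pool of eligible adults -- the only constraint the advertiser set.
users = [{"gender": "male"} for _ in range(5000)] + \
        [{"gender": "female"} for _ in range(5000)]

# The delivery engine shows the advert to the 1,000 users with the highest
# predicted CTR (ties broken randomly).
ranked = sorted(users,
                key=lambda u: (predicted_ctr("mechanic", u["gender"]),
                               random.random()),
                reverse=True)
shown = ranked[:1000]

male_share = sum(u["gender"] == "male" for u in shown) / len(shown)
print(f"Share of ad impressions shown to men: {male_share:.0%}")
# Prints 100%: ranking purely on predicted clicks reproduces the
# historical skew, even though gender was never a targeting criterion.
```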

‘It’s a black box’

Facebook’s Help Centre, its customer service hub, sets out “examples” of the information that its algorithm uses to decide which ads you see. The list includes gender and age. But it also includes “your behaviour on Facebook” – data based on your actions, such as what you have posted or liked, rather than assumptions about you.

Hirst said that, because Facebook keeps its algorithm’s inner workings “under wraps”, we can’t know for certain how much weight it puts on gender when making a decision – a weighting that may also differ from decision to decision. But she added that there are still grounds for regulators to intervene if the algorithm’s output is gendered, resulting in an overwhelmingly male or female audience seeing certain job ads.

“If the outcome is gendered, that’s where the law can apply to prevent the algorithm from doing that…Because the outcome is preventing people from having access to the workforce and seeing jobs that they could well apply for,” she said.

To boost transparency, Meta, which owns Facebook and Instagram, runs an Ad Library where anyone can view the ads currently running on its platforms. Facebook also does not allow advertisers to stipulate that job, housing, or credit adverts be targeted at people on the basis of their age or gender.

Yet it is less clear how its own algorithms operate. They remain largely a mystery to outsiders – the Help Centre’s list of data sources is not exhaustive, and it does not say whether or how different types of data are prioritised.

“It’s a black box…We’re nowhere near getting access to understanding how the algorithm is compiled, how it functions, how it updates,” Hirst said.

When asked to comment on Global Witness’ research, a Meta spokesperson said: “We have applied targeting restrictions to advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library. We do not allow advertisers to target these ads based on gender. We continue to work with stakeholders and experts across academia, human rights groups and other disciplines on the best ways to study and address algorithmic fairness.”


