A European human rights body has ruled that Facebook’s job advertising algorithm exhibits gender bias, reinforcing stereotypes by disproportionately showing certain job postings to specific genders. The Netherlands Institute for Human Rights determined in a February 18 decision that Meta, Facebook’s parent company, should have monitored and adjusted its algorithm to prevent such discrimination.
The ruling follows an investigation by the international non-profit Global Witness, which found that job ads in multiple countries—including the Netherlands, France, India, Ireland, the UK, and South Africa—were distributed based on gender stereotypes. For instance, mechanic job ads were shown mostly to men, while preschool teacher roles were primarily displayed to women. These findings, reported by CNN, led to formal complaints from Dutch and French human rights organizations.
The Netherlands Institute for Human Rights stated that Meta Platforms Ireland Ltd., responsible for Facebook ads in Europe, failed to prove that its algorithm does not engage in prohibited gender discrimination. The Institute has called on Meta to revise its algorithm to ensure fairer job ad distribution.
Meta has previously stated that it restricts advertisers from targeting job, housing, and credit ads based on gender in over 40 countries, including the Netherlands and France. However, the company has not clarified how its ad delivery system decides which users ultimately see a given ad. A 2020 Facebook blog post mentioned that its ad system relies on multiple factors, including user behavior both on and off the platform.
Berty Bannor of Bureau Clara Wichmann welcomed the Dutch ruling, calling it a significant step toward holding tech companies accountable for digital discrimination. Rosie Sharpe, a senior campaigner at Global Witness, likewise described the decision as crucial to ensuring Big Tech is held responsible for the impact of its algorithms.
While the ruling is not legally binding, experts suggest it could pave the way for regulatory action, including fines or mandatory algorithm modifications. Dutch lawyer Anton Ekker noted that if Meta fails to address the issue, NGOs may pursue legal action to prevent further discrimination.
The decision comes amid growing concerns over digital rights, particularly for women and marginalized groups. Recently, Meta announced the termination of its diversity, equity, and inclusion programs, along with changes to its policies on hate speech and fact-checking in the US. These modifications include permitting previously banned discriminatory language, such as referring to women as “household objects” or allowing derogatory remarks about transgender and non-binary individuals.
Meta has faced multiple allegations of discrimination over the past decade, leading to algorithm modifications in the US. However, critics argue that similar changes should be applied globally to prevent further algorithmic bias and social injustice.