Is Google's job-ads algorithm sexist? New research suggests so

Companies such as Google and Facebook generate revenue by tracking user behaviour around the Web and delivering targeted ads. How exactly this information is used remains a mystery, but a new research paper suggests some algorithmic judgments from Google's ad system may be a little skewed.

Researchers from Carnegie Mellon University and the International Computer Science Institute built a tool called AdFisher to probe the targeting of advertisements served up by Google on third-party websites.

According to a report presented last week by the MIT Technology Review, the researchers found that fake web users believed by Google to be male job seekers were much more likely than equivalent female job seekers to be shown a pair of ads for high-paying executive jobs when they visited a news website.
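The comparison described above can be sketched as a simple two-sample significance test: run many simulated "male" and "female" browsing profiles, count how often a given ad appears for each group, and ask whether the difference could plausibly be chance. The sketch below uses invented counts and a plain permutation test; it is an illustration of the kind of statistical check involved, not the AdFisher team's actual methodology or data.

```python
import random

# Invented counts for illustration: how often a particular executive-job ad
# appeared across simulated visits by "male" and "female" profiles.
male_impressions = [1] * 180 + [0] * 320    # ad shown on 180 of 500 visits
female_impressions = [1] * 40 + [0] * 460   # ad shown on 40 of 500 visits

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference in impression rates."""
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = a + b
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        rate_a = sum(pooled[:len(a)]) / len(a)
        rate_b = sum(pooled[len(a):]) / len(b)
        if abs(rate_a - rate_b) >= abs(observed):
            extreme += 1
    return extreme / trials

p = permutation_test(male_impressions, female_impressions)
print(f"difference in rates: {180/500 - 40/500:.2f}, p = {p:.4f}")
```

With a gap this large between the two groups, the permutation test returns a p-value near zero, i.e. the disparity is very unlikely to be random noise in ad delivery.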

AdFisher also showed that a Google transparency tool called "ads settings", which lets users view and edit the "interests" the company has inferred for them, does not always reflect potentially sensitive information being used to target those users.

Browsing sites aimed at people with substance abuse problems, for example, triggered a rash of ads for rehab programmes, yet Google's transparency page showed no change.

The researchers said it is unclear what caused these specific patterns, because Google's ad-serving system is highly complex.

Google uses its own data to target ads, but ad buyers can make some decisions about which demographics they want to reach, and for certain kinds of ads they can also draw on their own data about people's online activity to do additional targeting.

The examples do not breach any specific privacy rules, although Google's policy forbids targeting on the basis of "health conditions".

Still, said Anupam Datta, an associate professor at Carnegie Mellon University who helped develop AdFisher, the research findings show the need for tools that uncover how online ad companies differentiate between people.

Google's algorithms also landed the firm in hot water last week, when an African-American man looked in his Google Photos collection and discovered an automatically generated album of him and his black female friend labelled "gorillas".

Google apologised, saying it was "appalled" that its new Photos app had mistakenly labelled the black couple.