When Johana Burai, a graphic designer from Sweden, began to research pictures of "hands" a couple of years ago, she was surprised to find that almost all of the images that showed up in her Google search were white.
Then she searched for "black hands" or "African hands", and found they tended to come with added subtext, such as a white hand reaching out to offer help, or the hands working in the earth.
As a result, she set up World White Web, a project that aims to stop the search engine showing only images of white hands by encouraging people to link to and share images of nonwhite hands, in a bid to push them up Google's results. At the time of writing, the campaign has yielded one such result.
The results Google generates are determined by the search engine's constantly evolving algorithm, which, the company states on its website, uses more than 200 different "clues" to work out what people might be looking for. The popularity of the image, how frequently it is shared, context such as text around the image, and meta-tagging all come into play.
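Google does not publish how those "clues" are combined, but the general idea of scoring items against several weighted signals can be illustrated with a toy sketch. Everything below is hypothetical: the signal names and weights are invented for illustration and are not Google's actual ranking factors.

```python
# Toy multi-signal ranker. The signals (popularity, shares, text
# relevance, meta-tags) mirror those named in the article; the
# weights are made up. Google's real algorithm uses 200+
# undisclosed clues and is far more complex.
def rank_images(images):
    weights = {"popularity": 0.4, "shares": 0.3,
               "text_relevance": 0.2, "meta_tags": 0.1}

    def score(img):
        # Weighted sum of the image's per-signal scores (each 0..1).
        return sum(weights[k] * img[k] for k in weights)

    # Highest combined score first.
    return sorted(images, key=score, reverse=True)

images = [
    {"id": "a", "popularity": 0.9, "shares": 0.2,
     "text_relevance": 0.5, "meta_tags": 0.1},
    {"id": "b", "popularity": 0.3, "shares": 0.9,
     "text_relevance": 0.8, "meta_tags": 0.9},
]
print([img["id"] for img in rank_images(images)])  # → ['b', 'a']
```

The point of the sketch is that no single signal decides the ranking: an image that is merely popular can be outranked by one that is shared widely and surrounded by relevant text and tags, which is why biases in what gets published and shared flow through to the results.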
This is why, Burai told BuzzFeed News, the project is not intended to illustrate racism on Google's part, but to highlight wider societal biases that are brought into relief by the algorithm: Searches for other body parts and for words like "man", "woman", and "child" also produce overwhelmingly white results, even though the majority of the world's population is not white.
Burai said she didn't know what the company could do about it. "The people in society are creating Google, in a way," she said. "It's very easy to see stereotypes in Google. That tells us a lot about society. If you don't see problems like structural racism you can't call yourself an anti-racist."
Google did not give an official statement on the project, but sources at the company's London HQ told BuzzFeed News that they support Burai's work precisely because it illustrates these persistent issues. They said that the problem was with biases that exist within the media and on the internet, which the search engine's algorithm ends up reflecting.
In recent months, there have been a number of social media posts that have highlighted how pervasive the problem is.
For example, one Twitter user discovered last year that a Google image search for "beautiful dreadlocks" yields mostly results of white people with dreadlocks.
oh my lord. I can't understand. shit. I tried different search terms. i am…just..wow
Many people have pointed out that the search engine's image results also seem to reinforce Eurocentric beauty standards. Last year people noticed that searching "beauty" would produce hardly any results showing people of colour.
Recently a woman's tweet went viral after she discovered that a Google image search for "unprofessional hairstyles for work" yielded photographs of black women with natural hair, whereas searching "professional hairstyles for work" showed mainly white women.
I saw a tweet saying "Google unprofessional hairstyles for work". I did. Then I checked the 'professional' ones 🙃🙃🙃
Here the problem appears to be in part with the algorithm and in part with the context within which the pictures are presented.
The search simply picked up on captions and tags from around the internet that use the word "unprofessional" – as The Guardian has pointed out, the algorithm has actually taken many of the images of black women from blogs and articles that are "explicitly discussing and protesting against racist attitudes to hair".
For example, a few of the image results came from an article on Naturally Curly – a social platform dedicated to women with curly hair – that provided hairstyle tips to women in professions where curly hair might be perceived as "unkempt and unprofessional".
It's unclear, therefore, whether the algorithm is actually giving searchers what they're looking for: People entering this search term are probably seeking examples of hair they consider genuinely "unprofessional", not images drawn from articles protesting against that very characterisation.
Specific search terms also appear to generate a racial bias as a result of the way Western internet users carry out their searches. Whereas searching for "women" produces mainly stock images, an image search for "Asian women" returns mostly fetishised and sexualised images from porn and dating sites, alongside Google's suggested searches "Asian women vs white women" and "Asian women and white men".
Media bias also feeds into the results, as news reports about violent crimes are likely to be prioritised by the algorithm because they are widely clicked on and shared. One Twitter user discovered that searching for "three white teenagers" returned mainly stock images of happy white teenagers, whereas a search for "three black teenagers" returned mugshots.
🤔What the hell is this Google @google
Many other internet users have noticed the racial bias in Google's image search autosuggestion feature, which is likely to reflect mass search habits. After typing "black people are", the top three autosuggestions are "crazy", "monkeys", and "rude"; the top suggestions for "white people are" mostly relate to a single meme.
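Autosuggestion systems of this kind are generally driven by what large numbers of people have previously typed. As a rough illustration only – the query log below is invented, and Google's actual system is far more sophisticated – a minimal autosuggest could surface the most frequent past queries matching a prefix:

```python
# Toy prefix autosuggest: return the most frequent past queries
# that start with the typed prefix. The query log and its counts
# are invented for illustration.
from collections import Counter

query_log = Counter({
    "black holes explained": 50,
    "black friday deals": 120,
    "black hole photo": 30,
})

def suggest(prefix, log, n=3):
    matches = [(q, c) for q, c in log.items() if q.startswith(prefix)]
    # Most frequently searched queries first, truncated to n.
    return [q for q, _ in sorted(matches, key=lambda x: -x[1])[:n]]

print(suggest("black h", query_log))
# → ['black holes explained', 'black hole photo']
```

Because such a system simply echoes aggregate behaviour, offensive suggestions indicate that large numbers of users have actually typed those phrases – which is why the feature ends up mirroring prejudices rather than inventing them.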
Google sources are keen to point out that the company prides itself on its internal diversity, and that similar problems afflict search engines like Bing and Yahoo. They also spoke warmly about Getty's "Lean In Collection", which features thousands of stock photos of female leadership.
One employee said: "If organisations and media companies ensure what they are depicting is empowering and unbiased from the inside, in turn our algorithm will reflect this on the outside."
However, there are also problems that extend beyond basic searches. A 2013 study by Latanya Sweeney, a professor at Harvard University, found that advert results appeared to "expose 'racial bias in society'".
Sweeney found that searches for names associated with black people led to advertisements that said "Arrested?" and linked to websites offering criminal record checks. While she didn't specify the cause, she did single out the algorithm as a possible factor.
Last year it was revealed that a search for "nigga house" in Google Maps directed users to the White House.
If you Google Map "nigga house," this is what you'll find. America.
At the time Google said it was unsure how this had happened. Sources at the company told BuzzFeed News they believed it was linked to vandalism of the Map Maker app, which forced its suspension at around the same time last year.
However, a month later Google was forced to apologise after its new photo app – which uses artificial-intelligence software to organise a user's uploaded photos into categories – labelled two black people as "gorillas".
Google Photos, y'all fucked up. My friend's not a gorilla.
In a statement to the BBC a spokeswoman for Google apologised for the incident and said that "immediate action" would be taken to prevent this from happening again.
"There is still clearly a lot of work to do with automatic image labelling. We're looking at how we can prevent these types of mistakes from happening in the future," she added.
However, Jacky Alciné, who uploaded the photo, told the BBC that he was concerned about the kinds of images that Google had used in its initial priming that led to the app labelling two black people as gorillas.
"[Google has] mentioned a more intensified search into getting person of colour candidates through the door, but only time will tell if that'll happen," he said.