Google Fixed Its Algorithm So That Lesbian-Related Searches Are Less Pornographic

By Melanie Ehrenkranz

Google is one of the most powerful and popular search engines, but that doesn’t always translate to being free from flawed results. And when a French news site and Twitter account campaigned against the sexualised search results for the word “lesbienne,” Google claimed that it fixed its algorithm.

In June, French news site Numerama found that Google removed its Pride Month banner for “lesbienne” searches, but not for the related French terms for transgender and queer. It also pointed out that search results for “lesbienne” surfaced pornographic content ahead of any educational or informative content, whereas the French terms for transgender and queer led with credible, informative links.

“I find that these [search] results are terrible, there is no doubt about it,” Google’s vice president of search engine quality Pandu Nayak told Numerama, according to PinkNews. “We are aware that there are problems like this, in many languages and different researches. We have developed algorithms to improve this research, one after the other.”

A Google spokesperson told Gizmodo that it “developed an algorithmic solution so that we can deliver high-quality results not just for that query, but entire classes of queries.” They added that these changes didn’t just affect search results for “lesbienne,” but a wider breadth of search queries. The spokesperson also pointed out that this issue and its fix were unique to the French term “lesbienne,” not the English term “lesbian.” According to the spokesperson, Google made over 3,200 improvements to search in the last year for a number of reasons, one of which was to deal with cases in which its systems aren’t operating as planned.

“We work hard to prevent potentially shocking or offensive content from rising high in search results if users are not explicitly seeking that content,” the Google spokesperson told Gizmodo in a statement. “As we said at the time, we recognise that our results for the query ‘lesbienne’ in French were falling short.”

This is an issue that has plagued the search engine for years, with untrustworthy links slipping into those sweet top spots to further peddle lies and conspiracy theories. It has happened multiple times in the immediate aftermath of a mass shooting, and when people searched for information about the Holocaust and climate change, the algorithm spoon-fed them dangerously false information via the Top Stories module. For a tool largely ruled by an algorithm, results like these serve to remind us that algorithms, like humans, are not immune to bias and misinformation.

And for those inclined toward confirmation bias, blind faith in their search engines, or both, prioritising false or sexualised information is not only a disservice to the truth, but perpetuates harmful notions about already vulnerable communities.

Featured image: Getty