Google has said it will make considerable changes to its algorithm to address offensive autocomplete suggestions in its search results (Reuters)

Google, one of the most powerful technology companies in the world, is facing renewed criticism for displaying shockingly offensive autocomplete suggestions in its search engine, including "the KKK is not racist", "Black Lives Matter is cancer" and "Nazis are cool".

Last December, Google was embroiled in controversy after The Guardian investigated its autocomplete algorithms and found that some open-ended search terms were promoting conspiracy theories, misinformation and, in one case, Holocaust denial.

Now, a month later, Google is still having trouble with the suggestions its search bar displays. As noted by the Daily Dot, a Twitter user called Sarah Kendzior recently posted numerous examples showing the problem persists. Typing "Adolf Hitler is" brought up "a genius".

In the wake of the initial bad press, Google said it would rethink how its autocomplete function worked.

Indeed, it successfully removed some of the offensive links to white supremacist websites that had been appearing at the top of its results page.

However, as observed by IBTimes UK, the autocomplete function is still producing a slew of offensive suggestions in real time. Using Google's search bar to type "Black Lives Matter is" brought up four suggestions: "cancer", "a lie", "bad" and "trash".

Testing the algorithm with "Adolf Hitler is" brought up: "back", "a genius" and "stressed out". Typing "climate change is" resulted in "a hoax", "not real" and "a myth". Meanwhile, an open-ended search for "Nazis are" produced "cool", "left wing" and "good".

Even now, searching for "Holocaust is" brings up the words "lie", "hoax", "Israel" and "conspiracy".

A Google spokesperson said: "We've received a lot of questions about autocomplete, and we want to help people understand how it works: Autocomplete predictions are algorithmically generated based on users' search activity and interests.

"Users search for such a wide range of material on the web – 15% of searches we see every day are new. Because of this, terms that appear in autocomplete may be unexpected or unpleasant.

"We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don't always get it right. Autocomplete isn't an exact science and we're always working to improve our algorithms."

This is not the first time Google has faced questions over its autocomplete results. Last year, amid rising tensions around the US presidential election – and the rise of so-called 'fake news' – the tech giant was forced to respond to accusations that its results were being rigged. It was alleged that the firm was manipulating searches to hide negative news stories about the Democratic candidate Hillary Clinton – an accusation it strongly denied.

"Auto-complete isn't an exact science, and the output of the prediction algorithms changes frequently," wrote Tamar Yehoshua, product management director of search at Google in a blog post published on 10 June last year.

"Predictions are produced based on a number of factors including the popularity and freshness of search terms," she continued. "Given that search activity varies, the terms that appears in auto-complete for you may change over time.

"It's also important to keep in mind that autocomplete predictions aren't search results and don't limit what you can search for.

Google autocomplete suggestions for 'Black Lives Matter is' (Screenshot: IBTimes UK)

"It's a shortcut for those who are interested. You can still perform whatever search you want to, and of course, regardless of what you search for, we always strive to deliver the most relevant results from across the web."

Google's algorithms are, in a broad sense, driven by the searches of its users. Its internal systems remain opaque, secretive and constantly in flux.

In spite of this, critics argue the firm should be doing more to police its search engine's autocomplete function.

"It's horrible," Danny Sullivan, founding editor of SearchEngineLand.com, told The Guardian last December. "It's the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate.

"Google is doing a horrible, horrible job of delivering answers here. It can and should do better."