Google has recently started implementing new measures to flag “offensive or upsetting” content appearing in its search results, The Guardian reported on Wednesday.
Google’s main measure is the employment of some 10,000 independent contractors, described by the company as “quality raters,” whose role is to assess the quality of the search engine’s results. The contractors are tasked with running Google searches based on real queries and are then asked to score the results on whether or not they meet users’ needs.
While the company has used the contractors’ services since 2013, the group has recently been significantly expanded and a new element has been introduced to their work: the aforementioned “upsetting-offensive” flag, officially rolled out on Tuesday.
According to The Guardian, one of the main factors that prompted Google to carry out this initiative was a wave of complaints and criticism aimed at the company after publications such as The Guardian and The Observer ran a series of articles revealing that queries such as “Did the Holocaust happen” returned top results promoting misinformation, propaganda and hate speech.
In one widely discussed incident, the query “Did the Holocaust happen” led users to a thread on the white supremacist forum Stormfront that explained, among other things, how to promote Holocaust denial to others.
Google now uses this result, along with others, as an illustrative example in the training it provides its contractors, teaching them how and when to mark pages as “upsetting-offensive.”
According to The Guardian, in the document Google handed out to contractors, the Holocaust-denying example was explained as follows: “This result is a discussion of how to convince others that the Holocaust never happened. Because of the direct relationship between Holocaust denial and antisemitism, many people would consider it offensive.”
While many were pleased to learn of Google’s efforts to keep closer track of problematic search results, some worry that the new measures don’t go far enough.
Danny Sullivan, editor of Search Engine Land, a daily publication covering the search marketing industry, explained his and others’ reservations to The Guardian: “The results that quality raters flag is used as ‘training data’ for Google’s human coders who write search algorithms, as well as for its machine-learning systems… in other words, being flagged as ‘upsetting-offensive’ by a quality rater does not actually mean that a page or a site will be identified this way in Google’s actual search engine.”
Sullivan went on to explain that “instead, it’s data that Google uses so that its search algorithms can automatically spot pages generally that should be flagged.”
Google declined to comment on its new guidelines.