The search engine's parent company, Alphabet Inc., announced plans to change a fundamental component of Google's algorithm that previously allowed false information to rise straight to the top of its results pages.
We've all seen our fair share of autocomplete suggestions veering in strange directions after just a few typed words, but intentionally misleading ones can divert a user away from more relevant search queries.
The technology giant said it would allow people to report misleading, inaccurate or hateful content in its autocomplete function, which pops up to suggest searches based on the first few characters typed. The American company has also updated guidance for the employees who evaluate the quality of its search results.
Since time immemorial (1998), Google's bread, butter, and raison d'être has been search.
Google added: "As part of that process, we have evaluators - real people who assess the quality of Google's search results - give us feedback on our experiments."
Fake news is, as Google defines it, "blatantly misleading, low quality, offensive or downright false information".
The problem, according to Google, is that 15 percent of all daily searches are brand new.
Fake news has been a trending term among tech companies in recent months, and Google is taking a more proactive approach toward the issue.
Google has also tweaked its search algorithms to ensure that "low-quality" content shows up lower in search results, which should minimize its reach.
Google will make further ranking changes to avoid surfacing such content in the future, and it is redoubling its human oversight of result quality.
The user feedback tools will be available for featured snippets, the boxes that appear near the top of search results and attempt to answer your query without you having to click through to a web page.
Just as human editors at traditional media outlets must curate content and separate fact from fiction, Google has to do the same on a massive scale for everything published to the web. Its evaluator guidelines will now include options for identifying misleading information, unexpectedly offensive results, hoaxes, and "unsupported conspiracy theories".