
Google manipulates search results after mass shootings, Google engineer reveals

It's just one of many examples of Google changing search results for political or social purposes.


To say that manipulating search results is a slippery slope is an understatement. Once the technology and the method are in place, the possibilities to manipulate opinion and sentiment are truly endless.

From there on, it all depends on the integrity and even the goodwill of a company like Google not to abuse this power.

It’s no secret that Google has long since abandoned the original premise of its search engine: a simple tool for users to discover the organically most relevant answer to their search term.

The tech giant constantly changes the algorithm that determines search results so that certain pages rank higher or lower for any number of reasons: to drive users toward ads, or toward pages powered by its AMP technology.

And lately, Google’s algorithm is being altered to recognize things like “hate speech” and push results judged to contain it down the rankings.
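To make that kind of mechanism concrete, here is a purely illustrative sketch, not Google's actual system, with every name in it hypothetical: a toy reranking function that applies a scoring penalty to results a content classifier has flagged, which is enough to push an otherwise highly relevant page down the list.

```python
# Purely illustrative: a toy reranker that demotes flagged results.
# This is NOT Google's algorithm; all field names are hypothetical.

def rerank(results, penalty=0.5):
    """Sort results by score, scaling down the score of flagged results.

    `results` is a list of dicts with hypothetical keys:
    'url', 'relevance' (0..1), and 'flagged' (bool, e.g. set by a
    content classifier upstream).
    """
    def effective_score(r):
        score = r["relevance"]
        if r["flagged"]:
            score *= penalty  # demote flagged results
        return score

    return sorted(results, key=effective_score, reverse=True)


# Example: a flagged page with higher relevance drops below a clean one.
results = [
    {"url": "https://example.com/a", "relevance": 0.9, "flagged": True},
    {"url": "https://example.com/b", "relevance": 0.7, "flagged": False},
]
print([r["url"] for r in rerank(results)])
# -> ['https://example.com/b', 'https://example.com/a']
```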

That might be a bitter pill to swallow for some users – but surely not if it’s framed as “tweaking” the search engine to tackle misinformation in the wake of tragic events?

And the way the Guardian frames its story about Google’s practices is that the world’s most dominant search provider simply “has to do it.”

The report cited Pandu Nayak, a Google engineer, as saying that events such as mass shootings "presented an increasing challenge for the search engine to deliver accurate results."

Without specifying what kind of misinformation he had in mind, Nayak revealed that the company was combating it by asserting its “authority” in the search algorithm.

The report explains that Google employs 16,000 "search quality raters" whose work is based on the 166-page "Search Quality Evaluator Guidelines."

The raters' assessments are then used to adjust the search algorithm so that it produces the results Google wants. The goal is twofold: search results should relevantly answer the actual search term, and they should also comply with “quality” standards as Google sees them.

That second part sounds awfully arbitrary: according to the report, the company determines not only the “quality” of search results, but also things like expertise, authoritativeness, trustworthiness, and reputation.

“Upsetting-offensive” is one of the flags at the disposal of the raters.

