Claims That Google's Search Algorithm Spread False Information

Google’s search algorithm appears to be systematically promoting information that is either false or slanted with an extreme rightwing bias on subjects as varied as climate change and homosexuality.

Following a recent investigation by the UK-published Observer newspaper, which found that Google’s search engine prominently suggests neo-Nazi websites and anti-Semitic writing, the Guardian has uncovered a dozen additional examples of biased search results.

Google’s search algorithm and its autocomplete function prioritize websites that, for example, declare that climate change is a hoax, being gay is a sin, and the Sandy Hook mass shooting never happened.

The increased scrutiny of Google, which removed anti-Semitic and sexist autocomplete phrases after the recent Observer investigation, comes at a time of tense debate surrounding the role of fake news in building support for conservative political leaders, particularly US president-elect Donald Trump.

Facebook has faced significant backlash for its role in enabling widespread dissemination of misinformation, and data scientists and communication experts have argued that rightwing groups have found creative ways to manipulate social media trends and search algorithms.

The Guardian’s latest findings further suggest that Google’s searches are contributing to the problem.

In the past, when a journalist or academic has exposed one of these algorithmic hiccups, humans at Google have quietly made manual adjustments in a process that is neither transparent nor accountable.

At the same time, politically motivated third parties including the “alt-right”, a far-right movement in the US, use a variety of techniques to trick the algorithm and push propaganda and misinformation higher up Google’s search rankings.

These insidious manipulations, both by Google and by third parties trying to game the system, impact how users of the search engine perceive the world, even influencing the way they vote. This has led some researchers to study Google’s role in the presidential election in the same way that they have scrutinized Facebook.

Robert Epstein from the American Institute for Behavioral Research and Technology has spent four years trying to reverse engineer Google’s search algorithms. He believes, based on systematic research, that Google has the power to rig elections through something he calls the search engine manipulation effect (SEME).

Epstein conducted five experiments in two countries and found that biased rankings in search results can shift the opinions of undecided voters. If Google tweaks its algorithm to show more positive search results for a candidate, the searcher may form a more positive opinion of that candidate.

In September 2016, Epstein released findings, published through Russian news agency Sputnik News, that indicated Google had suppressed negative autocomplete search results relating to Hillary Clinton.

Even changing the order in which certain search terms appear in the autocompleted list can make a huge impact, with the first result drawing the most clicks, he said.

At the time, Google said the autocomplete algorithm was designed to omit disparaging or offensive terms associated with individuals’ names but that it wasn’t an “exact science”.

Then there’s the secret recipe of factors that feed into the algorithm Google uses to determine a web page’s importance, embedded with the biases of the humans who programmed it. These factors include how many and which other websites link to a page, how much traffic it receives, and how often a page is updated. People who are very active politically are typically the most partisan, which means that extremist views peddled actively on blogs and fringe media sites get elevated in the search ranking.
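The link-based part of that recipe is easiest to see in miniature. Below is a minimal sketch in Python of a toy, PageRank-style scorer, assuming a simplified model in which a page's importance comes only from the pages that link to it; the damping factor, iteration count, site names and link graph are all hypothetical, and Google's real ranking combines this kind of link analysis with hundreds of undisclosed signals, including the traffic and freshness factors mentioned above.

# Toy illustration only: a simplified link-based score in the spirit of PageRank.
# Google's actual ranking uses many undisclosed signals; the site names and
# link graph below are hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score, then receives a share of
        # the score of every page that links to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling page with no outbound links: spread its score evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

# A small, heavily interlinked cluster of fringe blogs out-scores a page
# that receives few inbound links, which is the dynamic described above.
graph = {
    "fringe-blog-a": ["fringe-blog-b", "fringe-blog-c"],
    "fringe-blog-b": ["fringe-blog-a", "fringe-blog-c"],
    "fringe-blog-c": ["fringe-blog-a", "fringe-blog-b"],
    "mainstream-site": ["fringe-blog-a"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")

Running the sketch shows the three interlinked fringe blogs each scoring far higher than the mainstream site, which nothing links to: in a purely link-driven model, a motivated, tightly cross-linking community can lift its pages without any regard for their accuracy.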

“These platforms are structured in such a way that they are allowing and enabling, consciously or unconsciously, more extreme views to dominate,” said Martin Moore from King’s College London’s Centre for the Study of Media, Communication and Power.

Epstein wants Google to be more transparent about how and when it manually manipulates the algorithm.

“They are constantly making these adjustments. It’s absurd for them to say everything is automated,” he said. Manual removals from autocomplete include “are jews evil” and “are women evil”. Google has also altered its results so when someone searches for ways to kill themselves they are shown a suicide helpline.

Shortly after Epstein released his research indicating the suppression of negative autocomplete search results relating to Clinton, which he credits to close ties between the Clinton campaign and Google, the search engine appeared to pull back from such censorship, he said. This, he argued, allowed a flood of pro-Trump, anti-Clinton content (including fake news), some of it created in retaliation, to bubble to the top.

“If I had to do it over again I would not have released those data. There is some indication that they had an impact that was detrimental to Hillary Clinton, which was never my intention.”

The problem has become particularly challenging for Google in a post-truth era, where white supremacist websites may carry the same indicators of “trustworthiness” in the eyes of Google as other highly ranked websites.

“What does Google do when the lies aren’t the outliers anymore?” Heller said. “Previously there was the assumption that everything on the internet had a glimmer of truth about it. With the phenomenon of fake news and media hacking, that may be changing.”

A Google spokeswoman said in a statement: “We’ve received a lot of questions about autocomplete, and we want to help people understand how it works: Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for such a wide range of material on the web; 15% of searches we see every day are new.

Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don’t always get it right. Autocomplete isn’t an exact science and we’re always working to improve our algorithms.”

Guardian: On Facebook, Fake US Election News Was More Popular Than Real News
