Is Google allowing anti-Trump activists to game its algorithm so that searches for ‘idiot’ return pictures of the U.S. President?


Searches for “idiot” on Google Images return pictures of President Trump in the top five results, and heavily throughout the rest of the list.

According to Business Insider, the results are the work of anti-Trump activists who are deliberately gaming Google’s algorithm.

The publication said:

“Anti-Trump activists are gaming Google’s algorithm so that when people search for ‘idiot,’ almost all of the top results are pictures of Donald Trump. Protesters are publishing articles on their own platforms which associate the word ‘idiot’ with Trump, as well as sharing and upvoting articles which do the same. The net effect of this is that the association inside Google’s algorithm becomes stronger, producing photos of Trump when people input the term ‘idiot.’”

Trump recently stood up for Google when the European Union slapped a hefty $5 billion fine on the internet company.

Google drew the ire of the California Republican Party in May when voters in the California primary searching for “California Republicans” or “California Republican Party” found a result that listed “Nazism” as one of the party’s ideologies along with “Conservatism,” “Market liberalism,” “Fiscal conservatism” and “Green conservatism.”

Google insisted that the listing wasn’t put in place by humans but resulted from vandalized public information sources slipping past the systems meant to catch them. The information included in Google’s knowledge panels, like the one listing Nazism as an ideology of the California Republican Party, comes from public sources such as Wikipedia.

Google said then:

“Sometimes people vandalize public information sources like Wikipedia, which can impact the information that appears in search,” a Google spokesperson told Engadget. “We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that’s what happened here. This would have been fixed systematically once we processed the removal from Wikipedia, but when we noticed the vandalism we worked quickly to accelerate this process to remove the erroneous information.”