A New Tool Shows How Google Results Vary Around the World
A Google spokesperson said the differences in results were not caused by censorship and that content about the Tiananmen Square massacre is available via Google Search in any language or locale setting. Touristy images win prominence in some cases, the spokesperson said, when the search engine detects an intent to travel, which is more likely for searches made from locations closer to Beijing or typed in Chinese. Searching for Tiananmen Square from Thailand or the US with Google's Chinese-language setting also returns recent, clean images of the historic site.
“We localize results to your preferred region and language so you can quickly access the most reliable information,” the spokesperson said. Google users can tune their own results by adjusting their location setting and language.
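Readers curious to see this localization for themselves can approximate a Search Atlas-style comparison by issuing the same query with different language and country settings. The sketch below uses Google's `hl` (interface language) and `gl` (country) URL parameters; this is a minimal illustration under the assumption that the public search endpoint honors those parameters for unauthenticated requests, not a description of how Search Atlas itself works, and automated queries may be rate-limited or redirected to a consent page.

```python
# Minimal sketch: fetch Google result pages for one query across locales.
# Assumes the public endpoint honors the "hl" (interface language) and
# "gl" (country) parameters; automated requests may be throttled or
# redirected to a consent page, so treat this as illustrative only.
import requests

QUERY = "tiananmen square"
LOCALES = [("en", "us"), ("en", "th"), ("zh-CN", "th")]

for hl, gl in LOCALES:
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": QUERY, "hl": hl, "gl": gl},
        headers={"User-Agent": "Mozilla/5.0"},  # bare clients are often blocked
        timeout=10,
    )
    # Raw page length is only a crude signal that the locales differ;
    # a real comparison would parse and diff the ranked result listings.
    print(f"hl={hl} gl={gl} -> HTTP {resp.status_code}, {len(resp.text)} bytes")
```

Comparing the parsed result listings across such runs, rather than page sizes, is essentially the kind of side-by-side view Search Atlas automates at scale.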
The Search Atlas collaborators also built maps and visualizations showing how search results can differ around the globe. One shows how searching for images of “God” yields bearded Christian imagery in Europe and the Americas, images of Buddha in some Asian countries, and Arabic script for Allah in the Persian Gulf and northeast Africa. The Google spokesperson said the results reflect how its translation service converts the English term “God” into words with more specific meanings for some languages, such as Allah in Arabic.
Other information borders charted by the researchers don't map straightforwardly onto national or language boundaries. Results for "how to combat climate change" tend to split island nations from continental countries. In European countries such as Germany, the most common words in Google's results concerned policy measures such as energy conservation and international accords; for island nations such as Mauritius and the Philippines, results were more likely to cite the magnitude and immediacy of the threat of a changing climate, or harms such as sea level rise.
Search Atlas was presented last month at the academic conference Designing Interactive Systems; its creators are testing a private beta of the service and considering how to widen access to it.
Search Atlas can't reveal why different versions of Google portray the world differently. Google's lucrative ranking systems are closely held, and the company says little about how it tailors results to geography, language, or a person's activity.
Whatever the exact reasons Google shows or withholds particular results, those results have a power that is too easily overlooked, says Search Atlas cocreator Ye. "People ask search engines things they would never ask a person, and the things they happen to see in Google's results can change their lives," Ye says. "It could be 'How do I get an abortion?' restaurants near you, or how you vote, or get a vaccine."
WIRED’s own experiments showed how people in neighboring countries could be steered by Google to very different information on a hot topic. When WIRED queried Search Atlas about the ongoing war in Ethiopia’s Tigray region, Google’s Ethiopia edition pointed to Facebook pages and blogs that criticized Western diplomatic pressure to deescalate the conflict, suggesting that the US and others were trying to weaken Ethiopia. Results for neighboring Kenya, and the US version of Google, more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.
Ochigame and Ye are not the first to point out that search engines aren’t neutral actors. Their project was partly inspired by the work of Safiya Noble, cofounder and codirector of UCLA’s Center for Critical Internet Inquiry. Her 2018 book Algorithms of Oppression explored how Google searches using words such as “Black” or “Hispanic” produced results reflecting and reinforcing societal biases against certain marginalized people.
Noble says the project could provide a way to explain the true nature of search engines to a broader audience. “It’s very difficult to make visible the ways search engines are not democratic,” she says.