
Google’s autocomplete ban on politics has some problems



Entering “donate” followed by the first letter of “Trump”, or the candidate’s full name, produced only the suggestion “donate trumpet”.

Google confirmed that these results violated its new autocomplete policy. “This was not within our guidelines, and our enforcement teams have taken action,” a company spokesman said on Friday. In subsequent tests, entering “donate bid”, the start of Biden’s name, produced only “donate body to science”, and typing “donate to bid” surfaced no autocomplete suggestions at all.

It’s unclear how many Google users may have seen the same pattern WIRED did, because the company tailors search results and suggestions based on data about a computer’s location and previous activity.

Google’s new autocomplete policy and quick response to the obvious glitch show just how cautious the tech industry has become about politics.

During the 2016 presidential campaign, Google responded to allegations that autocomplete favored Hillary Clinton by saying the feature simply did not favor any candidate or cause. “Claims to the contrary simply misunderstand how autocomplete works,” the company told the Wall Street Journal in June 2016.

Entering “Donate Trump” did not trigger any suggestions related to the Trump campaign.

Screenshot: WIRED

Tech companies have become more humble – at least in public – since Donald Trump was elected. Revelations of political manipulation on Facebook during the 2016 campaign made it difficult for the social network and its rivals to pretend that juggling ones and zeros in apps had no impact on society or politics. Technology giants now profess a deep sensitivity to society’s needs and promise that unexpected problems will be addressed quickly.

This has made tech companies more reliant on – or aware of – human judgment. Facebook says it has gotten better at cleaning up hate speech thanks to breakthroughs in artificial intelligence technology that have enabled computers to better understand the meaning of text. Google claims similar technology made its search engine more powerful than ever. But algorithms still lag far behind humans when it comes to reading and other areas.

Google’s response to a second pattern that WIRED found in autocomplete shows the tricky judgments that can’t be handed off to computers. Entering just “donate” in the search box produces 10 mostly neutral suggestions, including “car”, “clothes near me” and “testicles”. The second entry was “to black lives matter”, a cause that many Republicans view as partisan.

Google says this is not covered by the new autocomplete policy. “While it is an issue that has been politicized, this guideline is specifically about predictions that could be interpreted as claims in support of or against political parties or candidates,” said the company spokesman.
