Google’s Gboard keyboard (which we really like) has a problem that you might not have expected from a keyboard app — profanity. It turns out that typing certain innocuous words can bring up suggestions you weren’t expecting.
It wouldn’t be a big deal if these suggestions were just irrelevant — that’d probably mean spending a few extra seconds typing, instead of relying on autocomplete to do the job. But what if the suggestions being offered by the app are what most people would find downright offensive?
A Gadgets 360 reader pointed out that the word ‘uske’ — a Hindi word that roughly means her/his/their — when typed in Gboard, would produce next-word suggestions that most would find inappropriate. When we tested this ourselves, we found that this was the case, getting the exact same suggestions, in the exact same order, as the person who had reached out to us.
The Gboard keyboard — like most modern keyboards — makes personalised suggestions based on your usage, but if you’ve never typed in Hindi (using the Roman script) before, and your keyboard is set to Gboard Multilingual Typing, with the language setting of English (India), and you type in ‘uske’, some of the suggestions refer to people’s body parts.
This happens even though the setting to block offensive words is switched on. Other suggestions, while relatively tame, might also offend people — for example, if you type in ‘aunty’ (and it’s a word you don’t use generally, so the suggestions aren’t personalised to you) then one of the suggestions is ‘hot’.
To find out what was going on, Gadgets 360 reached out to Google, but a representative of the company said its tech team was not able to reproduce the problem. The next day, the suggestions had changed — on our own phone and on three others that we tried — though whether this was because of a change to Gboard on Google’s end, or for some other reason, is not known.
Google’s only response has been that it was not able to reproduce the experience, but that it came across a complaint of a similar nature that was fixed earlier. When we asked whether Google could offer a hypothesis about why the suggestions were appearing, we were told it would be looked into, but there has been no further response since.
If this were just one person’s experience, then the fact that Google’s tech teams weren’t able to reproduce it would make sense. However, two minutes of searching on the Internet — using Google search, in fact — turned up a plethora of people posting about typing ‘uske’ and getting the exact same suggestions as well.
Aside from that, we also spoke to a number of Gboard users before reaching out to Google, and while the offensive suggestions weren’t always present, six out of the ten users we spoke to confirmed seeing what we did.
Anant and Shikha Saini, a couple that runs a cafe in Bengaluru, were two users on whose phones we were able to replicate the issue. “It felt sick, you know, I started to think, what sites have I been going to that Google thinks this is what I type?” Anant said. “When it showed up on Shikha’s phone as well, I felt a lot better about myself.”
In their case, we had them install the Gboard app on their devices for the first time before asking them to perform this test.
“I guess it speaks to the Indian mentality you know?” Shikha said. “It makes sense that so many people in the country are just so repressed and they’re typing these things, so why wouldn’t it come up?” Shikha is referring to the fact that the Gboard app crowdsources its recommendations from other users typing in the same language when it doesn’t have any personalised suggestions to show for what you are typing.
Others pointed out that the Internet has much more disturbing content and that the rare chance of stumbling across a “mildly” offensive term as a keyboard suggestion wasn’t something to worry about.
“I mean, it’s like a fun trick you’ve showed me,” said Varun Dash, who works as a copywriter in Delhi, “but you know, if you’re worrying about this, then you should probably not have Chrome, or Facebook, or anything else on your phone. Just get the old phones, the dumb phones [feature phones] man.”
Ritika Bhatnagar, who has two school-going sons, was a little less relaxed about this.
“I wish you hadn’t shown me this because now I’m wondering what shows up on my kids’ phones,” she said. “We have to give them phones, but there’s all kinds of things on them and I don’t know what to do about it.” The good news for her is that this particular issue, at least, seems to have been plugged.
Without any response from Google, it’s not possible to say exactly what’s happening here, but it’s not that hard to guess either.
A developer working on Indic keyboards at a Bengaluru-based company, who did not wish to be named since he’s not officially a spokesperson for his company, explained that suggestions are based on a number of different factors, including things such as your contact list, words that you type often, and their relationships with other words you use. But for scenarios where the dictionary hasn’t had a chance to be personalised for you, location-based suggestions that draw on how people are typing around you can make a big impact.
He added that this is a simple problem to screen for, by using curated lists to mark “safe” suggestions when there’s no history of the words being used.
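The screening the developer describes can be sketched in a few lines. The following is an illustrative Python sketch, not Gboard’s actual implementation — all function names, data structures, and word lists here are hypothetical:

```python
# Hypothetical sketch of safe-list screening for next-word suggestions.
# Idea: personalised suggestions (backed by the user's own typing history)
# pass through, while crowdsourced suggestions for words the user has never
# typed are filtered against a curated "safe" list. Illustrative only.

SAFE_WORDS = {"ghar", "paas", "saath", "kaam"}  # curated safe list (example entries)

def screen_suggestions(prev_word, suggestions, user_history):
    """Return the next-word suggestions to display after prev_word.

    user_history maps words the user has typed to their own next-word data;
    if prev_word is absent, only safe-listed crowdsourced words are shown.
    """
    if prev_word in user_history:
        # The user has typed this word before: trust personalised data.
        return suggestions
    # No personal history: fall back to crowdsourced suggestions,
    # but only surface those on the safe list.
    return [w for w in suggestions if w.lower() in SAFE_WORDS]

# A word the user has never typed: crowdsourced results are screened.
print(screen_suggestions("uske", ["ghar", "badword", "paas"], {}))
# With personal history present, suggestions pass through unchanged.
print(screen_suggestions("uske", ["ghar", "badword"], {"uske": {"ghar": 3}}))
```

In this sketch the filter errs on the side of showing nothing rather than something offensive, which matches the developer’s point: when there is no usage history, defaulting to a vetted list is cheap insurance.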
It therefore looks like a scenario where Google was simply caught unawares by users in India — though you would think Google would know better by now. That’s because this isn’t the first time Google has run into trouble with crowdsourced suggestions — and it’s a problem it seems to have made some headway with in other places.
Some years ago, YouTube India faced a similar scenario — if you weren’t signed in to your Google account, the home page was usually full of highly inappropriate content. Several people have shared screenshots of the kind of content that was synonymous with YouTube India at the time.
Today, that doesn’t seem to be the case, although the video suggestions that show up when you’re watching clips can still be a little strange — though such recommendations appear to be relatively few at this point.
It’s clear that when Google relies purely on algorithms to determine what’s popular or trending in a region, it can surface content or ‘suggestions’ that are not appropriate for all users, or that are, in some cases, completely out of context.
For example, a few years ago, Google Image search showed the photo of PM Narendra Modi in results for the query “top 10 criminals”. These results are based on algorithms that rely on a number of factors, and although it can seem like Google is “editorialising” or presenting an opinion, what’s happening is that those words have probably been used in a lot of places to describe the image in question, causing it to show up in Google search results. After that, some news sites picked up on it, and since these are high-traffic, high-trust sites, the image ranked even higher, until we reached the situation where PM Modi’s photo was one of the top results for ‘top 10 criminals’.
The same thing would later result in JNU showing up as a result when searching for ‘anti-nationals’ on Google Maps. In these cases, it’s not a reflection of how people are using services like Gboard or YouTube, but it goes to show how Google’s algorithms can stumble from time to time.