Wrong Results: The Google Blacklist

by John Muse

On September 8th, 2010, Google added “Instant Search” to its search engine:

Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.

For example, head for “jane” and Google auto-completes the query. And gives you a few other options. And gives you search results. While you type. Like so:

Now you know where you’re headed: to “jane lynch,” “jane austen,” etc. Or rather, you know where Google thinks you’re headed, i.e., where anyone who’s ever headed to “jane” has typically headed. They help you get there too. With everyone else. Unless you’re headed for “dick.” Once you type a space after the letter “k,” the results disappear.

There’s just “dick” and a cursor in a little white box. But hit return and Google returns to usual and sometimes unusual programming. And to be fair to “dick,” if you type “jesse” or “dar” after “jane” and a space, the results also disappear.

Soon after the launch of Google Instant Search, the digital edition of the magazine 2600: The Hacker Quarterly began to compile a blacklist, a list of terms that turn off Instant Search. Most of these terms appear to step into hazily bordered territories of what Google calls, when explaining their SafeSearch parameters, “pornography, explicit sexual content, profanity, and other types of hate content.” *

For example, “nude,” “swastika,” “murder,” and even “lsd,” which could have gone and still do go elsewhere with a return, now also resolutely go nowhere with Instant Search.

What follows are 2600’s blacklisted terms. The capital letter in each word is the letter that breaks the Instant Search results. Try to type “gringo,” and with the second “g” the results blank out, hence the blacklist entry “grinGo.” But add an “e,” “a,” or “u” and new results emerge; “gringer,” “gringa,” and “gringuito” are just fine. The editors of the blacklist typically infer the don’t-go-there term based on the blackout letter—which sometimes requires guesswork: see “matU,” “pedoB,” and “upsK,” for example. There are omissions: e.g., “fleshB,” “foreP,” and “suiC” blank out Instant Search but aren’t yet included on the list. And there are errors as well: though on the list, neither “gay man” followed by a space, nor “kinkY,” nor “shanna katZ” blanks out. At least not anymore. Not today. The blacklist is less a complete and rigorous catalogue of blanks than a cabinet of easily datable and updatable curiosities.

The parenthetical remarks in the original 2600 page have been edited only for spelling and grammar. Some of these remarks are slyly informative: “taste my (a, c, d, f, m, o, p, and w all result in a blacklist—you figure out the rest).” Some offer gleeful snark: “adUlt (seriously!)” Some mock Google’s ostensible aims: “ecstasY (one of the more ineffective blacklists, as results for the drug are seen up until the last letter).” Others bring into focus a someone or something crafting the list: “rusty tRombone (we’re impressed—someone is really doing their homework).” For simplicity’s sake, thank-yous to contributors have been removed; see the original page for bracketed credits.

We include the blacklist in Sex Drive because Google shows us what we’re looking for and what we’re said to be looking for, even when we’re looking for something else. 2600 has given us an archive of blanks and thus a view into the negative space of our drives—but only those circumscribed by the search engine’s black-boxed heuristics. As compelling as the list may be—and hilarious, mysterious, sad, offensive, ridiculous, and/or hot—even more compelling is that sex, hate, violence, and drugs now happen in the search field and in the blank. Not only there, but there also.

Go to the Google Blacklist page

Download this essay with the blacklist (PDF)

* Google’s “about SafeSearch” page helpfully declares, “Use Google’s SafeSearch filter if you don’t want to see sites that contain pornography, explicit sexual content, profanity, and other types of hate content in your Google search results. While no filter is 100 percent accurate, SafeSearch checks a website’s keywords and phrases, URLs, and Open Directory categories to determine and filter out inappropriate sites.” Note that the phrase “other types” implies that the first three items in this list are hate speech as well. Samuel Axon of Mashable quotes a Google spokesperson as follows: “There are a number of reasons you may not be seeing search queries for a particular topic. Among other things, we apply a narrow set of removal policies for pornography, violence, and hate speech… ¶… if the results for a particular query seem pornographic, our algorithms may remove that query from Autocomplete, even if the query itself wouldn’t otherwise violate our policies.” In other words, Google claims not to have a blacklist of terms; it restricts terms that find too much “unsafe” content as defined by its SafeSearch heuristics.