When the crowd is racist at Google

If you search Google Images for “Michelle Obama” (no quotes), the first image you’ll see is a poorly photoshopped picture of her as an ape.

You’ll also see a Google Ad on that page that links to Google’s explanation of why such a blatantly racist photo is the top-ranked one at Google Images. It says, after assuring us that Google does not endorse such images: “Search engines are a reflection of the content and information that is available on the Internet. A site’s ranking in Google’s search results relies heavily on computer algorithms using thousands of factors to calculate a page’s relevance to a given query.”

I have mixed feelings about this.

On the one hand, Google is taking a principled stand by not inserting its own political/cultural views into its engine. It’s also avoiding an endless squabble if it were to start hand-manipulating the results.

On the other hand:

1. Google’s algorithms are undoubtedly tuned by looking at the relevancy of the results. If they come up with a new wrinkle, they check the results it returns. So the algorithms are already guided by Google’s own sense of what counts as good, useful, and relevant results. If Google tested a tweak of its ranking algorithm and it turned out always to put all the porn and pro-Nazi literature on top, Google would judge that algorithm faulty. So Google is already in the business of building algorithms that match its idea of what’s useful and relevant. When those algorithms occasionally turn up racist crap like that photo of Michelle, why not improve the algorithm’s results by intervening manually? (A sketch of what such an intervention might look like follows this list.)

2. Google as a business and as a cultural force aims to give us useful results. That matters more to the value of Google Search than the purity of its algorithmic approach. A photo of Michelle as an ape cannot reasonably be construed as the most useful result of a search for photos of her. So, fix it. (And, yes, I’d say the same if a search for “George W. Bush” returned as its first result a photo of him as a chimp or as Hitler.)
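
To make point #1 concrete, here is a minimal sketch of the kind of manual intervention I have in mind, assuming the simplest possible mechanism: the algorithm ranks as usual, and a human-curated blocklist is applied as a final pass. Nothing here reflects Google’s actual pipeline; the function, the blocklist, and the URLs are all invented for illustration.

    # Hypothetical sketch only -- not Google's actual pipeline.
    # A human-curated blocklist is applied after algorithmic ranking,
    # leaving the ranking algorithm itself untouched.

    BLOCKLISTED_URLS = {
        # URLs a human reviewer has judged abusive for this query
        "http://example.com/racist-photoshop.jpg",
    }

    def apply_manual_overrides(ranked_results):
        """Drop hand-flagged results after the algorithm has ranked them.

        ranked_results: list of (url, score) pairs, best first.
        """
        return [(url, score) for url, score in ranked_results
                if url not in BLOCKLISTED_URLS]

    results = [
        ("http://example.com/racist-photoshop.jpg", 0.97),
        ("http://example.com/official-portrait.jpg", 0.95),
    ]
    print(apply_manual_overrides(results))

The appeal of this shape is that the algorithm stays pure; only its output is corrected, case by case.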

Although the bulk of this post argues against Google’s position, let me say again that I am torn by this issue, and admire Google’s consistency and transparency about it.

18 Responses to “When the crowd is racist at Google”

  1. I think it can’t be fixed. Well, of course, it could: Google could tinker, work around, meddle with its algorithms, etc.

    But that is just not a solution. By avoiding the use of semantic technologies and relying on methods that transmit social bias into their metadata, Google has reached the point where it must create millions of exceptions or risk the return of cases like today’s…

    I have always thought Google must move faster toward Web 3.0. What I see, though, is reluctance and avoidance of explicit and implicit semantic methods.

    I am not crazy enough to say that this caused the picture of Michelle as an ape. But it is certainly one of the side effects…

  2. Interestingly, I’m not seeing this image at all. Not as the first image, not on the first page, not anywhere. I’m based in the UK and looking at the .com version – perhaps that’s it?

  3. I think your point #2 is a good one. Also Google already filters its images with Safe Search, and…

    Ah, that might be it, Phil. If you have Safe Search = Moderate, that image comes up first; if you have Safe Search = Strict, it comes up second; if you have Safe Search = None, it gets crowded out by a whole lot of images that are perfectly safe and moreover non-racist. (Also there’s a terrible “related searches” line that comes up in Moderate and None though thankfully not in Strict.)

    So yeah, I reckon there are some serious problems with the algorithms here. If I have Safe Search on, I want to be *at least* as safe from racist putrescence as from skin.

  4. OK… sorted. If I turn moderation off, I don’t see it. If I have moderation of images set to moderate or strict, I do. How mad is that?

  5. I feel uncomfortable with the way Google has articulated its position on this one. But equally, editorial intervention is a never-ending merry-go-round of a question.

    BTW Michelle Obama. I think you’re awesome.

  6. Under Moderate Safe Search and with Safe Search Off, I don’t see any offensive pictures in the first 4 pages of results. However, the “Offensive Search Results” Google Ad does appear on the first page of results, and there is a header saying, “Related Searches: michelle obama monkey”.

    Turning Safe Search to Strict turns up the offensive picture in the #2 slot on the first page.

  7. #2 is very important. However, I’m not sure you’re correct when you say “A photo of Michelle as an ape cannot reasonably be construed as the most useful result of a search for photos of her.” It may be, if your definition of “useful result” includes click-through as a major factor, which may in fact be one aspect of the metric Google uses to measure a result’s usefulness. (I sketch this idea at the end of this comment.)

    It is on this note that I also disagree with your line in #1, “If they tested a tweak of their ranking algorithm and it turned out always to put all the porn and pro-Nazi literature on top, Google would judge that algorithm as faulty.”

    I am not sure I want Google tweaking search results to better fit my ideals or anyone else’s. If their results didn’t satisfy their goal (whether that is to provide relevant results in proportion to the queries, or to the query makers, or some other similar goal), then yes, they should tweak their algorithm. But if their algorithm isn’t inherently prejudiced and it is simply the users who make it so, then Google might be performing mass social manipulation. I’m not going to cry “brainwashing”; it’s not that extreme. But I more than sympathize with Google’s way of dealing with this issue.
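
    Here is a toy illustration of the click-through point, with weights and numbers that are entirely invented; nothing here is Google’s actual metric. If click-through carries enough weight, a sensational image can legitimately outscore a sober one:

        # Invented illustration: blend click-through rate with topical
        # relevance. The weights and numbers are made up.

        def usefulness(ctr, topical_relevance, w_ctr=0.6, w_topic=0.4):
            """Score a result; both inputs are assumed to lie in [0, 1]."""
            return w_ctr * ctr + w_topic * topical_relevance

        # A shocking image that everyone clicks beats an accurate portrait:
        print(usefulness(ctr=0.40, topical_relevance=0.5))   # ~0.44
        print(usefulness(ctr=0.05, topical_relevance=0.9))   # ~0.39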

  8. Let’s play around with this whole concept a little further. Google says of the moderation filter:

    “Use Google’s SafeSearch filter if you don’t want to see sites that contain pornography, explicit sexual content, profanity, and other types of hate content in your Google search results.”

    One could therefore rightly expect a considerable difference in the results displayed under the three settings. With terms that refer to sexual activities or genitalia, that is indeed exactly what happens. However, once we start searching for racially offensive epithets, it does *not* happen. Admittedly the images are moved around on the screen, but in the cases that I looked at they remained substantially the same. I therefore really have to wonder whether Google regards ‘hate content’ as considerably less important than sexual content.

    With regard to this image, I would have considerably more sympathy with Google if it could only be seen when SafeSearch was off. That it’s the other way around is perverse. Google needs to accept that its algorithms have failed in this instance. If they are unable to tweak them to catch this and similar images, then they should manually move them to a different level of moderation. This should not be difficult: once an image receives a certain number of objections, it could be moved automatically (see the sketch below). For Google to say ‘sorry, it’s the algorithm’ is nonsensical when it’s they who created the algorithm in the first place.
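
    To be concrete about the automatic demotion, here is a minimal sketch; the threshold and tier names are invented, not any real Google mechanism. The idea is only that enough objections push an image into a tier shown solely when SafeSearch is off:

        # Invented sketch of objection-driven re-moderation; the
        # threshold and tier names are made up for illustration.

        OBJECTION_THRESHOLD = 100

        def moderation_tier(objection_count, current_tier="all_settings"):
            """Return the SafeSearch tier an image should be shown under."""
            if objection_count >= OBJECTION_THRESHOLD:
                # Hidden unless the user has turned SafeSearch off entirely.
                return "unfiltered_only"
            return current_tier

        print(moderation_tier(3))    # all_settings
        print(moderation_tier(250))  # unfiltered_only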

  9. David, I wonder how you would have had Google handle the cartoon of Jefferson and Sally Hemings. It’s not exactly parallel, because it turned out, some two centuries later, to have been a well-aimed insult, but by the standards of the time I’d say it was equally shocking.

    (I agree — weird the way the safe search switch does exactly the wrong thing here.)

  10. I’ve thought about this some more. It’s not a rule and a rule book that Google needs. They need to take this particular picture down, and do it immediately. They police their advertising in a top-down, hierarchical manner, and they manage their search results in China. Google needs to do the right thing on this.

    No need to take a side. We all know pornography when we see it, and this picture is unacceptable. She is the First Lady and, in many respects, as a woman, more important than the President.

  11. Charles, you are right. Since they already manipulate for ill (as in China), in this case they should meddle, tinker with the algorithms, whatever, for good, and remove this item from the search results.

    But I stand by my first opinion.
    As an illustration, please go to http://www.hakia.com (a beta semantic search engine) and see the results there: link.
    Then add “ape” to the query, and you will find what Google returned.

    I’m sure what Hakia returns is not specifically manipulated (for good).

    PS. I’m not related to Hakia and am not its hidden marketer :-)

  12. […] Obama” brings up a clearly racist image as the first result in a post entitled When the crowd is racist at Google. He writes that he is torn by this as he recognizes that, On the one hand, Google is taking a […]

  13. I don’t think there was a racist intention, although people who copied the image may have had those motives. That site ape-ifies everybody, regardless of race: http://celebrityape.com/

  14. I think SafeSearch is the problem, not Google image search. I *always* have SafeSearch turned off (mainly because I don’t trust Google to censor for me), and that image isn’t even on the first page.

  15. Yeah I think it should be treated as crappy search results. I don’t find the photo offensive and don’t think it should be banned or delisted, that would be horrifying. Search should reflect the web and some (a lot) of it is racist, explicit, etc.

    But point #2 is a good one. The George Bush image search has him compared to an ape, caricatured, photoshopped eating a kitten, giving the finger while holding the American flag, and so on. And “george bush monkey” is suggested as a related search.

    Aside from the still from the video where he flips the bird, I’m not convinced those are actually the most popular photos of him. I’d assume a real aggregate top 20 would be mostly boring news or promo photos, and if I’m searching for just his name, without “parody” or “monkey” or “eating a kitten” or “having his bare ass spanked by the statue of liberty,” that’s likely what I’m looking for.

  16. If Google were to do this job, why should it stop with images? There is plenty of nasty text that could turn up in search results. Surely we don’t want Google to reason that since it filters in China, it should filter in the US too!

  17. Harry, it already filters – that’s what a “Safe Search” is. They say themselves, “Use Google’s SafeSearch filter if you don’t want to see sites that contain pornography, explicit sexual content, profanity, and other types of hate content in your Google search results.” It’s just that it’s currently filtering *badly*.

  18. The hatred directed at Obama is a good example of how the innate racism of some ordinary folk can be used against them by those of a more manipulative bent.
