Google personalised searches and the 'echo chamber' effect?

By Grant Jacobs 30/07/2011

(Ruminating in the wee hours on google searches and echo chambers.)

The echo chamber effect describes a situation where a person or group repeatedly hears (mainly) only their own views echoed back to them.

My experience of this effect at work comes from groups touting anti-vaccine views, religious ideology, or supporting particular ‘alternative’ remedies, but it applies more widely.

One effect of repeating variants on the same answer is the reinforcement of ever more extreme versions of the original information, which can then be taken as the accepted view by the group.

Echo chambers and advocacy groups

Advocacy groups of all kinds encourage their followers to favour their views over the views of others. They’re advocating a particular position, after all.

The better groups engage fairly with other views, particularly if they might learn something new from them.

Some groups, however, favour excluding alternative views entirely, so that their group only ’hears’ what the group’s organiser wants the group to hear.

It’s particularly easy to exclude other views on-line, as most software supporting group discussion has features offering moderation or elimination of comments. The organiser is then free to remove comments that differ from or oppose their own, leaving their group ’hearing’ their own voices re-affirming their existing beliefs. (By contrast, one of the key things about science is that it operates by testing ideas, not by seeking to shore them up.)

Search for better information

Ideally searching for information should prompt testing of entrenched ideas, provided reliable sources are identified and used. This brings up how to identify reliable sources – one reason I wrote Sources for medical information for non-medics and non-scientists. I’m not going to get into that topic, unfortunately, as I have a different axe to grind.

Distribution of IQ claims on google. Source: xkcd #715

A common phrase suggesting someone else search for information–if a little too often abused–is ’Google it.’

Google results are neither curated nor vetted for reliability, but potentially it’s worse than that.

The google search personalisation feature may be creating an echo chamber effect for individual users.

Are google personalised searches creating an echo chamber effect?

A little while ago I was reading an article by Frank, Google: how to un-personalise your search, that introduced an article by Cyrus Shepard suggesting how to remove google’s ‘personalisation’ feature from its searches.

The personalised search feature aims to offer results based, in part, on the user’s past search results and accesses. It’s ‘on’ by default and, as Frank’s blog post points out, getting rid of it isn’t simple (for those who are not computer-geeks).

Describing the effect of the personalised search feature, Frank quotes Cyrus Shepard ’Every new search result starts to look like the search before. Our ideas become isolated and homogenized,’ […] (see his blog for the full quote).

I’d like to extend this observation a little in the direction of the echo chamber effect.

At that time I found myself attempting to offer correct information to a small number of people with views opposing vaccination on another forum. It’s maddening how they persist with inaccurate information, even after you offer more reliable sources.

I was left ruminating on whether google is worsening the problem by guiding people to sites they’ve previously used and articles with similar content, effectively erecting an echo chamber of sorts.
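To make that feedback loop concrete, here is a deliberately simplified sketch. The scoring rule, boost value, and site names are all invented for illustration – this is not Google’s actual algorithm – but it shows how boosting previously-clicked sites can lock a ranking onto one source:

```python
from collections import Counter

def personalised_ranking(candidates, click_history, boost=2.0):
    # Toy model: score each (domain, base_relevance) result by its base
    # relevance plus a boost proportional to how often the user has
    # clicked that domain before, then rank highest-scoring first.
    history = Counter(click_history)
    scored = sorted(candidates,
                    key=lambda dr: dr[1] + boost * history[dr[0]],
                    reverse=True)
    return [domain for domain, _ in scored]

# Hypothetical result pool: (domain, base relevance to the query).
pool = [("site-a", 1.0), ("site-b", 0.9), ("site-c", 0.8)]

clicks = []
for _ in range(10):
    top = personalised_ranking(pool, clicks)[0]
    clicks.append(top)  # the user clicks the top result each round

# Once a site has been clicked, its boost keeps it on top:
# the ranking never recovers the other sites.
print(Counter(clicks))
```

Even with near-identical base relevance, the first site clicked dominates every subsequent round – the ‘echo’ in miniature.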

In becoming, in part, a ’social media’ company, is google encouraging a world of cliques with divisive, entrenched views?

Would it be socially responsible for google to have this feature off by default?

(There must be a thesis in this for someone willing to investigate this in depth – ?)

A related problem is localisation affecting, for example, what news google news offers you.[1] Here, results are based in part on where you are physically located. Related thoughts have been directed at the new social networking tool, Google+.

Philosophically, it feels like these solutions favour a feudalisation of information, with all but the savvy increasingly seeing information that is geographically or topically ‘local’, or both.[2]

Used as a way to relocate things previously found, or to find things similar to those found before, it’s not bad – I appreciate it for that.

Nevertheless I miss ‘driving’ Altavista, making the search engine ferret out what I want. Personalisation is a ‘lazy user’ option that has its uses, but in principle it impedes a proper search for fresh information.

You might quibble over the extent that search personalisation affects those with scientifically unsound views.[3] Other factors may dominate. Social pressures are one obvious influence, one you can readily observe if you visit (public) groups of, say, anti-vaccine advocates. There’s a lot of pressure for individuals in the group to toe the group line.


Cyrus’s article and the comments following it offer some solutions.[4] A problem I have with these is that they involve being aware that there is a problem in the first place and working around the default settings. I can’t imagine that more than a minority would do this.

One solution would be to use the Altavista search engine. Altavista was ‘the’ search engine before Google stole the show. (Before that, internet directories–like Yahoo–reigned, and before that, gopher.) I suggest using its advanced search, which lets you tailor what you look for. Or perhaps Bing, but it doesn’t seem to offer ‘advanced search’ options.

My own suggested solution, however, would be for web browsers to offer unpersonalised search by default in their search boxes, or at the least some solution that clearly indicates the search is biased by your past record and can be switched off instantly.[5] The plug-ins Cyrus discusses are a step in this direction, but to reach the typical end user the solution needs to be built-in, as they’re unlikely to make the effort of installing a plugin.

Could Apple (Safari), Mozilla (Firefox), Opera, Microsoft and so on step up to the plate? (I’ve left the Chrome browser, developed by Google, out of this line-up.) Perhaps Opera can lead the way as it does so often in bringing new features to the web browser world?

An alternative approach might be for Google to offer an ‘ignore previous results’ option on the main page (that is, not buried down in advanced features or hacking the search URL).

Impact on science communication

While this potentially has wide implications, my interests are with science communication.

Google isn’t doing us any favours if it encourages those with errant views to limit themselves to those erroneous views.

People have come to rely on google to show them what’s out there on-line. Showing people ’more like you’ve seen before’ isn’t the same as ’what’s out there’.


Loose thought: although we typically think of Google as a search + media (etc.) business, you might view them as an advertising company, as that’s the principal source of their revenue.

1. Localisation. This is another issue, but consider the case where you want to see what, say, consumers in the USA are being offered about the demise of News of the World. By default google news redirects me to the New Zealand site, stymying efforts to see what international readers elsewhere see.

2. Now there I am giving away a plot for a near-future sci-fi novel… search engines driving the world into divisive cliques… Scary, huh? An information-based counterpart to religious wars or other cultural conflicts. (I’m reminded of, many years ago, crossing the land border between Pakistan and India and noting the difference in newspaper reporting of the same days’ events in the two countries. That border crossing was an interesting experience, but as sciblogs isn’t a travel blog collective, I probably shouldn’t go there.)

3. On a related note I’d like to test the extent to which personalisation sways google searches–as a scientist ought to–but I simply haven’t time.

4. Solutions to avoiding personalised searches. I’ve tried manually adding ‘&pws=0’ to the end of google searches, seemingly without much success.
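For what it’s worth, the ‘pws=0’ trick amounts to adding one query parameter to the search URL. A minimal sketch of building such a URL (the parameter name is Google’s; whether Google still honours it, then or now, is another question):

```python
from urllib.parse import urlencode

def depersonalised_search_url(query: str) -> str:
    # Build a Google search URL with pws=0 appended; 'pws=0' was
    # reported to ask Google to skip personalised web search.
    params = {"q": query, "pws": "0"}
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalised_search_url("echo chamber effect"))
# e.g. https://www.google.com/search?q=echo+chamber+effect&pws=0
```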

5. Safari 5.1, for example, will offer you the keywords you’ve previously used as you type into the search textbox, going some way towards relocating information you’ve looked for before.

Other articles in Code for Life:

Should we teach examples of scientists falling for unscientific practices?

A course for all degrees: PHIL 105, Critical Thinking

Web browsers (part 1)

Web browsers (part not-quite 2)

Reproducible research and computational biology

8 Responses to “Google personalised searches and the 'echo chamber' effect?”

  • Great blog, Grant.

    I’m disappointed to admit that when Google started personalising my searches I never really thought about it, other than just thinking “this is useful”.
    Your concerns are very valid particularly with regards to science communication. I will have to seriously look at some of the suggestions you have made.



  • Thanks, Michael.

It is useful in some ways; it’s just that there are also times when you definitely want an ‘unbiased’ search.

I was reading earlier about Facebook, etc., providing a means for those with very odd views to find each other, where in a previous era they’d probably have gone a lifetime without ever meeting people with a similar viewpoint. (The article was about the bomber in Norway.) There’s both good and bad in the way these media bring people together too.

    (Hmm… I’m beginning to sound like Prof. Robert Lord Winston with his cautions about the bad side of otherwise good technologies!)

  • Hi, I just found your blog post. I’m actually doing my science communication thesis around this topic! I agree with all of the above and …the results I’m getting are not what I expected!

  • “3. On a related note I’d like to test the extent personalisation sways google searches—as a scientist ought to—but I simply haven’t time.”

    This is exactly what I’m testing and I’m looking at it in relation to the search for scientific information online, specifically testing for political bias on the part of the user according to their internet history.

  • Marie,

Great to learn that someone is working on this. I notice in your tweets that you have an on-line survey. You’re welcome to post a link to it here if you think it’d help you. (I’d do it myself, but as this source will have a bias, I’d rather not disturb what you’re doing.)

    “the results I’m getting are not what I expected”

    Guess we’ll have to wait until your survey is complete, but I’d be interested to learn in what way your results aren’t what you expected.

    I didn’t realise Bing used a personalised search approach (as you indicate in your survey questions); I’ll try remember to update the article later today.

I wrote the article above almost two years ago now. Today, as I was reading about the impending demise of Altavista, I was alerted to DuckDuckGo. I have to admit at first I took the name to be some sort of geek joke that I was not yet in on. It proves to be an internet search engine offering anonymous searching with (claimed!) benefits of, among other things, avoiding the ‘filter bubble’ – another term for what I described as the echo chamber effect. The wikipedia entry on filter bubbles includes accounts of others’ attempts to assess to what extent personalising a search affects the results. (I’ll assume they won’t suddenly get ripped out by a Wikipedia editor!)

    All worth thinking about.
