Strong Silent Types: Evil Robots and Their Way with Words

The Filter Bubble is Getting Stronger Every Day, and You Still Don’t Know You’re in it

This week, Dazed & Confused's online editor asked me to write a speculative piece on the future of the web. The ideas and predictions I came up with ranged from immersive telepresence to Internet architecture being blasted into outer space.

But the one element I discussed which has got most of my friends and peers talking is the idea of deep and extensive content personalisation. That is, the logical future extension of what Eli Pariser discussed in his 2011 book, The Filter Bubble: What the Internet is Hiding From You.

Watching Pariser's TED talk on the same subject was one of the most mind-blowing moments of that year for me. In a short, nine-minute clip (below), Pariser stated a simple yet earth-shattering fact: Google changed the search results you got, radically, based on what it knew about you. And depending on the sort of person Google thought you were, a search term like "Egypt" could bring up a string of links about anti-government protests, or nothing about those events at all.

And the really insane thing was that this just happened without people realising. You didn't have to "opt-in" to tailored search results, you just got them. It wasn't even as if Google just analysed the things you had plus-one'd on your Google+ profile when they altered searches. At the time of Pariser's talk, Google+ hadn't even launched yet.

The whole point about Filter Bubble fear is precisely that we likely don't know when we're being fed personalised information and when we're not. Companies like Facebook and Google use Filter Bubbles chiefly because they know that if they feed you the "right" stuff, you spend more time on their sites. You click on more links. You look at more ads.

You make them more money.

It's a simple formula, and that logic is why the increasing complexity and prevalence of filter bubbles is propelled by powerful commercial incentives. Why wouldn't the tech giants do this stuff? We, collectively, are out there using their services all day every day, proving that it's incredibly effective.

As Pariser noted in his book: "You don't get to create your world on your own. You live in an equilibrium between your own desires and what the market will bear. And while in many cases this provides for healthier, happier lives, it also provides for the commercialisation of everything [...]."

While that assertion remains true (there's no sign of it being upturned any time soon), the limits of clandestine personalisation are only the limits possessed by technology itself. In order to tailor content for a web user, you need two things: data that identifies them in detail, and algorithms which quietly and efficiently use that data to alter what the user sees when they visit a given site.


In June, Yahoo! News re-launched their homepage. As part of that re-launch they announced an increased focus on "personalisation." The content of stories still - as far as I can tell - remains the same for all users, but the weighting can be heavily customised.

Mike Kerns, Product VP at Yahoo!, said in a blog post at the time: "Yahoo! News will get smarter over time - the more you use it when signed in with your Yahoo! ID, the more it learns about your preferences, creating a personal news hub just for you."

In addition, a friend alerted me to Gawker's plan to allow readers to edit headlines themselves. Although this at least necessitates active involvement from an audience, the continued prominence of personalisation in general is likely to encourage the arrival of yet more covert examples.

When, in my piece for Dazed and later on Twitter, I brought up the idea of an online newspaper that silently alters the very wording of articles in order to appeal to a particular reader's politics, I got some comments doubting that it could ever happen so seamlessly.



But as I pointed out in reply, content personalisation is already happening on a large scale and we never question it. You didn't quit Google two years ago, did you?

One of the chilling points made by Pariser in his Filter Bubble book is that constant and unasked-for personalisation essentially leads to "a static conception of personhood." And, again, we all like to believe that we don't see our personhood as static, but we all constantly fall into the trap of maintaining that stasis.

A few weeks ago I had something of a rant on Twitter in which I decried the practice of unfollowing those whose views you disagree with. I had just met someone, in fact, who told me a story about making friends with a guy she later found out had right-wing views which jarred sharply with her own politics. "I am still friends with him on Facebook," she said, "because I think it's good to sometimes expose yourself to ideas that challenge your worldview."

Where is the infrastructure to support that kind of attitude online? The recommendations, suggested likes and links, filtering itself and procedures for following and unfollowing are all geared towards the opposite mantra: "I'm going to focus - solely - on the stuff that gratifies my own interests and desires." At the time of my rant, I called it "The Elective Filter Bubble."

By definition, that's not something that has been pushed onto us; it's something we exhibit naturally and encourage. Indeed, it's simply a demand which is now meeting new bounties of supply.

So, knowing that we, as a species, are inclined this way, is it any wonder that the Filter Bubble exists? Or that it is becoming more powerful with each new app update and data stream?

No. But it's been two years since Eli Pariser's book came out, and I still don't think we're asking the right questions yet.


Photo: "1963... lady, bubble, river - Melvin Sokolovsky" by James Vaughan. Reproduced under a Creative Commons (CC) License.

Photo: "Eli Pariser - PopTech 2010 - Camden, Maine" by PopTech. Reproduced under a Creative Commons (CC) License.








