We have found ourselves in a paradox. The internet, champion of open and shared knowledge, is simultaneously keeping that knowledge from us. How? By manipulating our search results - and, ultimately, how our opinions and ideas are formed.
Eli Pariser, chief executive of Upworthy, identifies this as the ‘filter bubble’. Algorithms used by the Googles and Facebooks of the world filter results based on our search history, previous dwell time, location and even gender. In other words, they control what we’re exposed to, deciding on our behalf and often without our knowledge.
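The mechanism can be pictured with a toy sketch - nothing like the real, proprietary ranking systems, and every name and scoring rule here is purely illustrative. It simply boosts results that overlap with a user's past interests, which is enough to show how two people typing the same query can see very different orderings:

```python
# Toy illustration of personalised ranking (NOT any real Google/Facebook
# algorithm): results that overlap with a user's search history are
# boosted, so identical queries produce different orderings per user.

def personalised_rank(results, history):
    """Sort results so topics from the user's history float to the top."""
    def score(result):
        # Count the words a result shares with the user's past searches.
        overlap = len(set(result.lower().split()) & history)
        return -overlap  # more overlap ranks earlier
    return sorted(results, key=score)

results = [
    "egypt revolution protests news",
    "egypt holiday packages pyramids tours",
]

traveller = {"holiday", "flights", "hotels"}
news_reader = {"protests", "politics", "news"}

# The same query, two different top results:
print(personalised_rank(results, traveller)[0])   # the holiday result
print(personalised_rank(results, news_reader)[0]) # the revolution result
```

That tiny overlap score is the whole trick: neither user asked to be filtered, yet each sees a different internet.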
While this seems perfectly logical - the internet is, after all, a plethora of content that can be difficult to navigate and make sense of - I feel a sense of chaos is essential. When we start to control the internet’s content, we compromise its inherent idea of discovery and our influences begin to be manipulated.
Pariser demonstrates the glaring consequences of this in his TED talk by comparing the search results of two friends Googling Egypt. One friend’s results focused on holiday and tourist information, while news of the revolution took precedence for the other.
What’s worth noting is that the experiment was conducted at a time when the revolution was the focus of world news. For the first friend, Google’s algorithms failed to understand the context or timing of the results, making those results flawed.
As far as we know, filtering like this is currently happening on a small scale but the implications remain.
If you live in a filter bubble all the time, when will you be exposed to opinions or content that challenges and inspires you? At a digital strategy and creative agency like ours, ideas are what differentiate us. How will we continue to innovate if we’re constantly exposed to the same tailored influences?
"Ideas occur when dissimilar universes collide." (Seth Godin)
Manuel Lima, senior UX design lead at Microsoft, argues in his lecture The Power of Networks that a small understanding of a broad range of topics is far superior to detailed knowledge of one area. Broad knowledge networks are what stimulate these collisions of universes, sparking ideas. Creative thinking is, after all, the ability to form connections between seemingly unrelated pieces of information that others might miss.
In their efforts to make our results more personal, these algorithms not only filter out the topics that would broaden our knowledge networks; they also have the potential to stifle our ability to think creatively.
"Making the most of data, without becoming enslaved by it." (Ajaz Ahmed, Velocity)
So what’s the impact on marketing? Well, while the filter bubble potentially provides the solution to pin-sharp targeting, it can also make it difficult to attract new customers.
Once people go into their bubble, it can be hard for brands or products to make a connection if the algorithms decide - rightly or wrongly - that the customer won’t be interested. And that’s before you start to consider the power an enormous brand like Coca-Cola could wield if it decided to follow people at every step of their lives, using its financial might to block out all others.
An exaggeration? Maybe. But as anyone who’s had an item they viewed torment them all over the internet will testify, it feels like we possess little control over how and when these algorithms operate. Currently, they feel as though they’ve been created solely to make banner ads more targeted rather than insightful.
In effect, our inability to control the filter bubble means we become enslaved by it. A discussion with colleagues revealed that none of us were prepared to delete our cookies, but neither did we want to be controlled by them. Why should it be an all-or-nothing situation?
Firstly, we need to be made aware of when our results are being filtered. Simply knowing when the bubble is operating will give us the power to use it at a time most beneficial to us.
For example, if you’re searching for gift ideas, it’s not useful to get suggestions tailored to your own tastes.

Secondly, we need to be able to adjust the level of filtering and reintroduce a feeling of serendipity to search.
This means serving up content that challenges our opinions - even using those algorithms to find the most opposing view related to our search.
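One way to imagine such a control is a "serendipity dial" that blends tailored results with deliberately un-personalised ones. This is a hypothetical sketch, not a feature of any real search engine; the function and its parameters are made up for illustration:

```python
# Hypothetical "serendipity dial": blend a personalised ranking with an
# unfiltered one. serendipity = 0.0 gives the pure bubble; 1.0 gives a
# fully unfiltered feed. All names here are illustrative, not a real API.

def blend(personalised, unfiltered, serendipity, total=10):
    """Fill `total` result slots from two ranked lists.

    A `serendipity` level between 0 and 1 decides how many slots go to
    content the algorithms would normally have filtered out.
    """
    n_wild = round(total * serendipity)  # slots for challenging content
    n_safe = total - n_wild              # slots for tailored content
    return personalised[:n_safe] + unfiltered[:n_wild]

tailored = [f"tailored-{i}" for i in range(10)]
challenging = [f"challenging-{i}" for i in range(10)]

# A 30% serendipity setting: seven familiar results, three challengers.
print(blend(tailored, challenging, 0.3))
```

The point isn't this particular arithmetic; it's that the dial belongs in the user's hands rather than the algorithm's.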
People should always be actively encouraged to indulge in different tastes and interests as and when they feel like it. This freedom is fundamental to the success of business models such as Spotify and LoveFilm.
These brands have given consumers the platform for limitless discovery. In the same way, search results and social networks should give consumers the freedom to keep discovering.