It seems like a lot of people are contesting the free flow of information on the web these days. With an organism as powerful, vast and fast-growing as the Internet, I suppose it was bound to happen eventually. What’s scary, though, is how, if we’re not careful, in a few years’ time the Internet may no longer be an open pool of knowledge readily available to everyone but rather a small, limited experience ‘tailored’ to our apparent needs, restricting us from easily seeing any information that might challenge our thinking.
SOPA is obviously a strong example of this – an attempt by the American government to dictate, restrict and control content on the Internet – and, although you could easily be forgiven for thinking that’s what I’m talking about here, I’m not. I am, actually and perhaps surprisingly, talking about search engines and social networks like Facebook, Yahoo! and Google – particularly Google – and how they, without telling us how or why, filter the information they provide us.
If you haven’t seen it already, you should really take a moment to watch this wonderful TED talk by Eli Pariser on the phenomenon described above, something he terms ‘filter bubbles’.
Whilst this practice isn’t quite as controversial as SOPA, it is far more subtle and just as impactful. Would it surprise you to know that Google filters every search result it serves you using a large set of algorithms that take into account your search history, your connections with friends, their search histories, and your likes and dislikes? Even when you’re not logged in, Google uses, according to Eli, 57 factors to determine which search results you should see. Yahoo! and Facebook do similar things and, after experimenting with this myself for a bit, I was quite surprised at how radically results for the same search terms can differ.
Eli even gives a very powerful demonstration of this: two of his friends searched for the term ‘Egypt’ and received massively different results – one saw results about the civil unrest in Egypt, the other about holiday opportunities. Quite a shocking disparity, really, and one that makes you wonder just what information you’re not seeing when you search on Google. Are you actually getting the best information to help you? Or are you, in fact, getting results tailored to your previous searches, likes, dislikes and general, AI-determined personality? And, if so, is that a good thing? Is it morally right? Is it even useful?
“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
It all boils down to a battle between relevance and our need – arguably even our social responsibility – to broaden our horizons beyond our current social spheres. When the Internet was first born it was small enough for all the sources of information on it to compete on an open playing field, but it has now grown so large that it’s quite impossible to present us with everything we ask for in any reasonable manner. This is Google’s reasoning for filtering our information based on our perceived personalities and social connections: the aim is to distil information into chunks that are relevant to what it believes we like to see. Yahoo! News does the same thing, as does Facebook, which filters its news feed using algorithms that determine which of our friends we interact with the most.
And this is the issue. Information shouldn’t always be ‘relevant’, and it especially shouldn’t be determined by, let’s face it, limited artificial intelligence. We need our views to be challenged; we need to be presented with information that sits outside our comfort zone and current thinking, not just to advance us socially but also to stop our brains stagnating into some sort of limited, inbred knowledge pool. Feeding us the search results Google thinks we want to see, rather than the ones we should see, is both slightly backwards and contrary to the fundamental concept behind the web itself – the completely open and honest sharing of knowledge.
At the end of the day there is no perfect solution, other than to ask search engines (and, to some degree, social networks) to be clearer and more upfront about how and why they filter the information they serve, and to implore them to take into account the open principles of the Internet and our social and moral need to be exposed to information beyond our current interests and associations. It’s the only way we will continue to develop as human beings and prevent the onset of contrived, closed thinking.