October 16, 2014

Usability and ease are killing the open web – Adapting to the Filter Bubble – Algorithms vs. Humanity

I gave a talk last night at Hyper Island in Singapore. I had been left with a bad feeling in my stomach about what I have been doing to the web over the past few years. My job is to figure out how the mechanics of online algorithms and networks work, and then use them, sometimes abuse them, to get marketing results for brands.

It sounds horrible when I say it that way, but that is the way it is. By understanding how networks work, how humans behave, and which algorithms govern that behaviour, I have been able to produce some fantastic results. Modern-day marketing. Growth hacking. Viral marketing. It all builds on the same notion, just as old-school creative, emotional advertising hacked the emotions we guide ourselves by.

My presentation yesterday was about algorithms and how they filter our worlds based on what we like. The frictionless web, where we enjoy shit so much that we never have to be disappointed. The flip side is that we do not see things that oppose our current beliefs about the world. We are only fed things that confirm what we already believe.

What I didn’t know at the time was that someone – Eli Pariser (thanks for the tip, @infotology) – had thought about exactly the same thing, but three years earlier. He had given it a nice name: the “filter bubble”.

What Eli predicted is now coming true

He saw the first signs and predicted that this would have an impact. Today, we are actually starting to see how algorithms affect everything from journalism to politics. Real shit that affects us on a very broad scale.

Questions such as: why is racism growing in Europe, why do some people not believe in the effect of our pollution on global warming, and why can IS reach their intended audience without being detected? The filtered web confirms bad ideas too. If the algorithm senses that you are into something such as racism, it will start showing you more racist content, confirming your idea that racism is the way forward, and thus cultivating that idea into a strong belief.
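To make the mechanic concrete, here is a minimal, hypothetical sketch in Python of the loop I am describing. The two content categories, the engagement rates and the update rule are all assumptions made up for the illustration; no real feed works exactly like this, but the self-reinforcing pattern is the same: whatever gets liked is what gets shown, and whatever gets shown is what gets liked.

```python
import random

# Assumed engagement rates for the illustration: the user likes most content
# that confirms her views and very little that opposes them.
LIKE_RATE = {"confirming": 0.9, "opposing": 0.1}

def build_feed(mix, size=20):
    # Fill a feed of `size` items; each slot shows a "confirming" item
    # with probability `mix`, otherwise an "opposing" one.
    return ["confirming" if random.random() < mix else "opposing"
            for _ in range(size)]

mix = 0.5  # the ranker starts out neutral
for round_number in range(6):
    feed = build_feed(mix)
    likes = [item for item in feed if random.random() < LIKE_RATE[item]]
    if likes:
        # The ranker's only signal is what got liked, so the next feed
        # simply mirrors the mix of the likes it just observed.
        mix = likes.count("confirming") / len(likes)
    print(f"round {round_number}: "
          f"{feed.count('confirming')}/{len(feed)} confirming shown, "
          f"next mix {mix:.2f}")
```

Run it a few times and the feed drifts towards showing almost nothing but confirming items within a handful of rounds. That drift is the bubble in a nutshell.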

And I truly believe that these pleasing algorithms, demoting opposing opinions, are a big part of this.

The worst thing is the mainstream media. The old gatekeepers of information are starting to adapt to the world of likes, producing clickbaiting, like-hunting content and headlining their websites with gossip, entertainment and Buzzfeed-style headlines.

Yes, I have a bit of a tin foil hat on when I discuss these things, but as a hypothesis it would be interesting to look deeper into it.

Why usability is killing the web

In a sense, no one dislikes this development.

The technology companies producing the services get more popular if they adapt their algorithms to show things people like. The people receiving the algo-edited information like it more than having to scroll through things that are “irrelevant” to the urge to answer a quick question. I love it, as I can predict how to shape information in order to get into the news feed of a particular person, how to get on someone’s radar.
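The marketer’s side of that mechanic can be sketched just as simply. The profile, the candidate posts and the scoring weights below are invented for the example, and real feeds score on far more signals than this, but the principle is the same: if you know roughly what the ranking rewards, you can pick the piece of content most likely to land on a specific person’s radar.

```python
# Hypothetical target profile and candidate posts, made up for the example.
TARGET_PROFILE = {"interests": {"running", "craft beer", "startups"},
                  "friends": {"anna", "jonas", "maria"}}

def feed_score(post, profile):
    # Score a candidate post the way a like-optimised feed might:
    # topical overlap with the target's interests, boosted by social proof
    # from the target's friends. The weights are assumptions.
    topic_overlap = len(post["topics"] & profile["interests"])
    social_proof = len(post["liked_by"] & profile["friends"])
    return 2 * topic_overlap + 3 * social_proof

candidates = [
    {"name": "generic brand ad", "topics": {"sale"}, "liked_by": set()},
    {"name": "running shoe story", "topics": {"running"}, "liked_by": {"anna"}},
    {"name": "startup beer event", "topics": {"startups", "craft beer"},
     "liked_by": {"jonas", "maria"}},
]

best = max(candidates, key=lambda post: feed_score(post, TARGET_PROFILE))
print("post most likely to reach the target:", best["name"])
```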

No one really has an incentive to change things. We all have an incentive to keep this going. However, my question right now is: do we have a responsibility to do something about it?
