Programmatic myopia

I recently saw a frightening TED talk by Eli Pariser.

Eli is a progressive, liberal thinker.

But he likes to expose himself to what conservatives are thinking.

So his Facebook feed included input from both sides.

Then, one day, he noticed there was no input from conservatives – it had simply dried up.

His feed was just filled with input from other progressive thinkers.

Eli wondered what had happened.

Had all the conservative thinkers stopped posting?

Had most of the world come around to his liberal views?

But Eli found out that wasn’t the case.

Conservatives were posting as much as ever, but Facebook had decided he shouldn’t see their posts, so it filtered them out.

Facebook has an algorithm that tailors everyone’s feed to what it thinks our preferences are.

It does this by measuring what we click on most.

Then it maximises that.
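To make that concrete, here is a minimal sketch (not Facebook’s actual code, which is private — the source names and numbers are invented) of the kind of click-maximising filter being described: rank posts by how often the user has clicked that source before, keep the top few, and silently drop the rest.

```python
# A toy click-maximising feed filter. Hypothetical data throughout.

def filter_feed(posts, click_counts, keep=3):
    """Keep only the posts from the sources a user clicks most.

    posts        -- list of (source, headline) pairs
    click_counts -- dict mapping source -> this user's past clicks
    keep         -- how many posts survive the filter
    """
    ranked = sorted(posts, key=lambda p: click_counts.get(p[0], 0),
                    reverse=True)
    return ranked[:keep]  # everything else is filtered out, unasked


posts = [
    ("progressive_blog", "Post A"),
    ("conservative_blog", "Post B"),
    ("progressive_blog", "Post C"),
    ("news_wire", "Post D"),
]
# This user has never clicked the conservative source...
clicks = {"progressive_blog": 40, "news_wire": 5, "conservative_blog": 0}

feed = filter_feed(posts, clicks)
# ...so its post never makes the cut, and the user is never told.
```

The loop is self-reinforcing: what you click shapes what you see, and what you see shapes what you can click.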

We don’t decide what gets in.

We don’t decide what gets left out.

We don’t even know it’s happening because we don’t get asked.

We just assume that what we see is a fair representation of everything that’s happening in the world.

We don’t know that an algorithm has decided what we should and shouldn’t see.

Before we even click, Google uses 57 different measurements to tailor our content. 

Including what computer we’re using, what browser we’re on, what location we’re in.

That’s 57 measurements before we even start clicking.

Eric Schmidt, executive chairman of Google’s parent company, said: "It will soon be very hard for people to watch or consume something that has not in some sense been tailored for them."

And it’s not just Facebook and Google, it’s Yahoo News, The Huffington Post, The Washington Post, Amazon, Netflix, Flipboard, everyone.

A journalist asked Mark Zuckerberg why they did this. 

He said: "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa."

And an algorithm will decide that for you.

The main problem for me is choice.

When I had a "favourites" button on Sky TV, it worked fine because I chose the favourites.

I knew there were other choices if I wanted them but, with Facebook and Google, I don’t know that.

Because I’m not consulted.

An algorithm does it without even asking me.

So I think I’m getting all the information there is when actually I’m only getting a small sliver.

But I believe that sliver is the whole world.

We are always sold things like this as if it’s done for our benefit.

"To give our consumers an enhanced viewing experience."

But we know that isn’t true.

It’s done purely for commercial benefit.

The narrower the range of media we consume, the more easily we can be packaged and sold.

The more targeted we are, the more valuable we are as units.

Welcome to the world of big data and programmatic.

Welcome to the world of Trump and Brexit.

Where everyone in the urban liberal elite believes their news feed and thinks everyone else thinks exactly like them.

Welcome to the world of technology making us more ignorant.

Dave Trott is the author of Creative Mischief, Predatory Thinking and One Plus One Equals Three.
