In 1952, the Boston Symphony Orchestra was worried about falling standards due to nepotism. They thought conductors were choosing their own students over the best musicians.
So they decided auditions would take place with a curtain between the conductor and applicants.
If the conductor couldn’t see who was playing, they could only judge on ability.
But the results were disappointing.
Pretty much the same young men were picked again.
So the musicians were asked to repeat the audition, but take their shoes off first.
When they did, the results were very different.
This time, half the selected musicians were female – previously, there had been hardly any women chosen.
They thought they were being fair by not seeing the applicants but, subconsciously, they could still tell each applicant’s sex by the sound of their shoes.
What they were listening to wasn’t the music but their own subconscious bias.
Since 1952, blind auditions have become common, and half of the top 250 orchestras are now largely composed of female musicians.
Subconscious bias also plays a large part in our era of faith in big data and algorithms.
Alongside another bias: quantification bias.
This is the tendency to value the measurable over the immeasurable.
Cathy O’Neil is a mathematician and data scientist; she wrote Weapons of Math Destruction.
She says: "Algorithms don’t make things fair, they repeat past practices – they automate the status quo."
She says the reason for this is "Algorithms are simply opinions embedded in code.
"People think algorithms are objective, true and scientific – but this is a marketing trick.
"People trust and fear algorithms because they trust and fear mathematics."
She summarises: "Algorithms are not objective – the people who build them impose their own agenda on the algorithms."
Tricia Wang is an alumna of Harvard’s Berkman Klein Center for Internet & Society.
She says: "Relying on big data alone increases the chance that we’ll miss something by giving us the illusion that we know everything."
She addresses the question: why is big data not helping us make better decisions?
She says: "Big data suffers from a context loss because big data doesn’t answer the question ‘why?’"
Big data is a $122bn industry in the US, where Wang advises companies on the use of technology.
She says: "Algorithms need to be audited, because quantifying is addictive.
"People have become so fixated on numbers that they can’t see anything outside of them."
That seems to be the problem with big data and algorithms in general.
As O’Neil says: "An algorithm is just data plus a definition of success.
"The data is gathered from the past, and whatever data is used is decided by the person building the algorithm.
"As is the definition of success."
So, far from being an objective measure, an algorithm is subjectivity plus more subjectivity.
The data used isn’t decided by a machine, nor is the definition of success.
Both are decided by flawed, biased human beings.
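O’Neil’s formula – an algorithm is just past data plus a human-chosen definition of success – can be sketched as a toy example. Everything here (the records, the groups, the scores) is hypothetical and invented purely for illustration; it is not from O’Neil’s book.

```python
# A minimal sketch of "automating the status quo": the "algorithm" is
# nothing but past data plus a definition of success chosen by a person.
# All data below is hypothetical.

# Hypothetical past hiring records: each applicant has an ability score,
# a group, and whether the old (biased) process hired them.
past_hires = [
    {"group": "A", "score": 70, "hired": True},
    {"group": "A", "score": 60, "hired": True},
    {"group": "B", "score": 90, "hired": False},
    {"group": "B", "score": 85, "hired": False},
]

def build_rule(history):
    """'Train' by copying the past: success is defined as resembling old hires."""
    hired_groups = {record["group"] for record in history if record["hired"]}
    # The learned rule ignores ability entirely; group membership is all
    # the past data rewards, so group membership is all it checks.
    return lambda applicant: applicant["group"] in hired_groups

rule = build_rule(past_hires)

# The machine cranks out results, but the bias was baked in by the data
# and the definition of success, both chosen by people.
print(rule({"group": "A", "score": 50}))  # True  - same group as past hires
print(rule({"group": "B", "score": 95}))  # False - higher score, "wrong" group
```

Nothing in the code is objective: change the historical records or the definition of success and the "algorithm" changes with them.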
There’s nothing wrong with being biased – we all are.
The only thing that’s wrong is not being aware of the bias, and not admitting it.
Because the results of the algorithms are cranked out by a machine, we think those hidden biases are facts.
Dave Trott is the author of Creative Mischief, Predatory Thinking and One Plus One Equals Three.