The default tagline for Facebook’s Year in Review app declared: ‘It’s been a great year! Thanks for being a part of it.’ Yet, of course, among its 1.23bn monthly active users there were those whose experiences had been anything but great.
For Eric Meyer, a web-design consultant and writer, 2014 was the year in which he lost his daughter to brain cancer on her sixth birthday. It was her picture that automatically became the lead image in Facebook’s latest attempt to drive engagement.
In an eloquent and thoughtful post entitled ‘Inadvertent Algorithmic Cruelty’, Meyer wrote: "My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully. And I know, of course, that this is not a deliberate assault.
"This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party, or whale spouts from sailing boats or the marina outside their vacation house."
He concluded: "Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person ‘thoughtless’ is usually considered [an] insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves."
This thoughtlessness has big implications for brands. Ben Silcox, chief data and digital officer of agency Havas Helia, says three issues affect the algorithmic ecosystem: its immaturity, lack of transparency and inherent bias. "There is a fundamental flaw in the way algorithms are written," he warns. "They were written as an individual experience, but are now a media platform."
In the marketing sphere, where brands have established frameworks and guidelines designed to service a now irrelevant traditional media ecosystem, there remains much to learn. When a seemingly banal piece of coding has the power to create an emotional response in consumers, brands must tread carefully.
Simon James, vice-president global performance analytics at digital agency SapientNitro, notes that brands are exploring and innovating in a medium that didn’t exist a decade ago. As such, a certain amount of ‘experience-based learning’ is taking place.
"I don’t believe that brands think of ethics and algorithms in the same sentence," he says. "The vast majority of brands, regardless of what they might say publicly, have no wish to be the first to do anything – purely for the reasons of reputational risk. Your appetite for growth is tempered by your appetite for risk. Learning in our industry rarely comes from academia, it comes from trial and error – hopefully, somebody else’s error."
Digital capitalism has brought with it a quagmire of ethical issues; with the big data that promised marketers the Earth has come the sharp gaze of public scrutiny. The barrage of negative publicity that accompanied the launch of Facebook’s Year in Review product is one in a list of very public failures.
Simon Hathaway, president and global head of retail experience at agency network Cheil, says algorithms have moved from the business sphere into the public consciousness. "A year ago marketers would say consumers were willing to give away their data for greater personalisation; now there is more debate about data’s role and [how far] people should be tracked and targeted."
To date, when faced with the complex ethical and social issues churned up by our algorithmically powered digital ecosystems, the industry has been content to conclude that the solution lies within – specifically, in relying more on humans or, preferably, on big retainers to specialist agencies. It is a self-comforting argument, but one that fails to grasp the true complexities of our ever-evolving digital environment.
The industry’s tendency to see any algorithmic conundrum as being solved by the human touch ignores the fact that computer algorithms are intimately tied to people. When we think of analysing consumers’ purchasing habits, we need to focus on the marketers and programmers behind the algorithms. In fact, some of the most complex tasks, such as content moderation, remain firmly in human hands.
Marketers’ ability to use social media as a mass-market ad platform relies heavily on social networks’ ability to police the borders of their content. This is not something that companies such as Facebook and Twitter have outsourced to an algorithm.
Instead, they rely on an army of content moderators who watch and remove horrific content daily. Many employers offer them counselling, but, while you can wipe a hard drive, the human memory does not have a simple erase function; nobody knows what long-term impact this work will have on the moderators.
The outsourcing of content moderation has the potential to inflict significant reputational damage on the digital marketing industry. "It is a huge issue and a fundamental one; just like the sweatshops in fashion that delivered cheap clothing, we are now facing the same pressures in digital outsourcing," explains Hathaway.
It is to marketers’ credit that brands are not following the example of the platforms in outsourcing social business. James Whatley, social media director at Ogilvy & Mather London, says that, in theory, social-media management could be outsourced in the same way that call centres are, but brands are too nervous. "From a brand level, I don’t see outsourcing happening, but from a platform level, we don’t see that level of content moderation and I don’t think people are aware of who is doing what," he adds.
Despite the myriad issues over the ethics of how algorithms are deployed, some agencies are yet to be convinced that it is a consumer issue. Amy Kean, head of futures at Havas Media, believes the Facebook debacle is a "sensationalist non-story that once again presented a global business as the bad guy".
She explains: "We have to remember that media owners who report on these events make money on advertising from consumer brands, so The Guardian and Daily Mail, for example, have every reason to criticise the approach of websites more popular than their own. If stories about ‘cruel algorithms’ decrease consumer and brand trust in a platform, then marketing money may shift."
Yet such an approach risks leaving brands exposed to criticism. Facebook did not set out to upset or offend its users, but the result was the same. When brand messages are served up alongside upsetting or disturbing content, marketers can quickly find themselves in the firing line.
Jason Carmel, global senior vice-president of marketing services at creative agency Possible, says an arrogance exists in the industry. "We are conflating the fact that people are sharing more information as meaning that they don’t want privacy."
Facebook is, no doubt, in a unique position; outside the grocery and ecommerce giants, not every brand has an unending stream of data. Yet most do advertise on social networks that are making greater use of algorithms to determine what users see. "Brands are already doing risk assessments, but you have to manage the competing forces of potential brand damage against all the effectiveness you can achieve," explains Robin Grant, global managing director of social-media agency We Are Social.
A new framework: the information economy
Marketers are operating at a turning point: while their ability to collect data has risen exponentially, their ability not just to interpret that data, but to understand how to use it ethically, has not. Havas Helia’s Silcox says that while the EU’s controversial ‘right to be forgotten’ ruling ignited the debate, simply talking about the issue is not enough. "There is a need as an industry for a broader consensus. We need a trade body or independent body – an FSA for data."
Luciano Floridi, director of research and professor of philosophy and ethics of information at the Oxford Internet Institute, who advised Google on the ‘right to be forgotten’ ruling, has argued that we need to build a new framework to understand the philosophy of information and meet the ethical challenges of the digital age.
In line with this, we need to shift our understanding of ‘the self’ to incorporate data: in essence, your personal data would be considered as much a part of who you are as your limbs.
As Floridi put it in an interview with Pacific Standard: "I can sell my hair but I can’t sell my liver." Similarly, kidneys can be donated, but not legally bought or sold.
With regard to our bodies, we have arrived at broadly accepted norms, but in terms of data, we are nowhere near. This presents a complex moral minefield for marketers, many of whom have been seduced by the promise of big data with little thought of the ethical implications.
The only way is ethics
Marketers could be forgiven for wanting to place these issues to one side and focus on more tangible goals. In the current climate, however, such a stance is not sustainable. Rightly or wrongly, brands have been appointed the moral guardians of our time.
Consider the outcry when it was reported that convicted rapist Ched Evans was poised to sign for Oldham Athletic; the football club’s sponsors found themselves at the centre of the storm.
Even brands operating on the periphery felt compelled to get involved. Nando’s sole involvement with the club was providing chicken-based prizes to fans on match days; nonetheless, it still publicly distanced itself from the scandal with a tweet declaring: "We’d have liked to continue our involvement with fan prizes but feel we can no longer continue our association."
The speed and scale of the backlash provides a telling lesson for brands on being in the wrong place at the wrong time, whether because an algorithm has served up your ad next to some upsetting content on Facebook, or an organisation with which you have links is considering employing a convicted rapist.
"Brands are operating in the same space as consumers. They are at the behest of platforms such as Facebook," explains Whatley. Nonetheless, consumers have cottoned on to the idea that brands have a say.
"All the way down to Nando’s saying it won’t sponsor Oldham, to concerns over content on Facebook, the pressure is on brands. Yet Twitter and Facebook are nascent ad platforms, and brands can only put so much of the blame on the platforms."
This doesn’t mean brands can duck the issue or simply defer to the platforms when algorithms go awry. "When it comes to algorithms, brands don’t have the control, but they still need to be true to the brand values and take a stand," adds Whatley.
Beware of the ‘filter bubble’
Brands and marketers find themselves at the sharp end of the shift in how information is flowing online. As Eli Pariser, chief executive of Upworthy, a website for viral content, explains in his book The Filter Bubble, it will soon be hard for people to watch or consume something that has not in some sense been tailored to them. While this has empowered marketers with a high level of targeting, with the election on the horizon, the potentially corrosive or misleading impact of this filter will be in the spotlight.
Marketers must beware of being seduced by the possibilities of mass personalisation and question the ethics of the algorithms they employ. The small-font legalese of the terms and conditions relied on by the world’s biggest technology companies, often referred to but rarely read by consumers, does not constitute informed consent.
This is not a question of human vs machine, or simply of more human oversight for algorithms, but one of building a new information architecture driven not just by ethics, but recognition that serendipity and discovery are sacrosanct. In Pariser’s words: "We need the internet to connect us all together, introduce us to new ideas and new perspectives and it’s not going to do that if it leaves us all isolated in a web of one."
Mind the serendipity gap
With marketing personalisation in the ascendancy, more consumers are craving the unpredictable.
Personalisation has long been a lynchpin of marketing, and much of the promise of big data has been built on the ability of brands to better target consumers.
Yet there exists a growing concern among Millennial consumers over their perceived lack of control and the lost art of discovery.
Brands such as O2 are attempting to address this issue by engineering more serendipity into their algorithms. Richard Pentin, strategy partner at marketing agency LIDA, which works with O2, says that doing just this is a necessity for brands.
"Just because the data tells you a consumer is into this, that or the other, it doesn’t mean you have to constantly talk about the same thing," he says. "So a degree of human intervention is often necessary.
With O2 Priority we know from ticketing data that some customers love certain [non-country] music genres, but that doesn’t mean they’re not also partial to a bit of Dolly Parton."
Elsewhere, premium fashion brand Miu Miu ran a campaign featuring Somebody – one of a new generation of social apps focusing on driving random interaction.
The app enables users to connect with strangers who act as their ‘stand-in’ to deliver messages, face-to-face, to friends.
This is one of a tranche of emerging products that underline a growing consumer desire for the unpredictable. This suggests that the algorithm-powered personalisation that is coming to form the core of our digital marketing ecosystem is out of step with consumers’ wants.
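In recommender-systems terms, engineering serendipity of the kind O2 describes is often approximated by epsilon-greedy exploration: most of the time the system serves the best personalised match, but with a small probability it serves something outside the user’s known tastes. The sketch below is purely illustrative – the item data, function name and parameters are invented for the example, not drawn from any brand’s actual system.

```python
import random

def recommend(user_top_genres, catalogue, epsilon=0.1, rng=random):
    """Pick one item: usually from the user's known tastes, but with
    probability epsilon from outside them (the 'serendipity' slot)."""
    in_profile = [i for i in catalogue if i["genre"] in user_top_genres]
    out_of_profile = [i for i in catalogue if i["genre"] not in user_top_genres]
    # Occasionally explore outside the user's profile
    if out_of_profile and rng.random() < epsilon:
        return rng.choice(out_of_profile)
    return rng.choice(in_profile or catalogue)

# Hypothetical catalogue data for illustration only
catalogue = [
    {"title": "Indie Night", "genre": "indie"},
    {"title": "Rock Arena", "genre": "rock"},
    {"title": "Dolly Parton Live", "genre": "country"},
]
pick = recommend({"indie", "rock"}, catalogue, epsilon=0.2)
```

Tuning `epsilon` is the commercial trade-off the article describes: at 0 the system is a pure echo chamber; raise it and the user sees more of the unexpected, at the cost of short-term relevance.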
However, many marketers would question the logic of usurping the recommendation algorithms that have powered the success of brands such as Amazon and Netflix. "Amazon knows what it’s doing," explains We Are Social’s Robin Grant. "If it generated more revenue by throwing in more random recommendations, it would do so. It’s a commercial decision." However, the issue becomes more interesting when you look at targeting by ecommerce platforms.
The risk is that marketers are, in effect, creating an echo chamber that reinforces consumers’ existing opinions – one that may also hold them back from splurging their cash on something more adventurous.