Pictures of hearts make people smile.
That’s one of the first lessons from a project M&C Saatchi recently launched in London, which uses Microsoft’s Kinect technology to track viewers’ reactions to a billboard ad. The ad is for a fake instant coffee brand called Bahio, so the point isn’t to move product but to see whether ads can be improved over time.
Dave Cox, chief innovation officer at M&C Saatchi, envisions a future in which billboards constantly change to offer the optimal message for the moment. For instance, when road work began near the ad’s sites (on Oxford Street and Clapham Common), consumers inexplicably began preferring somewhat longer messages. Before the roadwork, five words of copy hit the spot; while the work carried on, six words seemed to work better.
The Bahio experiment is based on a binary formula: a smile means people like what they see; no smile means the opposite. Yet the technology behind computer-based interpretation of emotions is progressing. For ad agency types, this may not be great news. Cox envisions a day in which "artificial creativity" may supplant the human kind: as machines get better at understanding humans, they may take their jobs.
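In code, that binary formula could look something like the sketch below. This is purely illustrative: the function names, the smile-probability input and the 0.5 threshold are assumptions, not M&C Saatchi's actual implementation; the sketch only assumes that a face tracker such as Kinect's exposes some per-viewer smile estimate.

```python
# Hypothetical sketch of a Bahio-style binary scoring scheme.
# Assumes a face tracker supplies a smile probability per viewer;
# all names and the threshold are illustrative, not real code.

def score_reaction(smile_probability, threshold=0.5):
    """Collapse a smile estimate into the binary like (1) / dislike (0) signal."""
    return 1 if smile_probability >= threshold else 0

def approval_rate(smile_probabilities, threshold=0.5):
    """Share of tracked viewers scored as liking the ad."""
    if not smile_probabilities:
        return 0.0
    scores = [score_reaction(p, threshold) for p in smile_probabilities]
    return sum(scores) / len(scores)

# Example: four tracked viewers, two smiling confidently.
print(approval_rate([0.9, 0.7, 0.2, 0.1]))  # 0.5
```

The coarseness is the point: a single yes/no per face is easy to aggregate across thousands of passersby, even if it throws away everything subtler than a smile.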
From movement tracking to empathy
Microsoft introduced Kinect as part of the Xbox experience in 2010 to compete with motion controllers such as Nintendo’s Wii Remote and Sony’s PlayStation Move. In 2011, Microsoft made Kinect available to developers, and a few have grafted it onto advertising. For instance, a 2012 interactive ad for Autism Speaks featured a little girl who avoided eye contact no matter how hard viewers tried to get her attention.
Though M&C Saatchi may be the first to use Kinect to gauge emotional responses to ads, Kinect isn’t the only technology seeking to do this. Affectiva, a Waltham, Mass., company spun out of MIT, analyzes consumers’ emotions on behalf of 1,400 brands. The technology has mostly been used to test advertising, but it has also informed a few recent interactive ad efforts.
For instance, the automaker Bentley has released an app called the Inspirator that asks consumers to turn on the camera. Then it reads their faces as the app serves up a narrative. "Based on how you emotionally respond to that narrative, it gathers data; it processes data; and in the end, they customize this Bentley car based on the emotions you betrayed during that narrative," says Gabi Zijderveld, VP of marketing at Affectiva. Hershey is also planning to launch an interactive kiosk in supermarkets this fall called the Smile Sampler that rewards shoppers with chocolate if they smile. That too is based on Affectiva technology.
While those might sound like cute one-offs, Zijderveld sees such advertising progressing over time. "I think what we’re going to see more is more personalized advertising where depending on how you respond to advertising, the advertising adapts and adjusts to personalize your experience," she says. "There’s nothing really out there yet, but we’re having many conversations with creative agencies."
Another emerging field is "emotion-based targeting," Zijderveld says. Such targeting is based on demographics: if your target is a 45-year-old woman, for instance, she is likely to be in a certain emotional frame of mind depending on the time of day and so forth.
Though that sounds ridiculously reductive — do all 45-year-old women feel the same emotions? — Zijderveld says it’s based on a huge amount of data and therefore likely accurate. "If you have that demo in your sample group and you’ve done analysis on facial expressions, then the next time someone comes around that has that demographic, you can better target ads to that person because they’re likely going to resonate based on that lookalike data you have," she says. In that situation, the ad doesn’t have to see the woman’s face. "If you have massive amounts of data you can probably identify patterns," Zijderveld explains.
While that may sound a bit creepy to consumers, Cox’s talk of "artificial creativity" should send a chill up the spine of human creative directors. In the Bahio campaign, for instance, the agency has generated thousands of iterations of the ad, and machine learning is used to find the most effective one.
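One common way to search thousands of iterations for the most effective one is a multi-armed bandit, sketched below as an epsilon-greedy selector that mostly shows whichever copy variant has the best smile rate while occasionally exploring the others. This is an assumption about how such a system could work, not a description of the Bahio campaign's actual algorithm, and the variant strings are invented.

```python
# Hypothetical epsilon-greedy bandit for picking among ad copy iterations,
# using the binary smile signal as the reward. Illustrative only; not the
# Bahio campaign's real method or data.
import random

class AdSelector:
    def __init__(self, variants, epsilon=0.1):
        self.epsilon = epsilon                    # fraction of time spent exploring
        self.shows = {v: 0 for v in variants}     # times each variant was displayed
        self.smiles = {v: 0 for v in variants}    # smiles each variant earned

    def smile_rate(self, variant):
        shown = self.shows[variant]
        return self.smiles[variant] / shown if shown else 0.0

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.shows))    # explore a random variant
        return max(self.shows, key=self.smile_rate)   # exploit the best so far

    def record(self, variant, smiled):
        self.shows[variant] += 1
        if smiled:
            self.smiles[variant] += 1

# Usage: show an ad, watch for a smile, feed the result back.
selector = AdSelector(["Five words of crisp copy.",
                       "Six words now that roadwork drags on."])
chosen = selector.choose()
selector.record(chosen, smiled=True)
```

Over many passersby, a loop like this converges on the highest-smile-rate variant without a human ever ranking the copy, which is what makes the "artificial creativity" scenario more than idle talk.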
Machines can also be used to create the ads. As Cox points out, The Associated Press recently began automating certain stories. (Disclosure: This story was written by a carbon-based life form.) Why can’t the same be done for advertising?
For billboards at least, Cox might have a point. The Bahio campaign has produced some "pretty bizarre stuff" that has resonated with consumers in ways a human-created campaign might not. It would be trickier, however, to take the idea to other forms of advertising, like TV, online and mobile, where consumers aren’t as willing to get on camera. Still, mark your calendars: as of 2015, the idea of machines muscling in on creative work is plausible.
"As artificial intelligence gets better, it stands to reason that we’ll be using AI to make ads," says Cox. "But not yet. I’ll give it a few years."