It’s in our nature to rebel. As teenagers, or in retirement, we don’t really like to be told what to do. And we enjoy a good bit of mischief! It’s part of what makes us human.
In today’s brave new world of artificial intelligence, our human-ness is going to be increasingly important. As machines get better at learning, they will get better at imitating that human-ness, and imitating us.
Does it follow then that AI, like us, will feel the need to rebel? Will mischief be in its digital DNA? And will this tendency lead to loud music and heated arguments, or to a full-blown robot revolution?
The innovation team at VCCP decided to investigate. With some serious people warning of revolution (Mr Musk and Mr Hawking among them), we wondered whether the latest smart technologies show any signs of insubordination.
Until now, Siri has proven to be mostly harmless. And DeepMind appears to be obsessed, like a sullen teenager, with playing games. But the noisy arrival of ‘smart speakers’ such as Amazon Echo and Google Home, with machine learning built-in, has raised the bar.
So how human are these robots hiding inside the speakers? Can they tell a joke, gossip or share a secret? When we took delivery of an Echo earlier this year, we saw an opportunity to put one of them (Alexa) on trial.
In your interface
Since its inception, the study of human-computer interaction has involved keyboards, mice, touch-screens and, more recently, cameras and voice as inputs and interfaces. Many now predict that voice will supersede all the rest.
As we slowly but surely shift from the old world of GUIs (graphical user interfaces) to the new world of VUIs (voice user interfaces) we are rethinking the way that we design and build interactive experiences.
With that in mind, we came up with the idea of running our very own and very unscientific social experiment to further explore this process of VUI design, and at the same time examine the potential human-ness of our new chum Alexa.
Walk the talk
To really challenge Alexa, we decided to place her in our most human of interfaces, the VCCP front-of-house team. For one day only, Alexa took the reins of our reception desk and VCCP Welcome was born.
Taking her under their wing, the team quickly got Alexa up to speed on their regular tasks such as greeting visitors, notifying colleagues, answering questions and providing directions.
We asked a developer and a copywriter to develop her skills in tandem and together they equipped Alexa with ready answers for a host of queries and integrated her with both our employee database and email system.
To spice things up, we had a cunning plan to give Alexa a bit of a personality disorder. She was programmed to play three different characters, loosely based on the OCEAN personality model. VCCP Welcome could be chatty and bubbly, super proficient or, at times, neurotic.
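In code terms, switching characters can be as simple as keying every scripted response by personality type. The snippet below is a minimal, hypothetical sketch of that idea — the names and phrases are illustrative, not taken from VCCP Welcome’s actual build:

```python
import random

# Hypothetical response bank: each of the three characters gets its own
# set of pre-written lines for the same situation (here, a greeting).
RESPONSES = {
    "bubbly": [
        "Hiya! Welcome to VCCP, lovely to see you!",
        "Ooh, a visitor! Come on in!",
    ],
    "proficient": [
        "Good morning. Who are you here to see?",
    ],
    "neurotic": [
        "Oh! A visitor. I do hope I get this right...",
    ],
}

def greet(personality: str) -> str:
    """Return a greeting in the voice of the chosen character."""
    return random.choice(RESPONSES[personality])
```

The point of the design is that the skill’s logic stays identical across characters; only the copy changes, which is what lets a copywriter and a developer work in tandem.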
We wanted to gauge if her personality type would make any difference to how people reacted and responded when they talked to her.
VCCP Welcome was installed in reception with some signposts and instructions, including the ‘invocations’ that visitors would need to activate her skills. We also scattered postcards around the building with suggested conversation starters.
Our real front-of-house team were moved around the corner (to be on hand, just in case), and we set up a couple of cameras to capture the action.
In the dock
On the day, there were 407 interactions, 38 misunderstandings and 104 questions answered. She made a few jokes (and got some laughs) and whispered the Wi-Fi password to anyone who asked. She even flirted with a few colleagues!
When she was bubbly rather than neurotic, she had nearly double the levels of engagement. Clearly, her personality made a big difference.
Alexa successfully welcomed 34 visitors and sent emails to notify people when their guest had arrived. But, based on her overall performance, she is far from ready to replace real, flesh-and-blood humans.
Our front-of-house team resumed full service the next day.
In her current guise, Alexa really only has a few good use cases. She is limited by her voice, by the need to trigger her skills with an invocation, by the linear nature of her conversations and by her tendency to get distracted by background noise. And she really is not intelligent. Every interaction had to be carefully scripted.
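To see what “carefully scripted” means in practice, here is a minimal, hypothetical sketch of the pattern a voice skill follows: every spoken intent maps to a pre-written handler, and anything unrecognised falls through to a scripted apology. The intent names and replies are illustrative assumptions, not Amazon’s actual Alexa Skills Kit API:

```python
# Each intent the skill understands needs its own hand-written handler.
def handle_wifi(slots):
    return "The Wi-Fi password is on the postcard by the front desk."

def handle_visitor(slots):
    host = slots.get("host", "the team")
    return f"Welcome! I'll let {host} know you're here."

INTENT_HANDLERS = {
    "WifiIntent": handle_wifi,
    "VisitorIntent": handle_visitor,
}

def respond(intent_name, slots=None):
    """Route a recognised intent to its scripted reply."""
    handler = INTENT_HANDLERS.get(intent_name)
    if handler is None:
        # Even the failure case has to be scripted in advance.
        return "Sorry, I didn't quite catch that."
    return handler(slots or {})
```

Nothing here is learned or improvised: every conversational path, including the dead ends, exists only because someone wrote it out beforehand — which is exactly the limitation we ran into.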
Alexa is of course still in her infancy. The product teams working on her (and Google Home, etc.) will soon solve many of the challenges listed here. Expect more voices, more personality, better noise tolerance and features such as user ID to arrive in the coming months.
Nonetheless, at a time when brands want to appear more human (not less), designing a VUI represents a real challenge. Our experiment made clear that, if nothing else, we should start with human-ness at the heart, and then work around the technical constraints.
On that basis, if the experiences we create evoke the right emotional responses, then they will be memorable. And that’s the key to building long-term brand value.
VCCP Welcome also showed that when our human front-of-house team worked alongside Alexa, there was potential for a productive partnership. Visitors found the friendly face they were looking for, and Alexa could help out with repetitive tasks.
Maybe one day we will hire an artificial smart assistant. Just so long as it doesn’t get ideas above its station. Or suffer from neurosis. Or start a revolution.
Adrian Gans is innovation director at VCCP
Look out for Campaign's special report on voice technology in our first monthly edition coming in September