What is it about robots that captures our attention, ignites our imagination and makes us think of technology in an abstract sense?
We’ve had filmmakers and storytellers craft tales of epic battles between humans and robots, reaching into the depths of what makes us us and what makes them them. We’ve seen wars of will, strength and philosophy, fought to the bitter end. Yet, despite all this exposure, a true understanding of these technologies eludes many a business leader.
The reason the stories we hear are so far from the truth is that real robots are pretty boring. Those mechanical arms that thrash about in car-making factories don’t get up to much after the lights go off, and those noisy little vacuum cleaners, despite having been caught spying on us (of which more later), don’t seem very exciting, so we make up much more interesting stories.
Tales abound of generalised, artificial-intelligence-powered robots mooching about our human worlds, helping us out and making us laugh, before becoming self-aware and turning on us.
The good thing is that, throughout history, science has inspired art, and art has inspired science. This has never been so true as it is for robots. The early days of robotics inspired schools of thought and set in motion technological development that persists today.
The first robots that really made an impact on business were industrial ones. The Unimate, launched in the 1950s, kick-started a revolution in manufacturing. These machines typically carried out repetitive tasks that were often too dangerous, dull or dirty for humans.
By reducing reliance on humans, companies were able to increase output and reduce accidents. Since then, these robots have been growing both in intelligence and capability.
Proliferation across the commercial world continues: spending hit $71bn in 2015 and is set to grow to $135bn by 2019, according to market intelligence company IDC. A recent study found that robotics investment makes a significant contribution to national growth, accounting for up to 10% of growth in GDP per capita in OECD countries.
AI has paved the way for robots to enter more areas of the world of work, powering the machine vision and autonomy that let robots operate in complex, unpredictable environments. One Chinese postal warehouse, for example, can now sort more than 200,000 parcels a day, thanks to its army of robots.
Each year, at the Consumer Electronics Show in Las Vegas, a range of robots is paraded, some more useful than others. At this year’s event, a $16,000 (£11,620) clothes-folding robot, built in partnership with Panasonic by Japanese start-up Seven Dreamers, was showcased. The machine uses AI and image processing to match the garment to be folded against an expansive database of items to determine the best way to hold and fold it. Yet, as it stands, the product is insanely expensive, takes hours to fold a single garment and gets very confused when presented with dark clothing, so I suspect it’s not going to be popping up in too many Surrey pads just yet.
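Matching a garment against a database of known items is, at heart, a nearest-neighbour search over image features. The sketch below illustrates the general idea in Python; the feature vectors and garment names are invented for illustration, not taken from Seven Dreamers’ actual system.

```python
import math

# Hypothetical database of garments, each described by a small feature
# vector (in a real system these would come from image processing).
GARMENTS = {
    "t-shirt":  [0.9, 0.1, 0.2],
    "trousers": [0.1, 0.8, 0.3],
    "towel":    [0.2, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def closest_garment(features):
    """Return the database item whose features best match the observation."""
    return max(GARMENTS, key=lambda name: cosine(features, GARMENTS[name]))

print(closest_garment([0.85, 0.15, 0.25]))  # t-shirt
```

Once the best match is found, the machine can look up the folding routine stored for that item; the hard part in practice is producing reliable features from crumpled or dark fabric, which is exactly where the real product struggles.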
The company appears to be taking a leaf out of Tesla’s book by launching an expensive, early version and hoping to follow it up with a more affordable, mass-market product. It could be smart money for Panasonic, though, because there has been significant press coverage of the robot following its unveiling at CES.
Still, I have trouble actually defining this as a "robot". For me, it’s a machine, in the same way that my dishwasher is a machine. Is this because it is unable to move, has only a limited set of uses and no autonomy? Real robots, perhaps, are the ones that look like us.
The human touch
Ever since the Tin Man from The Wonderful Wizard of Oz reached our consciousness early in the 20th century, we’ve been fascinated by creating mechanical likenesses of our species.
In 1928, British inventor WH Richards built a robot named Eric from a suit of armour, and used simple electromagnets and a 12V motor to enable movement. Since then, the technology has continued to evolve, giving rise to some rather lifelike incarnations.
During the SXSW conference in Austin last year, Dr Hiroshi Ishiguro, director of the Intelligent Robotics Laboratory at Japan’s Osaka University, showed off his conversational AI robots. His aim is to create robots that mimic the physical appearance, mannerisms and behaviour of humans so much so that they overcome the "uncanny valley" – the phenomenon that inspires an eerie, cold revulsion in humans when robots look almost, but not exactly, lifelike.
‘The way things are going, it seems robots may become co-workers rather than overlords’
Sophia from Hanson Robotics is another key example in the field, and was the cover girl for fashion magazine Stylist’s 400th issue, published in January, which focused on a fusion of robots and pop culture. Sophia gained global fame when Saudi Arabia granted her citizenship at the Future Investment Initiative summit, making her the first non-human to obtain such status. Clearly, this was a classic publicity stunt, designed to associate Saudi Arabia with technology, innovation and a post-oil future. It also attracted some controversy, given how few rights are given to Saudi Arabian women, not to mention the country’s poor human-rights record in general.
Growing in maturity
Setting these attention-grabbing robots aside, though, how can brands build on this fast-moving area of technological development and social change? It is possible to map a brand’s involvement on a maturity scale, illustrating how far the brand has taken its exploration into the world of robotics.
1. Marketing and experimentation
Typically, brands start by dipping their toes into the water with a PR stunt involving robots, experimenting with something that plausibly could be an area to which robots will one day contribute. Take, for example, Just Eat and its Starship Technologies-built delivery robots. In August 2017, one of the devices dropped off Just Eat’s 1,000th robot-delivered meal. So yes, it was more than a one-off. However, compared with the £49.5m-worth of orders that Just Eat processes in six months, it represents only a tiny fraction of the business.
Or consider McCann Japan’s AI creative director. While it attracted plenty of press coverage at its launch in 2016, we haven’t seen much output from it since, and certainly not many Cannes Lions awards.
What these examples do provide, beyond the PR value, is a deeper understanding of two things. First, how will our customers, competition and observers react to the idea that we’re starting to work with robots? And second, what’s it like to work with robots and what can they actually do?
2. Productivity enhancement
At this stage, a brand is able to apply robotics to its operations at scale and earn an increase in productivity. Industrial robots fall into this category, often in manufacturing contexts, as mentioned earlier in the piece.
Having acquired Kiva, the robotics company, Amazon now operates more than 80,000 robots in its fulfilment centres globally. However, this has not slowed the expansion of its human headcount, which was approaching 500,000 staff in 2017.
There are also some interesting examples beyond manufacturing and heavier operations. For some years, Nasa has been developing Robonaut in a joint venture with General Motors. This humanoid robot takes the shape of its astronaut colleagues so that it can operate in the same spaces and use the same tools.
Robonaut is capable of doing many tasks that a human can do, both inside and outside the International Space Station. Yet it transpires the astronauts are very protective of their role, so Robonaut ends up doing the stuff the humans don’t want to do, such as cleaning bodily fluids from internal surfaces. I learned this during a recent trip to Nasa in Houston, as part of a project my agency is involved in.
Back here on Earth, Ocado is developing a collaborative robot (cobot), ARMAR-6, as part of the SecondHands project, with support from four European universities and the EU’s Horizon 2020 programme.
The grocery-delivery company operates several big warehouses, such as its Dordon site, which covers an area of 90,000m², housing 8,000 crates that can be moved about on 35km of conveyor belts. Maintenance is a big job in a warehouse of this scale, and the technicians need support.
The idea of SecondHands is that the cobot can understand what the technician is trying to do and provide support: holding a panel, say, or passing a tool in response to a voice command, or even contributing proactively.
These types of collaborative robots are set to be one of the largest growth areas in industrial robotics. Jürgen von Hollen, president of Universal Robots, expects the market to grow by 60%-75% in the next year, reaching $2bn by 2020.
So, the way things are going, it seems robots may become co-workers rather than overlords.
3. Core service delivery
The final stage of maturity is when a brand is able to provide a service to end users directly via robots. Few brands have reached this stage, but there are early adopters.
A quarter of Japan’s population is over the age of 65, and, as the country is a leader in robotics, it is no surprise that a growing number of robots participate in the care of the elderly. One example is Robear, designed to help lift patients off their bed and into a wheelchair.
In the UK, Southend-on-Sea borough council is exploring working with Pepper the robot. Initially, the council plans to use Pepper for social care, including community engagement and awareness-raising of services, as well as to facilitate reminiscence activities for the elderly.
Pepper, a robot designed to understand human emotions, is being tested in various frontline service roles. Its maker, SoftBank, claims that in one pilot a US tech retailer recorded a 70% increase in foot traffic, while another store reported a 13% increase in revenue and a six-fold increase in sales of a featured product.
At this stage, it’s quite possible that these numbers are due to the novelty of the use of the robot, as opposed to a meaningful business case. Certainly, my own experience of working with Pepper on a research project is that the promise does not quite match the reality.
Last year, Walmart announced that it had begun using robots to scan shelves at 50 locations across the US – checking inventory, prices and misplaced items. The goal is to save employees from carrying out tasks that are repeatable, predictable and manual.
Over time, it is reasonable to assume more brands will experiment with how they can deliver their core services to end consumers. Perhaps the most immediate route in for brands is the voice-based home assistant. The recent rise in popularity of home assistants such as Google Home and Amazon Alexa has made it normal for us to have an AI-powered machine in the home that we talk to daily.
As the range of services that brands offer through these devices expands, they may well pave the way for physical machines capable of moving about our living spaces.
Speaking of the home, devices to ease the burden of housework have been populating our homes for some time. The Roomba vacuum cleaner from iRobot was one of the first; now Ecovacs, a cleaning-robotics company, aims to have a robot in every home. However, iRobot has also illustrated how easily consumer trust can be damaged; last year, it emerged that its devices had been collecting data on the homes of their owners.
As we empower machines to become representatives of a brand, there is a risk that the brand will be compromised if the machine fails. Brands have learned this the hard way through mistakes on social media: remember Microsoft’s Holocaust-denying chatbot Tay? Much has been learned since those dark days of early 2016.
Smart brands have grasped how to embed their character into chatbots. The next stage is to apply this thinking to the physical manifestations of brands in robots. Interacting robotically with a brand such as The North Face in your home might mean, for example, a robot channelling a pre-holiday rock-climbing masterclass from the climber Renan Ozturk.
A question of ethics
As with any foray into a technology that has an impact on society, there are ethical considerations to keep in mind. Science-fiction writer Isaac Asimov’s three laws of robotics have provided a basis for many a book, film and debate on AI ethics.
• A robot may not injure a human being or, through inaction, allow a human being to come to harm.
• A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
• A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
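The laws form a strict hierarchy: each one applies only insofar as it doesn’t conflict with those above it. Purely as an illustration, that ordered veto logic can be sketched in a few lines of Python – the boolean fields describing an action are invented for this example.

```python
def permitted(action: dict) -> bool:
    """Check a proposed action against the three laws in priority order.
    The dictionary keys are hypothetical, chosen for illustration only."""
    # First Law: no harm to humans, through action or inaction.
    if action.get("harms_human") or action.get("inaction_harms_human"):
        return False
    # Second Law: obey human orders, except where they conflict with the
    # First Law (such orders were already vetoed above).
    if action.get("disobeys_order"):
        return False
    # Third Law: self-preservation, subordinate to the first two laws.
    if action.get("destroys_self") and not (
        action.get("ordered") or action.get("protects_human")
    ):
        return False
    return True

# A robot may sacrifice itself to protect a human...
print(permitted({"destroys_self": True, "protects_human": True}))  # True
# ...but may never harm one, even when ordered to.
print(permitted({"harms_human": True, "ordered": True}))           # False
```

The point of the sketch is the ordering: a lower law never overrides a higher one, which is exactly the property the debates around these laws tend to probe.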
However, the ethics get more complex as robots pervade the darker areas of the human psyche. It’s hard to ignore the headline-hitting rise of sex robots, in both male and female form. When businesses such as robot brothels open, the real debate begins. As humanoid robots become more lifelike, questions arise about desensitisation to sexual violence. And what happens if these robots ever obtain AI capable of consciousness? Should they be eligible for citizenship? How would their rights be protected?
Perhaps the answer lies in getting the next generation of robot creators off to an early start. Misty Robotics, a spin-off from Sphero – the team behind the lovable BB-8 Star Wars toy – has launched a robot designed to be programmed by people who have never worked with robots.
One thing is for certain: robots are here to stay and you’re likely to see more of them. So it’s time to start experimenting and learning how to work with them.
By David Caygill, managing director, innovation and ventures, Iris Worldwide