I’ve always been sceptical of futurist predictions, like the provocative statement above. How can anyone predict the future? And if they could, why the hell would they tell us all what was going to happen?
After listening to the exceptionally engaging Amy Webb at SXSW, I realized that predicting the future isn’t really what futurists do. What futurists actually do is help groups of people work through a highly structured process of coming up with as many hypothetical futures as they can, in order to prepare for more or less anything.
And the reason Amy Webb is so forthcoming about sharing her future scenarios (she has made her trend research open source) is that the implications for humanity might be profound. She labels these alternative scenarios ‘optimistic, pragmatic and catastrophic’.
One such scenario is that during the next decade, we will start to transition to the next era of computing and connected devices, which we will wear and command using our voices, gestures and touch.
The transition from smartphones to smart wearables — earbuds with biometric sensors and speakers; rings and bracelets that sense motion; smart glasses that record and display information — will forever change how we experience the physical world. We will likely still use foldable and scrollable screens, though, for portable, longer-form reading and writing.
We can already see the early signs of this future happening today with smart wearables that promise a new kind of digital reality.
Facebook, Google, Microsoft, Baidu, Alibaba, Snap, Tencent, and Apple have all announced sizable investments in AR to weave a digital overlay into the physical world.
It’s already impressive on our mobile devices, but it will become even more immersive when the confines of the screen are removed. I know what you’re thinking: what about Google Glass? Well, things have already moved on apace since then (especially within Google). Just check out the very normal-looking Vaunt glasses from Intel, which use retinal projection to put a display in your eyeball.
Poppy Crum from Dolby updated SXSW on the state of hearables with audio-based AR, which provides a layer of audio information over the physical world. She talked about the ‘embodied user experience’, which seamlessly closes the loop between our internal and external worlds, making technology a very natural extension of the self.
Maybe Bose were listening in on Poppy’s talk from last year, as this year they debuted their prototype of audio AR. The company plans to ship 10,000 of its Bose AR glasses to developers and manufacturers this summer, with the intent of partnering with other eyewear companies. The glasses combine audio AR with data from embedded motion sensors and GPS from your phone: GPS detects where a user is, and a nine-axis sensor determines which direction they’re looking and moving.
All this convergence could actually be positive, if it genuinely removes the addictive screens that distract us, forcing us to look up and reconnect with each other and the world around us in ways that augment our collective experience. Now that’s something we can all be more optimistic about.
Matt Dyke is founder and chief strategy officer at AnalogFolk