
At Tech’s Biggest Show, UX (Mostly) Takes a Back Seat

There's no better place to observe an entire industry searching for the next great idea than the behemoth Consumer Electronics Show (CES) in Las Vegas, where thousands of products at various stages of market readiness are trotted out for journalists, customers, tech industry folk, and curious locals.


A small group from ustwo -- Tim Smith (Auto & Mobility), David Fisher (Design), and Justin Pike (Business Strategy) -- traveled to CES to hone ustwo’s perspective on the trajectory of tech. Despite potential economic headwinds in 2019, there was palpable optimism at CES about how immersive technology, AI, and connected experiences will shape mainstream sectors like entertainment and automotive.

Amid the endless array of dazzling technology demos, we noticed a surprising lack of attention to user experience: cohesive ideas for how people will interact with these new products now and in the future. The UX layer is what shapes an early-stage product concept or prototype into a truly meaningful product or experience that can change lives.

ustwobies Dave Fisher, Tim Smith, and Justin Pike at CES 2019

Immersive Experiences Work Out the Kinks

For the past few years, AR/VR technology has been ascendant at CES, and there is a general sense that the dawn of immersive computing has arrived. But some observers were disappointed: Mae Anderson described a subdued year for immersive tech held back by awkward hardware and uninspiring software.

According to the VR analyst firm SuperData, investment in consumer VR software declined 59 percent last year, perhaps indicating a perception that consumer hardware isn’t yet where it needs to be. It may be that the industry needs a few more years to create a truly mainstream immersive computing experience; advancements in hardware form factors, wiring, and machine learning software will all be critical for the general population to embrace VR.

Nonetheless, at CES 2019 there were many companies making strong efforts to push the boundaries of both VR and AR -- from new features like eye-tracking (HTC Vive Pro Eye), to sleek, new, mixed reality hardware that you might actually enjoy wearing (Nreal), to the resurgence of holograms (the Holobot holographic digital assistant), to conceptual experiences like “Rocket’s Rescue Run” (an in-car VR demo by Disney, Oculus, Audi, and the new Audi spinoff Holoride). Disney was also behind The Void, a location-based VR experience that combined advanced VR backpacks and hand-tracking sensors with old-school fans and heaters arranged throughout a room, amplifying the illusion of an alternative reality.

Nreal’s mixed reality smart glasses (VentureBeat)

These were all impressive demonstrations, but in the rush to showcase emerging technologies, CES also reveals what the industry lacks: the ability to evolve and test new UX paradigms that keep pace with the changing tech landscape, especially in areas like auto and mobility that are being transformed by immersive technology, automation, and the Internet of Things all at once (more on that later). Without that holistic perspective, firms create experiences tailored to hardware improvements rather than human use cases -- or they skip ahead to interesting brand partnerships and technology mash-ups.

Moreover, we share Marc Andreessen’s view that the tech industry has an early-adopter’s blind spot when it comes to VR technology specifically. Because it’s been in development for so long, we in the tech industry take for granted the seismic impact VR will have on everyday people, particularly underserved and disenfranchised populations who have traditionally lacked access to new products and services. Because of immersive tech’s outsized potential to change lives, we need more designers and engineers who know how to shape immersive experiences for many different types of people. Building these experiences well for diverse populations takes a significant investment in the time and thinking required to create brand-new interaction and design paradigms. Most of that work today is being done in niche fields like gaming. And while gaming has the framework to create worlds, games do not reflect most people's reality or the expansive potential of the technology beyond entertainment.

Scorpion, a mounted machine gun with a 3D gaming experience, made by VR Leo USA for the virtual reality game Black Shield

Autonomous Vehicles: Back with a Vengeance

It’s important to remember that technologies fall in and out of fashion. Autonomous vehicles have been on the conference circuit since Norman Bel Geddes designed the General Motors Pavilion, known as Futurama, for the 1939 New York World's Fair. AVs were back in a big way at CES 2019, but if past is prologue, only the technologies that show they can make a meaningful difference in our lives will stick around.

Mercedes-Benz's Vision URBANETIC concept van comes with a companion smartphone app that generates an augmented reality version of the route

At CES, exhibitors showed compelling new component technologies for AVs -- including sensors (eye-tracking, LIDAR, biometrics) and immersive tech. These will be useful both for training the machines and for creating production-ready consumer experiences inside and outside the car. Yet automotive and mobility is another area where the convergent nature of new technologies creates a unique set of UX challenges. Suppliers who provide the computing layer for modern cars are quickly developing new business models and creating the foundational technologies and APIs for automotive manufacturers to incorporate into vehicles. However, there is a massive gap between making the processing power and data available and translating all of that into an experience drivers and passengers will actually value.

There is tremendous opportunity for design to act as a bridge between technology suppliers and automotive OEMs as hardware and data sources race ahead of use cases. How do a car company and multiple tech firms work together to develop in-car features that are actually compelling for drivers and passengers? And who ultimately owns the data? These are questions that designers and engineers must answer together.

Furthermore, when tech companies and auto manufacturers partner up well, what practical features, smooth interactions, and moments of magic can they pioneer that are unique to the in-car space so that the auto industry can win over the mainstream? At CES the new “UX in the car” was presented mainly as a solution to the boredom of driving, rather than looking at the wide spectrum of human needs. We think the UX for driverless vehicles, and for assistive technology in and around the vehicle, is an incredibly fertile area for development. (It’s one reason we published a book on UX design for driverless cars, which you can download for free here).

Bright spots for “UX in the car” at CES 2019 included Here Mobility, a “social mobility” app for ridesharing, and Byton’s M-Byte, the electric SUV that made its debut at the show. Both exemplify a UX framework that looks at the broader spectrum of human needs and the practicalities of future journeys.

In Byton’s case, the M-Byte vehicle itself took a backseat to the human-machine interface. As their promotional video states, “At Byton, we believe that the digital user experience will be the number one deciding factor for our customers.” Their focus was almost entirely on the UX of using the car, not only within the vehicle but as you enter and leave it, through connected experiences and a more holistic view of people’s mobility needs.

Byton’s take on the in-car human-machine interface

Byton even went so far as to showcase the UX design principles they are establishing around gestural interactions in the car -- taking the interaction method one step further than other products like Mercedes’ MBUX. And of course, their in-car voice assistant shouted loudest of all: "BYTON. It is not a car, it is the next-generation smart device." You can tell we’re fans.

AI Assistants: The Elephant in the Room

CES 2019 may have been the biggest yet for AI assistants, with Google, Amazon, and Apple engaged in a pitched battle, and everyone else racing to show Internet-connected products that support all three platforms. But one of the key problems with AI assistants is that people still have a lot of trouble interacting with them. Users are forced to learn “a new language” or speaking cadence, and if you go off script, the experience breaks. It's terrific to see the investment in integrating voice UI into so many products, but these systems won't be truly useful until they can interact with people in more natural language. And when it comes to a more natural interaction with an AI, the devil is in the data. Personalization -- a deep understanding of the user -- is key to innovating in AI assistants, and that means data privacy and security should be addressed alongside all of these concepts.

Google Assistant billboard outside the Las Vegas Convention Center (Las Vegas Review-Journal)

Mashable’s Pete Pachal pointed out that personal data was the elephant in the room at CES, with only three out of 300+ sessions focused on privacy, and few if any companies talking about how their products handled sensitive personal data. A standout exception was Apple’s privacy ad outside the Las Vegas Convention Center, a clever bit of consumer marketing and a gesture to the company’s peers and competitors in the industry. In its boldness, the ad exposed how invisible the privacy topic was everywhere else.

Apple privacy billboard outside the Las Vegas Convention Center (9to5Mac)

Conclusion

CES is less a finished product showcase than a public, living lab. And a big part of that lab is figuring out what gets people talking -- what new ideas get the most momentum. Many of the unfinished technologies on display at CES could one day converge into the next hit product. The New York Times’ Kara Swisher likens the hit-making process to an intricate “dance,” combining that spark of genius, money, and timing. We would add UX design to that punch list. When firms begin with the user experience in mind -- and put a multi-disciplinary team of designers and engineers to the task -- what they’re really doing is creating a narrative around the product that will draw the user in and unlock new potential. Storytelling is the heart of the dance.

It’s worth remembering that whenever the great sparks of the past arrived -- whether it was the IBM mainframe, the Windows PC, or the iPhone -- they worked because they told an optimistic story about the future. These products promised to bring meaning to our lives and to bring us closer together, and in many ways they delivered on those promises. We’re invigorated by our trip to CES and excited to explore how UX design can bridge the gap from concepts and experiments to the next great wave of product innovation.