This is part one of three in our long-read, deep-dive series into the relationship between UX design and the growth of immersive technology, created and authored by Threesixty Reality. We hear from leading experts in the field on the crucial UX improvements that need to be made for virtual and augmented reality to be widely adopted.
XR promises more intuitive and natural ways of interacting with information. In both VR and AR, we can use our hands, gaze, gestures and voice to directly interact with content and manipulate virtual objects, but perhaps we’ve been led to assume that immersive tech is already usable out of the box. Today, we consistently find evidence that the UX is still in its infancy and that UX design is not getting enough focus in the immersive technology space. AIXR recently covered accessibility issues with UX design; check that article out here.
At Threesixty Reality we have an immersive tech usability lab, where we test a wide range of AR and VR applications with target users. We repeatedly see most users struggle to use these devices effectively, and even mastering the basic controls can take some practice. Let’s not forget that humans have been trained for the last 20 years or so to interact with flat 2D menus on screens, with their finger or with a mouse. The transition to immersive tech isn’t automatic. Most of the interactions are new, and UX design conventions for virtual and augmented reality are only just starting to emerge.
Ryan Gerber, Systems-level UX and product design at Vive VR, points out: “Even though we’ve begun to see enterprise readily adopt XR solutions, largely due to a quantifiable return on investment, the much more massive market of normal humans is still largely skeptical around this technology’s ease of adoptability.”
In fact, a survey of 140 industry professionals in 2018 by global law firm Perkins Coie found that the top-rated barrier to both VR and AR adoption was poor UX, for the second year in a row. Although in many cases these UX barriers relate to setting up the hardware, we also need to pay attention to the many challenges around how best to design immersive software and new UI paradigms that are easier to adopt and provide a sense of familiarity as users move from application to application. Vik Parthiban, XR graduate researcher at MIT Media Lab, highlights the need to prioritise interaction design for XR: “People underestimate the importance of interaction in AR and VR.”
In the XR industry, we tend to focus a lot on the immersiveness, the compelling 3D world, the sense of presence, low-latency tracking that makes you think those are your real hands, the spatial audio, the 3D holograms that seem to obey the laws of physics. In other words, there is a fascination with the potential of the technology and what it can do, and less attention to the step-by-step journey a human will go through to actually get things done and interact with the system effectively.
Ever-improving presence and graphical realism are qualities that make users say “wow!” the first time they experience modern immersive tech, but what we find in user research is that the vast majority of issues occur when the user tries to interact with objects and expects the same level of return for their efforts as they would get from a usable mobile application. These issues are often severe enough that the user becomes frustrated and confused and quickly loses the motivation to continue. We hear comments like “I couldn’t get it to do what I was trying to do” or “I could have done this faster on my phone” all the time.
Ryan Gerber states: “People want the experience as polished as they would expect from any premium smartphone brand. We’re in the Palm Pilot era of XR, the iPhone moment has yet to be realized.” He goes on to suggest that the hype around XR can also become a hindrance when it comes to UX: “folks looking to jump into this new technology come with a number of assumptions and often unspoken desires or even dreams for what I call ‘the promise of this technology’…for users this translates as perhaps coming to VR or AR for the first time, excited and thrilled by all the hype they’ve read and heard from their enthusiast friend, and then they try it and it’s so alien to them.”
“People want the experience as polished as they would expect from any premium smartphone brand. We’re in the Palm Pilot era of XR, the iPhone moment has yet to be realized.” – Ryan Gerber, Systems-level UX and Product Design at Vive VR
Christophe Tauziet, a former Senior Product Designer for Social VR at Facebook echoes these thoughts, “Designing for VR is quite a challenge, mostly because of how immersive and unfamiliar the experience is to people who jump in for the first time. We’re used to technologies in which the experience is contained on a screen, and we interact either through a form of controller (keyboard, mouse, gamepad, remote, etc) or directly with the screen (smartphones, tablets, etc). With VR, you’re at the core of the experience and you directly interact with the World around you. Getting people accustomed to that can be a challenge.”
There are areas of UX design where we repeatedly see issues during usability studies that can potentially stop a user from progressing. Together with contributions from leading experts on UX in the immersive technology industry, we’ll run through some of the emerging themes and challenges in this three-part series. The list of challenges is by no means exhaustive but nonetheless helps to illustrate the importance of adopting rigorous UX design processes in the development of XR applications and experiences. Good UX does not simply emerge on its own; it requires a great deal of effort, analysis, user research, iteration and ingenuity to get right. The solutions that work best are often not obvious up front.
So let’s get started by looking at the first four UX challenges facing immersive technology and how we should shape UX design going forward with these challenges in mind:
1. Communicating interactive affordances
Users don’t always understand what is and isn’t interactive. We see users trying to interact with everything or missing the items they are supposed to interact with. How should the UI indicate what is active and what is available for an interaction?
Christophe Tauziet states, “The big challenge is to create an interface for people to interact with their virtual experience that’s as familiar and intuitive as possible, despite being an entirely new paradigm and medium”.
2. Selecting an area of focus for interactions
A user needs to be able to indicate which item in the interface they want to start interacting with. A specific challenge here is how the system determines which application, window or field should be the main focus within a user interface that can have multiple layers open at once and several interactive panels arranged in 360 degrees around the user. Up until now this has been done with pointer tools such as gaze or a raycast, but smarter and more efficient solutions are emerging.
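To make the raycast approach concrete, here is a minimal sketch of gaze/pointer focus selection: a ray is cast from the user and the nearest panel whose bounds it passes through wins focus. This is an illustration only, not any vendor’s actual implementation; the `Panel` type and `raycast_focus` function are hypothetical names, and panel bounds are crudely approximated as spheres.

```python
from dataclasses import dataclass
import math

@dataclass
class Panel:
    name: str
    center: tuple  # (x, y, z) position in metres
    radius: float  # interactive bounds, approximated as a sphere

def raycast_focus(origin, direction, panels):
    """Return the panel nearest along the gaze/pointer ray, or None.

    `direction` is assumed to be unit length, so the projection t
    is the distance along the ray; the closest hit wins focus."""
    best, best_t = None, math.inf
    for p in panels:
        # Vector from ray origin to panel centre
        oc = tuple(c - o for c, o in zip(p.center, origin))
        t = sum(a * b for a, b in zip(oc, direction))  # projection onto ray
        if t < 0:
            continue  # panel is behind the user
        # Closest point on the ray to the panel centre
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        dist2 = sum((a - b) ** 2 for a, b in zip(closest, p.center))
        if dist2 <= p.radius ** 2 and t < best_t:
            best, best_t = p, t
    return best

# Two panels arranged in front of and beside the user
panels = [
    Panel("browser", (0.0, 0.0, 2.0), 0.5),
    Panel("chat",    (1.5, 0.0, 1.5), 0.5),
]
focused = raycast_focus((0, 0, 0), (0, 0, 1), panels)  # looking straight ahead
```

A real system would add hysteresis and snapping so focus does not flicker between neighbouring panels, which is part of the cursor-behaviour work described below.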
This has been a focus of UX work at some of the leading players in the industry. Tim Stutts, interaction design lead at Magic Leap, explains: “the interplay between finer focus of a cursor controlled via touchpad or 6DoF pointer, and coarser focus afforded by head orientation or other behavior to determine the application in focus, are initial discovery points for any users starting out using an AR / VR device. Layering of UI, angling and occlusion of applications and objects add to the input challenge of shifting focus between one area and another.”
Ryan Gerber shares his reaction to innovations at HTC in this area: “Vive released the Vive Pro Eye (a headset with eye tracking capability) and it’s spooky what a designer can begin to do with this technology— not even as a standalone input but paired to heighten the accuracy of every other, it can lend the system an almost mind-reading responsiveness in terms of how intuitive it becomes to human input.”
A great deal of UX engineering and testing goes into getting the interaction right just for a single element such as the cursor behaviour in different scenarios. Stutts provides a small glimpse of the complexity involved, “It’s important when designing an AR/VR OS to invest a lot of time and effort in conceptualizing, prototyping and user testing cursor behavior. Dense grouping of buttons, such as virtual keyboards are excellent testing grounds for these ideas, since they experience a heavy amount of cursor traffic, but sparse groupings should not be neglected either, since inevitably with 3D spatial interactions there are going to be vast and sometimes awkward gaps between neighboring UI elements that a user needs to traverse. Haptics and spatial audio provide a multi-sensory feedback layer that can also serve to augment performance.”
Adrian Leu, CCO of Emteq, a start-up that allows VR users to interact with content using non-verbal communication signals such as facial expressions, heart rate, and posture, explains what’s already becoming possible, “Emteq’s hardware and software is integrated into VR headsets and can detect a breadth of emotional and biometric data from the user in real time. Not only can we collect these emotional triggers, but we can use them as an input source that can affect the content that the user experiences. For example detecting negative emotions from the user could trigger support prompts in a complex procedure or reduce the difficulty in a game. Such signals can also be used as navigation prompts through the experience.” These innovations are redefining the way we interact with computers by increasing the sense of presence and combining spatial and affective computing.
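As a rough illustration of the kind of affect-driven adaptation Leu describes, a system might map emotional and biometric signals onto content decisions such as easing difficulty or surfacing help. This sketch is entirely hypothetical: the function name, thresholds and signal ranges are assumptions for illustration, not Emteq’s actual API or logic.

```python
def adapt_experience(valence, heart_rate, resting_hr, difficulty):
    """Adjust difficulty and trigger support prompts from affect signals.

    valence: -1.0 (negative) .. 1.0 (positive), e.g. derived from
    facial expression; heart rates are in beats per minute.
    Returns (new_difficulty, show_support_prompt)."""
    # Hypothetical stress heuristic: strongly negative expression,
    # or heart rate well above the user's resting baseline.
    stressed = valence < -0.3 or heart_rate > resting_hr * 1.25
    if stressed and difficulty > 1:
        return difficulty - 1, True   # ease off and offer help
    if valence > 0.5 and heart_rate < resting_hr * 1.1:
        return difficulty + 1, False  # user is comfortable; raise challenge
    return difficulty, False          # no change
```

The interesting design question is in the thresholds and smoothing: raw affect signals are noisy, so a production system would average them over time before letting them steer content.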
Gestures offer another efficient way to control the UI. Vik Parthiban has created a framework for interacting in XR with hand gestures and explains the benefits: “Giving the user the ability to interact with a really large display with just their finger gives them a very powerful feeling. It gives a magical feeling of control.”
3. Direct manipulation
One area where XR is moving towards the promised experience is hand tracking and full finger articulation, a capability being realised to great effect by Leap Motion (recently acquired by UK company Ultrahaptics) and Hololens 2. It has always been an ambition for XR to allow us to use our hands naturally, and this is a step change for how humans interact with digital information, with truly game-changing potential.
Vik Parthiban explains the significance for XR, “The keyboard and mouse are great with small 2D displays. Point and click are easy in this context and users can easily perform quite advanced interactions such as drag and drop. However, beyond the desktop the mouse and keyboard stop being effective, particularly with larger displays (e.g. the 360 degree displays of XR) where this becomes inefficient and cumbersome. The immersive, first person context of XR means that users just want to reach out and grab things.”
Parthiban continues by highlighting the significant progress we’ve seen with Hololens 2: “Hololens 2 has a very powerful way of interacting with objects using the hands. Users can move, rotate and resize holograms just by grabbing them with their hands and performing intuitive real world gestures, making the experience considerably more efficient than the first Hololens. People underestimate the importance of hands. In the enterprise context the worker needs to use their hands to perform the work, so holding a controller is not an option and we need hand tracking based interaction instead.”
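The grab-to-move and two-hand-resize interactions described above can be sketched as a small state machine: record where the hand and object are at pinch start, then apply the hand’s displacement to the object until the pinch ends. The class and method names here are hypothetical, and this is a simplified illustration rather than how Hololens 2 actually implements direct manipulation.

```python
import math

class Hologram:
    """Minimal state for a grabbable, movable, resizable virtual object."""

    def __init__(self, position=(0.0, 0.0, 0.0), scale=1.0):
        self.position = position
        self.scale = scale
        self._grab = None  # (hand position, object position) at pinch start

    def on_pinch_start(self, hand_pos):
        # Record positions at grab time so the object follows the hand's
        # movement without snapping to the hand itself.
        self._grab = (hand_pos, self.position)

    def on_hand_move(self, hand_pos):
        if self._grab is None:
            return  # not currently grabbed; ignore hand motion
        start_hand, start_obj = self._grab
        delta = tuple(h - s for h, s in zip(hand_pos, start_hand))
        self.position = tuple(o + d for o, d in zip(start_obj, delta))

    def on_pinch_end(self):
        self._grab = None

    def on_two_hand_resize(self, left_pos, right_pos, start_separation):
        # Scale tracks the ratio of current to initial hand separation,
        # so spreading the hands apart grows the hologram.
        self.scale = math.dist(left_pos, right_pos) / start_separation
```

The preserved offset in `on_pinch_start` is the detail that makes this feel natural in practice: without it, objects jump to the hand the instant they are grabbed.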
4. Ensuring intuitive interaction via controllers
The mapping between hand held controllers and on screen actions is not always clear. Users need to know when they should resort to a controller button press and which of the many buttons they should specifically press. This can be exacerbated in VR when the controller is not adequately visualised in the 3D virtual environment, as is the case with many VR games. We’ve observed users needing to remove the headset in order to find a particular button on the controller, which instantly destroys the sense of immersion.
It’s also complicated by multiple input possibilities competing in unpredictable ways: sometimes a gesture is used, sometimes a direct manipulation, sometimes a raycast, sometimes voice and sometimes a dedicated button on the controller. The user must weigh many overlapping possibilities, which can place a significant cognitive strain on even simple actions.
Nicole Stewart-Rushworth, Immersive Lab Manager at the UK’s Digital Catapult, has observed thousands of users trying XR devices and echoes this view: “VR controllers are not 100% intuitive, human factors are still there e.g. controllers colliding into each other and headsets. This is often solved by displaying the controllers in VR exactly how they look in real life, but this can break the immersion of, for example, a historical piece. Without a rendered replica of the controller the user has to learn the buttons and size of the controller before donning the headset. A user also has to understand the limitations of the controller tracking: are they 3DoF or 6DoF? Will they still function behind or close to my head?”
Some argue that any reliance on controllers is a limitation in the long term. Vik Parthiban is one of those making this case, “As XR headsets become more untethered and mobile, a user doesn’t want to carry a controller around all the time. It’s not truly immersive when you need to rely on the controller.”
In Part 2:
In part 2, we’ll continue to look at some of the other UX design challenges that are specific to XR with more contributions from leading UX specialists in the field. Experts from Magic Leap, Vive VR and MIT turn their focus to the challenges of orientation in 3D spaces, voice input, coherent system design and transitions between apps while in VR.