Insights EDU

Experts from Vive VR and Magic Leap Talk UX Design Challenges – Part 2/3


Filip Healy | Co-Founder | Threesixty Reality

16 Aug 2019 | 9 min read

In part 1 of this three-part series on UX design challenges facing the XR industry, we highlighted the growing evidence that many users struggle with the usability of immersive applications, whether in VR or AR. The XR UX community is growing, and we are learning how to make interactions both more advanced and more intuitive.

We previously covered some of the specific UX design challenges facing XR creatives who want to include a high degree of interactivity in their experiences: how to clearly communicate which elements are interactive, how the user selects an area or object to interact with, the move towards more direct manipulation, and the issues associated with handheld controllers in XR.

In part 2, we’ll continue to explore the UX design challenges that experts in the industry are talking about, from orientation in 3D spaces to information layout and menu interfaces.

Orientation in 3D spaces:

The orientation of the user with respect to critical UI elements in a 3D space introduces entirely new UX design challenges. Too often we see users stuck purely because they are facing the wrong way and the menu they need has appeared to the side of, or even behind, them. How do we indicate which direction the user should face? How do we direct their attention in a 3D space?

On top of this there may be a number of applications or elements open, which need to be arranged and oriented with respect to each other, particularly where there is a hierarchy between them or a need to use certain features in combination.

Christophe Tauziet, who previously worked as a designer on Facebook Spaces, recognises this challenge: “It often starts by getting people comfortable with the idea of looking all around them (sound or visual effects can help drag people’s attention and make them look beyond what’s directly in front of them).”

Facebook Spaces – social VR platform
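The cue Tauziet describes, using sound or a visual hint to pull attention beyond the immediate field of view, can be made concrete with a little vector maths: compare the user’s gaze direction with the direction to the target and, when the target falls outside a comfortable viewing cone, show an edge-of-view arrow or play a spatialised sound from the target’s position. The TypeScript sketch below is illustrative only; the Vec3 helpers, the attentionCue function and the threshold are assumptions, not any particular engine’s API.

```typescript
// Illustrative sketch only: decide when and how to nudge the user's attention
// toward a target that is outside their current view. Vec3, attentionCue and
// the default threshold are assumptions, not any particular engine's API.
// Convention: -z is "forward", as in WebXR/OpenGL-style coordinates.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const norm = (v: Vec3): Vec3 => {
  const l = Math.hypot(v.x, v.y, v.z) || 1;
  return { x: v.x / l, y: v.y / l, z: v.z / l };
};
const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const yawDeg = (v: Vec3) => (Math.atan2(v.x, -v.z) * 180) / Math.PI;

interface AttentionCue {
  inView: boolean;     // target already sits within the comfortable viewing cone
  angleDeg: number;    // total angular offset between gaze and target
  arrowYawDeg: number; // signed horizontal offset: positive means "look right"
}

function attentionCue(headPos: Vec3, gazeDir: Vec3, targetPos: Vec3, viewConeDeg = 60): AttentionCue {
  const toTarget = norm(sub(targetPos, headPos));
  const gaze = norm(gazeDir);
  const angleDeg = (Math.acos(Math.max(-1, Math.min(1, dot(gaze, toTarget)))) * 180) / Math.PI;
  // Normalise the yaw difference into -180..180 so the arrow points the short way round.
  const arrowYawDeg = ((yawDeg(toTarget) - yawDeg(gaze) + 540) % 360) - 180;
  return { inView: angleDeg < viewConeDeg / 2, angleDeg, arrowYawDeg };
}

// Usage: if a menu spawned behind the user, keep showing an edge-of-view arrow
// (and/or play a spatialised sound at targetPos) until cue.inView becomes true.
const cue = attentionCue({ x: 0, y: 1.6, z: 0 }, { x: 0, y: 0, z: -1 }, { x: 2, y: 1.6, z: 1 });
console.log(cue.inView, Math.round(cue.angleDeg), Math.round(cue.arrowYawDeg));
```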

Another challenge is that the real world still exists in this 3D space, even if the user feels completely cut off from it. Immersive Lab Manager at Digital Catapult, Nicole Stewart-Rushworth, raises this point, “VR/AR have similarities in terms of the UX design challenges that exist when designing for 3D space. For example, how to make sure the user remains aware of real life objects either when they can’t see them (VR) or they’re being mixed with virtual elements (AR).”

UX design challenges with voice input:

Voice interaction is often mentioned as a solution to many interaction problems: the user simply states their intent in natural spoken language and circumvents the challenges of the graphical user interface altogether. However, even ignoring the limitations of voice recognition technology, getting this right in UX terms is not a simple task. Applications often accept only specific voice commands, so these need to be communicated to the user in a way that makes clear what the commands are without cluttering the interface.

Often (as in the case of the RealWear HMT-1 or HoloLens) a specific visual treatment is used to indicate the available voice commands. But what if multiple applications are open, each with similar or competing commands? What if there are windows behind the user which also have voice-enabled options? What if there are too many voice commands to display at the same time? What if the user is just talking to someone?

Tim Stutts, Interaction Design Lead at Magic Leap, highlights some of these challenges, “A further level of complexity is added with voice commands, as the notion of directionality becomes abstract—the cursor for voice is effectively the underlying AI used to determine the intent of a statement, then relate it back to objects, apps and system functions.”

We also need to consider how the system provides feedback and differentiates states such as ‘did not hear input’ vs. ‘did not recognise a valid command’ vs. ‘command valid but cannot be carried out right now’ vs. ‘command carried out as requested’.
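These four states map naturally onto distinct feedback treatments, such as a different caption and audio cue for each, and it helps to make that mapping explicit in the design. The following TypeScript sketch is hypothetical; the VoiceOutcome enum, feedbackFor function and strings are invented for illustration rather than taken from any real voice API.

```typescript
// Illustrative only: the four feedback states described above, made explicit so
// each one gets a distinct, recognisable response. VoiceOutcome and feedbackFor
// are hypothetical names, not part of any shipping voice API.
enum VoiceOutcome {
  NothingHeard,         // microphone picked up no usable speech
  NotRecognised,        // speech was heard, but it matched no known command
  RecognisedButBlocked, // valid command, but it cannot be carried out right now
  Executed,             // command carried out as requested
}

interface Feedback {
  caption: string; // short on-screen hint or confirmation
  earcon: string;  // identifier for a distinct audio cue
}

function feedbackFor(outcome: VoiceOutcome, command?: string, reason?: string): Feedback {
  switch (outcome) {
    case VoiceOutcome.NothingHeard:
      return { caption: "I didn't hear anything", earcon: "listen-again" };
    case VoiceOutcome.NotRecognised:
      return { caption: "Sorry, that isn't a command I know", earcon: "not-recognised" };
    case VoiceOutcome.RecognisedButBlocked:
      return { caption: `Can't "${command}" right now${reason ? `: ${reason}` : ""}`, earcon: "blocked" };
    default: // VoiceOutcome.Executed
      return { caption: `"${command}" done`, earcon: "confirm" };
  }
}

// Example: a valid command that happens to be unavailable at this moment.
console.log(feedbackFor(VoiceOutcome.RecognisedButBlocked, "take photo", "camera in use"));
```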

Nicole Stewart-Rushworth comments, “For users navigating through experiences using voice or gestures, there is a whole new set of vocabulary/gestures to learn and the added frustration of failed and incorrectly recognised inputs. However, these are problems that exist for all types of applications that use voice control and gestures.” 

Menus in XR:

What is the best approach to letting users make selections and control options, particularly in a 3D spatial environment? Pressure to innovate leads some designers to discard familiar 2D menus altogether in favour of ‘physical’ interactions, when in fact these can introduce unnecessary barriers and discoverability issues.

Ryan Gerber, Systems-level UX and Product Design at Vive VR, notes the importance of progressively introducing users to this new medium in small, manageable steps: “begin by distilling down the familiar parts of existing 2D paradigms; connected platforms, experiences and design languages, in order to help us define an initial map of features, tools and common interaction patterns that users expect from a more mature technology experience.

“This might come down to a common UI element, a grid or carousel of tiles for example can still be a great way to display content to your hero.” He continues to highlight advantages for both designers and users, “a flat rectangular dashboard is good for many reasons; it’s built from things a V1 team can use on the fly, it’s immediately familiar with users”. However, familiar does not mean without any innovation at all. It can come with a twist that starts to utilise the possibilities offered by the new platform, as Gerber illustrates, “Daydream first did a parallax tilt to their tile design that I really enjoyed, they found a delightful immersive way to highlight an active tile.”
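The parallax tilt Gerber mentions is straightforward to approximate: the active tile leans a few degrees towards or away from the pointer depending on where the pointer sits on it. The rough, engine-agnostic TypeScript sketch below illustrates the idea; the tilt limits are assumptions for illustration, not Daydream’s actual values.

```typescript
// Rough sketch of a parallax tilt for an active tile: the tile leans a few
// degrees depending on where the pointer (or gaze) hits it. The values here
// are illustrative and are not taken from Daydream or any shipping system.
interface TileTilt {
  pitchDeg: number; // rotation around the tile's horizontal axis
  yawDeg: number;   // rotation around the tile's vertical axis
}

function tiltForPointer(
  pointerU: number,  // 0..1 across the tile, left to right
  pointerV: number,  // 0..1 down the tile, top to bottom
  maxTiltDeg = 6     // keep it subtle; large tilts read as a broken layout
): TileTilt {
  // Re-centre to -1..1 so the tile lies flat when the pointer is in the middle.
  const u = Math.max(-1, Math.min(1, pointerU * 2 - 1));
  const v = Math.max(-1, Math.min(1, pointerV * 2 - 1));
  return { yawDeg: u * maxTiltDeg, pitchDeg: -v * maxTiltDeg };
}

// Pointer near the top-right corner of the active tile:
console.log(tiltForPointer(0.9, 0.1)); // { yawDeg: ~4.8, pitchDeg: ~4.8 }
```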

The value of familiarity is a view we hear repeated by many other leading UX experts in XR, including Christophe Tauziet, who points out, “a lot of the most effective VR interfaces today leverage either 2D interfaces that give people a feeling of control and comfort (“Oh, I know how to use this. I just need to touch those buttons with my fingers”), or familiar objects from the real world (door knobs, paintbrush, gun, etc). The learning curve for VR being quite steep, there’s a need for this level of familiarity and reassurance before we embark people on using more advanced interfaces like hand/arm/body gestures, eye-tracking, voice, etc.”

Controlling menus (even where 2D interactive panels are used) and layers of interactivity is another challenge. How do users bring up and find the options when they need them? How do they find the commands for a specific object or item they want to work on?

How do we ensure menus don’t get misplaced in 3D space, whilst at the same time not obscuring everything in front of the user? How do users close the menus? These are simple, basic interactions that don’t require a second thought when designing for mobile, because the conventions are so well entrenched. With immersive tech, where the user must understand the interplay between the different layers, they can quickly lead to confusion and frustration. This is one of the bigger UX design challenges.
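One widely used way to keep a menu findable without locking it rigidly to the user’s head is a ‘lazy follow’ (sometimes called a tag-along) panel: it holds its position while it is roughly in view and only drifts back towards the user’s gaze once they have turned well past it. The TypeScript sketch below is a minimal illustration of that behaviour; the dead-zone angle and follow speed are assumptions, not values from any particular headset or platform.

```typescript
// Simplified "lazy follow" placement: the panel holds still while it is roughly
// in view and only drifts back toward the user's gaze once they have turned
// well past it. The dead-zone angle and follow speed are assumed values.
interface PanelState {
  yawDeg: number; // panel's horizontal angle around the user, world-relative
}

function updatePanelYaw(
  panel: PanelState,
  headYawDeg: number,        // user's current horizontal gaze direction
  dtSeconds: number,         // time since the last frame
  deadZoneDeg = 35,          // how far the user can look away before the panel follows
  followSpeedDegPerSec = 90  // how quickly the panel catches up
): PanelState {
  // Shortest signed angular difference between gaze and panel, in -180..180.
  const delta = ((headYawDeg - panel.yawDeg + 540) % 360) - 180;
  if (Math.abs(delta) <= deadZoneDeg) return panel; // still comfortably in view
  // Move only far enough to bring the panel back to the edge of the dead zone.
  const overshoot = Math.abs(delta) - deadZoneDeg;
  const step = Math.min(overshoot, followSpeedDegPerSec * dtSeconds) * Math.sign(delta);
  return { yawDeg: (panel.yawDeg + step + 360) % 360 };
}

// Per-frame usage: panelState = updatePanelYaw(panelState, currentHeadYaw, dt);
```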

Information layout:

Given the potential for sensory overload, it’s important to plan the layout and density of augmented digital information in a user-centred way.

Gerber shares his learnings from work at HTC Vive, “Clear layout conventions become more important than ever in order for humans to be able to quickly take in relevant information and act on it. Text hierarchies might seem boring to some, but in a 3D space it’s all too easy to let your UI and information density grow out of control quickly.” 

Gerber continues, “there is often a tendency when designing a system or application interface to design something too sprawling, usually including several chaotic windows spread out in front of you, or to cluster your interface in some tightly defined parameters. We realized we need to cast a broader net in order to find a balance that made sense. For us, that meant roughly the size of our hero’s field of view.”
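A practical way to enforce the “roughly the size of our hero’s field of view” constraint Gerber describes is to clamp each window’s angular placement to a horizontal and vertical budget around the user’s forward direction. The TypeScript sketch below illustrates the idea; the angular budgets are assumptions for illustration, not HTC Vive’s actual numbers.

```typescript
// Clamp a window's angular placement to a comfortable cone in front of the
// user, so the overall layout never sprawls far beyond one field of view.
// The angular budgets below are assumptions, not HTC Vive's actual values.
interface AngularPlacement {
  yawDeg: number;   // horizontal offset from the user's forward direction
  pitchDeg: number; // vertical offset from the user's forward direction
}

function clampToViewBudget(
  requested: AngularPlacement,
  maxYawDeg = 45,   // half of a ~90 degree horizontal budget
  maxPitchDeg = 25  // a tighter vertical budget: looking up and down tires faster
): AngularPlacement {
  const clamp = (value: number, limit: number) => Math.max(-limit, Math.min(limit, value));
  return {
    yawDeg: clamp(requested.yawDeg, maxYawDeg),
    pitchDeg: clamp(requested.pitchDeg, maxPitchDeg),
  };
}

// A window requested far off to the side gets pulled back into the budget:
console.log(clampToViewBudget({ yawDeg: 80, pitchDeg: -40 })); // { yawDeg: 45, pitchDeg: -25 }
```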

Coherent system design:

With a Windows PC we understand the various parts of the system pretty well: the difference between the OS, the desktop and an open application window, and which menus and settings belong to Windows and which to a specific application. With XR these elements may look very different from what we are used to, so we cannot assume users will understand how they all relate to one another. Yet we are aiming for a system that allows users to move easily between applications, and even between devices, in order to overcome these UX design challenges.

Ryan Gerber states that we need to move towards “not only an immersive and highly experiential system but an ecosystem that houses their persistent digital identity and allows them freedom of movement between hardware devices and layers of reality. Fluidity in many ways has become synonymous with ease of use.” An end to end music streaming journey may involve “someone traveling home from work, how they might go from a desktop to a mobile experience on the way to the bus stop, and then get home and want to seamlessly slip into an immersive music visualizer and relax.”

This also calls into question the meaning and usefulness of a ‘Home’, especially when Oculus, Steam, Viveport, Windows, iOS, Android, Daydream and even a specific app may all have their own concept of home, and several may be available to the user at a given point in their experience. How do they know which control operates which home, and how can they tell whether they are in the right ‘Home’?

Gerber reports a common reaction from users in UX testing: “wait, which home am I in?”. He expands, “When asked to comment on the various home experiences many participants more or less describe being placed in a zoo. At HTC Vive we’ve replaced home with our Origin, a transitive system to help you move natively through virtual and augmented worlds. Familiar UI elements and library tiles greet them, cast against a window into another curated world of our making.”

Transitions between apps in VR:

Related to the above, Gerber points out specific UX design challenges for VR, where a user can be cut off from both actual and virtual reality during moments of transition between VR experiences, “How do we best ferry you between worlds? In VR, transitions often drop you into a bleak, abstract void space where logos are head-locked to your face so you can’t look away. You often lack any sense of physical presence and feel a sense of ‘proneness’. Being dropped in a black void while you wait for your next app to load is often caused by copying experiences from mobile and desktop design. If you have to wait ten seconds while an app loads on your phone you can always look away or talk to someone, in VR especially we now have to figure out what to do with you for that whole time.”

In Part 3:

In the final part of this series, we’ll look at a few other UX design challenges specific to XR, with more contributions from leading UX specialists in the field. We’ll also summarise our thoughts on what steps we should be taking to improve UX and support the wider goal of the adoption of virtual and augmented reality. 

About the author:

Filip Healy is an experienced User Experience consultant, specialising in evidence-based product and service design. He has been planning, executing and delivering user-centred design projects for blue-chip global companies since 2001.

Ryan Gerber:

Ryan is the design team lead for the Vive Reality System. His work for HTC Innovation involves leading conversations across the company to unify the vision for all future Vive-connected devices, defining the baseline experiences and interactions that will shape spatial and immersive XR platforms for years to come.

Adrian Leu:

Adrian is CCO of Emteq, a start-up that allows VR users to interact with content using non-verbal communication signals such as facial expressions, heart rate, and posture, where he focuses on commercial market strategies (sales and marketing). He was previously CEO of Inition Ltd, a prominent immersive technologies and applications studio in London.

Vik Parthiban:

Vik is a researcher at the MIT Media Lab. He is building new holographic interfaces and hardware that will push AR/VR technology forward. Vik is passionate about teams and technologies that shape how people interact with digital information and the physical world. He also directs new research projects as Graduate President of VRAR@MIT and a member of the Space Initiative.

Nicole Stewart-Rushworth:

Nicole is the Immersive Lab Manager at Digital Catapult, the UK’s leading advanced digital technology innovation centre, which drives the early adoption of digital technologies to make UK businesses more competitive and productive. She looks after the strategy and running of the organisation’s five labs around the UK, as well as presenting on immersive technology at education days and workshops.

Tim Stutts:

Tim Stutts currently works as an Interaction Design Lead at Magic Leap, nurturing user experience for the OS and core applications. He is a multifaceted designer drawn to challenges touching on interaction, sensory feedback, data visualization, augmented reality and artificial intelligence.

Christophe Tauziet:

Christophe is a Design Manager, currently supporting the Rider team at Uber. Prior to that, he spent 5 years at Facebook, leading different teams, including the Social VR team with which he designed Facebook Spaces and other experiences.