How Can the Industry Work to Improve UX UI Design and Drive Adoption? – Part 3/3

Filip Healy | Co-Founder | Threesixty Reality

23 Aug 2019 | 12 min read

In part 2 of this three-part series on the UX challenges facing the XR industry, we continued our review of the key areas where good UX is critical: orienting the user and content in 3D space, the challenges around voice input, menus and information layout, and the importance of thinking about a coherent system design, including the transitions between different apps. We’re focusing in particular on highly interactive experiences and applications, and on the part of the experience where users are actually immersed in the XR world, to see how best to shape UX UI design around the needs of the user.

In this final part, we’ll continue to explore some further challenges that UX experts in the industry are talking about, and discuss our thoughts on how we can bridge the UX gaps that exist.

Steep learning curves and intuitive UX UI design:

To become proficient and productive with XR applications, users need to learn a significant number of new interaction rules, and this takes practice and a willingness on their part to push through the initial learning period. When we test XR applications with users in the Threesixty lab, we often need to train them in the basic interactions of the system first. From a developer’s perspective, we need to make UX UI design as intuitive and accessible as possible in order to make the steep learning curve less daunting for new users.

Tim Stutts, interaction design lead at Magic Leap, explains the challenges related to simple and accessible UX UI design:  “Intuitive is a goal designers should be aiming for, but it’s very difficult to attain. New VR / AR users may attempt to apply what they know from interacting with the other screen-based devices that they are familiar with, like the laptop, mobile phone, tablet, TV or gaming console. While taking those first steps they might even be inspired by what they’ve seen in science fiction, particularly with gestural input.  It’s possible that even with these reference points, users will still struggle in their initial moments of interaction, and that’s where learnability can help pick up the slack and instill confidence. When user testing interactions it is important to chart progress over time, not just rely on initial impressions to determine whether an interaction approach is a success or failure. Understand how easy it was the fifth time, where users tend to max out in terms of performance, and when users become fatigued.” 
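To make this advice concrete, here is a small, purely illustrative sketch of charting learnability across repeated trials, in the spirit of what Stutts describes. The data shape, field names and the 5% plateau threshold are our own assumptions for the example, not anything from Magic Leap’s process.

```typescript
// Illustrative learnability analysis: mean completion time per trial number.
// The Trial shape and the 5% plateau threshold are assumptions for this sketch.

interface Trial {
  participant: string;
  trialIndex: number; // 1 = first attempt, 2 = second, ...
  seconds: number;    // task completion time
}

// Average completion time for each trial number across all participants,
// i.e. the learning curve charted over time rather than first impressions.
function learningCurve(trials: Trial[]): Map<number, number> {
  const sums = new Map<number, { total: number; n: number }>();
  for (const t of trials) {
    const s = sums.get(t.trialIndex) ?? { total: 0, n: 0 };
    s.total += t.seconds;
    s.n += 1;
    sums.set(t.trialIndex, s);
  }
  return new Map([...sums].map(([i, s]) => [i, s.total / s.n]));
}

// First trial where improvement over the previous one drops below `tolerance`:
// a rough proxy for where users "max out" in terms of performance.
function plateauTrial(curve: Map<number, number>, tolerance = 0.05): number | null {
  const pts = [...curve.entries()].sort(([a], [b]) => a - b);
  for (let i = 1; i < pts.length; i++) {
    const [trial, mean] = pts[i];
    const prev = pts[i - 1][1];
    if ((prev - mean) / prev < tolerance) return trial;
  }
  return null;
}
```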

Nicole Stewart-Rushworth, Immersive Lab Manager at the UK’s Digital Catapult, argues that learnability is further complicated by the vast differences in the interaction paradigms between different XR devices: “How a user interacts with each VR/AR device can vary quite a lot, a device may be: head-mounted or handheld; 3/6DoF controllers or no controllers; physical buttons and touch-sensitive areas on the device; voice control; gestures; tethered or standalone. All of these differing elements mean that each device needs to be learned, even if initial setup has been handled for a user.” 

Ryan Gerber, systems-level UX and product designer at Vive VR, shares his experience from usability studies conducted at HTC: “There was a natural sense of unease users felt, an unfamiliarity in how to exist within a seemingly familiar environment now void of the 2D input tools they had previously relied on. The first thing that’s pretty clear to anybody trying an XR experience is that it doesn’t feel human”.

Gerber continues, “Most people tend to be harsh critics of any experience with too steep a learning curve. Most tutorials and out-of-box experiences to date have been crude at best, or disorienting at worst.”

This friction often comes as an unwelcome surprise to users who are used to just walking up and using interactive products without a second thought. Gerber compares the experience to a mobile phone: “Within a few minutes of pulling their new phone out of the box most people are soon chatting with their friends, casually scrolling all their favorite social media, probably streaming music and using highly evolved gestures, shortcuts and deep personalization that, in the end, provide a very clean and intuitive experience.”

Stutts proposes creative ways to incorporate training into the initial setup process: “[By] incorporating learning opportunities into a platform for first time use as necessary steps in the user flow (e.g. using the provided controller to type out a required email address for account setup as part of teaching typing with the controller) and discoverability options thereafter (e.g. notifications and other UI to guide a user when they have gone astray), designers can complement a user’s intuition.”

Stewart-Rushworth notes that things are already improving, “The last 12 months have seen a vast improvement in the desktop experience for new users navigating content in VR app stores (Oculus, Steam, Viveport, WMR Home). These platforms have moved away from information-heavy libraries to curated home pages. However, choosing content in a headset has challenges due to the lack of a keyboard.”

Christophe Tauziet, a former senior product designer for Social VR at Facebook, describes how familiarity was used to reduce the learning curve for Facebook Spaces, “I’ve found that starting with a mix of 3D objects you can pick up and use that resemble an object from the real world you’re familiar with can really help, just like an icon does on a 2D interface. For example, in Facebook Spaces, there’s a virtual marker you can pick up to start drawing in 3D. For more complex menus with a number of options, leveraging 2D is a great way to reuse design paradigms and solutions people are used to (buttons, pagination, etc).”

Graceful handling of failures:

Sometimes the technology that XR depends on has a bad day. It’s really important that designers think about how this is handled, as users don’t always realise something has gone wrong or know how to fix it.

One example relates to issues that can occur with tracking. Nicole Stewart-Rushworth explains, “We need to consider what to display when tracking fails. An uninformed user may try an inside-out tracked headset such as the Oculus Quest in an environment full of reflective surfaces or lacking in features and conclude it does not function in any environment. At the moment devices are inconsistent with the feedback they provide when tracking is not functioning correctly.”
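As a concrete illustration of what consistent feedback could look like, here is a minimal sketch of surfacing tracking loss to the user, using the WebXR API as an example platform. The overlay helpers and message copy are hypothetical; the point is to explain the failure and how to recover, rather than freezing silently.

```typescript
// Sketch: surfacing tracking loss in a WebXR render loop instead of failing
// silently. showOverlay/hideOverlay/renderScene are hypothetical helpers.

declare const refSpace: XRReferenceSpace;        // acquired during session setup
declare function showOverlay(message: string): void;
declare function hideOverlay(): void;
declare function renderScene(pose: XRViewerPose): void;

function onXRFrame(_time: DOMHighResTimeStamp, frame: XRFrame): void {
  frame.session.requestAnimationFrame(onXRFrame);

  const pose = frame.getViewerPose(refSpace);
  if (!pose) {
    // Tracking lost entirely: explain what happened and how to recover.
    showOverlay("Tracking lost. Move to a well-lit area with more visual detail.");
    return;
  }
  if (pose.emulatedPosition) {
    // Position is being estimated, not tracked: warn, but keep rendering.
    showOverlay("Tracking is limited. Avoid reflective or featureless surroundings.");
  } else {
    hideOverlay();
  }
  renderScene(pose);
}
```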

The scanning of surfaces with mobile AR applications also carries UX challenges. On top of the technical challenges of achieving a good scan, we cannot assume the average consumer understands what scanning is or how it works, yet the entire functionality of mobile AR depends on it being done correctly.

At Threesixty, we see frequent issues with users failing to scan correctly because the UI prompts and feedback do not guide them sufficiently well. Pietro Desiato, Co-Founder of Threesixty Reality, explains further, “Mobile AR scanning works well on textured surfaces, and can fail completely in dim light or on shiny surfaces. Most users don’t know this and can assume the technology just doesn’t work. Often there is insufficient feedback to educate users about correct use. In our usability tests we’ve seen digitally savvy smartphone users reduced to complete beginners when they try mobile AR. Some don’t even know how to hold the phone to perform the scan: where to point it, how far from the surface, or whether they need to move it around.”
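One way to act on this is to give continuous, corrective feedback throughout the scan. Below is a hedged sketch using the WebXR hit-test module; setPrompt, the message copy and the timing are assumptions a real app would tune through testing.

```typescript
// Sketch: continuous guidance during mobile AR surface scanning, using the
// WebXR hit-test module. setPrompt and all message copy are hypothetical.

declare function setPrompt(message: string): void;

async function startARScan(): Promise<void> {
  const session = await navigator.xr!.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
  });
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });
  const scanStart = performance.now();

  session.requestAnimationFrame(function onFrame(_time, frame) {
    session.requestAnimationFrame(onFrame);
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      setPrompt("Surface found. Tap to place the object.");
    } else if (performance.now() - scanStart > 5000) {
      // After a few seconds with no surface, explain *why* scanning can fail.
      setPrompt("Still scanning: try a textured, well-lit surface and move the phone slowly.");
    } else {
      setPrompt("Point your phone at the floor and move it in a slow circle.");
    }
  });
}
```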

Designing for the whole body:

Ryan Gerber explains how immersive tech interfaces can respond to a much wider variety of human movements and inputs: “they need to be designed for the entire human, not just the eyes, ears and thumbs. We really had to stop and ask ourselves what it meant to be human. Now it’s like we’re relearning this skill for the computer age, giving us more dexterity beyond even our hands and fingers.”

How do we use hand, finger, arm, head and eye movements to control a computer system more efficiently and intuitively? Sure, we can use our arm to swing a sword in an immersive game, but what if that same motion triggered ‘Copy text in focus and paste into a new message’? What if we reduced a large number of one-minute tasks to split-second interactions? How much productivity could we gain if we added all that potential up?
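To illustrate the idea (and it is only an illustration, not an existing system), a gesture recogniser could dispatch the same motion to different commands depending on context. Every name below is hypothetical:

```typescript
// Purely hypothetical sketch: the same recognised motion dispatching different
// commands depending on context. None of these names come from a real system.

type Gesture = "arm-swing" | "pinch" | "palm-up";
type Context = "game" | "productivity";

const bindings = new Map<string, () => void>([
  // In a game, an arm swing swings the sword...
  ["game:arm-swing", () => console.log("Swing sword")],
  // ...while in a productivity context the identical motion becomes a
  // split-second "copy focused text and paste into a new message" command.
  ["productivity:arm-swing", () => console.log("Copy focused text into new message")],
  ["productivity:pinch", () => console.log("Select the object under gaze")],
]);

function dispatchGesture(context: Context, gesture: Gesture): void {
  const handler = bindings.get(`${context}:${gesture}`);
  handler ? handler() : console.log(`No binding for ${gesture} in ${context}`);
}

// Same physical motion, two different outcomes:
dispatchGesture("game", "arm-swing");
dispatchGesture("productivity", "arm-swing");
```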

However, these whole-body interactions need to be recognised and learnt by users in the context of interacting with a computer. They may be second nature in the real world, but we’ve never interacted with computers like this. And just because an intuitive interaction exists in nature does not mean we’ve perfected it enough in XR for it to be recognised and feel right to the user.

One area where natural interaction is currently constrained is movement through virtual space (VR). Today’s mainstream solutions do not yet let users make use of their legs, so natural, intuitive, human-like movement is non-existent, and many of the workarounds bring the side effect of simulator sickness. Gerber is hopeful: “An area I would be excited to see progress is clean walk mechanics, for now most systems and applications rely on teleportation primarily for navigation.”
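For readers unfamiliar with the mechanic Gerber mentions, here is a minimal, self-contained sketch of point-and-teleport locomotion: intersect the controller’s pointing ray with the floor and snap the player rig there on release, trading movement realism for comfort. The vector maths and flat-floor assumption are deliberate simplifications.

```typescript
// Minimal sketch of point-and-teleport locomotion: intersect the controller's
// pointing ray with the floor plane and snap the rig there on trigger release.
// Vec3 and the flat-floor assumption are deliberate simplifications.

interface Vec3 { x: number; y: number; z: number; }

// Where the pointing ray meets the floor (y = 0), or null if aiming upward.
function floorTarget(origin: Vec3, dir: Vec3): Vec3 | null {
  if (dir.y >= 0) return null;              // not pointing down: no valid target
  const t = -origin.y / dir.y;              // ray parameter at y = 0
  return { x: origin.x + dir.x * t, y: 0, z: origin.z + dir.z * t };
}

// Snap (rather than glide) to the target: abrupt, but it avoids the visual
// motion mismatch that causes simulator sickness during smooth locomotion.
function teleport(rig: Vec3, target: Vec3): Vec3 {
  return { x: target.x, y: rig.y, z: target.z };
}

const aim = floorTarget({ x: 0, y: 1.5, z: 0 }, { x: 0.3, y: -0.7, z: -0.6 });
if (aim) console.log(teleport({ x: 0, y: 0, z: 0 }, aim));
```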

Consistent design conventions:

The lack of consistently applied conventions within XR causes issues even for more experienced users: today, each application requires them to relearn many things, and even common actions can work through completely different mechanisms from one app to another. Contrast this with the web, where most websites follow fairly established design patterns. First we need to find elegant design solutions for many of the challenges in this article, and then we need to promote best practice and encourage designers to apply it consistently.

Tim Stutts sums this up, “Each AR / VR platform addresses these kinds of problems differently.”

Ryan Gerber makes similar observations, “Design and engineering teams lack the well-defined libraries and design languages we can so readily rely on for mobile and desktop and even 10-foot design. I think we’re ultimately still waiting for more people to come along and help us define these tools and systems. It’s nowhere near as simple as just reproducing features from 2D experiences. Something that most UX teams begin to re-evaluate after their first XR-related brainstorm, or perhaps after an initial round of user testing, is their heuristics: their rules of thumb in approaching design problems in order to be successful.” This in turn highlights the importance of UX testing with actual users, an approach that is well established for web and mobile development and a core part of ISO 9241-210, the human-centred design standard for interactive systems.

At Threesixty, we’ve started to document examples of emerging interaction patterns for AR and VR, often backed by our research with users, to start to tackle this issue. Others are trying to do the same, particularly at a platform level.

What actions should the industry take?

To address these challenges, application and platform developers need to place a greater emphasis on refining the UX, particularly where interactions and input mechanisms are concerned. Whereas the larger players are already developing a strong UX capability, the smaller, independent developers, whose apps today make up a large proportion of the content available, do not always have the skills, time or resources to do the same. There is a strong need for the platform providers to offer more training and resources specifically around UX.

We’re still at a stage where little details and nuances in the design are really important in helping users understand what they need to do: little clues that are just enough to nudge users into trying the desired behaviour. These are the finer details of the UX, and it’s almost impossible to figure out the right solution without carrying out some design research activities.

Encouraging a culture where applications and designs are tested with target users is of critical importance. Without this step, many UX issues will continue to slip through into live versions, particularly as analytics tools are not as well developed as those for web and mobile apps. Gerber attests to the value this has brought at HTC, “Over the course of our cycles of design, testing and iterating we would often find areas where we could simplify our design more and more.”

Yet this recommendation is not without its challenges: smaller teams have budget constraints, and testing with users is not as straightforward as on other platforms, requiring a more specialist lab setup with all the hardware in place and a larger physical space. There are few specialist UX research and design consultancies out there at the moment to support the industry, and traditional usability agencies often lack the domain expertise to provide relevant and up-to-date advice. At Threesixty Reality, we are trying to fill this void.

Another approach is to transfer the skills and experience that exist amongst UX designers in general over to the XR part of the industry. The majority of experienced UX designers still see their careers within the 2D world of mobile and web, at least for the time being. Gerber asks the question, “how do we activate all the existing 2D designers and content creators to adopt a new medium?”

He explains that this is important because the challenge is as much for the designers of immersive tech experiences as it is for the end users, “The challenge is that there is so much more to consider in our design: there are so many ways users rightly expect to be able to interact and engage within these AR and VR spaces. Suddenly a previously well-understood solution in 2D design, like a sign-up flow or a tutorial, now has to consider all these new challenges and opportunities. We are still very much trying to solve the early interaction problems of XR, like how to navigate information, how to interact easily with menus and alerts, or even how to input text or emote our intentions correctly to others.

“We are designing for a medium that has seemingly very few limitations: not only are we not confined by the rigid rectangular hardware design that most of our 2D thinking has evolved to fit, we are also not limited by traditional constraints of spacetime and physics. We are not restricted by what is necessarily the most simple form of a thing, which I think mobile design especially demands from us. Even a senior team of engineers and designers might feel a moment of paralysis when taking on challenges for this space.”

In the meantime, users are out there trying XR experiences for the first time and forming opinions about whether the systems are enjoyable to use, whether the content and applications are useful, and whether to invest in a headset themselves. So it’s important that we keep taking steps to improve the overall level of UX as much as possible.

About the author:

Filip Healy is an experienced User Experience consultant, specialising in evidence-based product and service design. He has been planning, executing and delivering user-centred design projects for blue-chip global companies since 2001.

Ryan Gerber:

Ryan is the design team lead for the Vive Reality System. His work for HTC Innovation involves leading conversations across the company to unify the vision across all future Vive-connected devices, defining the baseline experiences and interactions that will shape spatial and immersive XR platforms for years to come.

Adrian Leu:

Adrian is CCO of Emteq, a start-up that allows VR users to interact with content using non-verbal communication signals such as facial expressions, heart rate, and posture, where he focuses on commercial market strategies (sales and marketing). He was previously CEO of Inition Ltd, a prominent immersive technologies and applications studio in London.

Vik Parthiban:

Vik is a researcher at the MIT Media Lab. He is building new holographic interfaces and hardware that will push AR/VR technology forward. Vik is passionate about teams and technologies that shape how people interact with digital information and the physical world. He also directs new research projects as Graduate President of VRAR@MIT and member of the Space Initiative.

Nicole Stewart-Rushworth:

Nicole is the Immersive Lab Manager at Digital Catapult, the UK's leading advanced digital technology innovation centre, which drives the early adoption of digital technologies to make UK businesses more competitive and productive. She looks after the strategy and running of the organisation's five labs around the UK, as well as presenting on immersive technology at education days and workshops.

Tim Stutts:

Tim Stutts currently works as an Interaction Design Lead at Magic Leap, nurturing user experience for the OS and core applications. He is a multifaceted designer drawn to challenges touching on interaction, sensory feedback, data visualization, augmented reality and artificial intelligence.

Christophe Tauziet:

Christophe is a Design Manager, currently supporting the Rider team at Uber. Prior to that, he spent 5 years at Facebook, leading different teams, including the Social VR team with which he designed Facebook Spaces and other experiences.