A designer's guide to mobile AR and VR
Paul Reynolds in Augmented Reality | Thursday, August 16, 2018
A guide to designing mobile AR and VR applications, and how companies can transition from a 2D to a 3D mobile user experience.
When most people think of 3D user experiences, their mind goes straight to headsets - Oculus Rift for VR, for example, and HoloLens for AR. But for designers and developers, that mindset changed dramatically last summer when Google and Apple released their mobile AR platforms - ARCore and ARKit, respectively. Almost overnight, these platforms swung the center of gravity for 3D UX design away from headsets toward already ubiquitous mobile devices.
This seismic shift has opened new fault lines in the 3D UX design landscape. For one, ARCore and ARKit have sparked interest in 3D from a massive cohort of designers and developers with experience building 2D mobile applications. We're talking a roughly 4.5 million-strong community that suddenly finds itself empowered to make a dent in the 3D universe.
But there's a lot of chaos and confusion, too. From a designer's perspective, what are the differences between VR, AR, and mobile AR? What types of devices and applications are associated with each? What unique characteristics does each have? The following is an effort to untangle these closely intertwined disciplines and shed light on the gray areas that may confound 3D novices.
The 3D opportunity for mobile AR
The opportunity for mobile AR designers is enormous. Many of the UX patterns and design conventions that will come to dominate early immersive 3D applications will likely originate in mobile AR first. And mobile app designers will likely be the first to explore these possibilities, armed with a new crop of tools for designing in 3D and with well-established 2D UX design workflows.
For the foreseeable future, mobile phones, tablets, and phablets will make up the vast majority of devices with 3D six degrees of freedom (6DOF) capability. (6DOF refers to the freedom of movement of a rigid body in three-dimensional space.) As a result, it is important to understand the significant differences between designing for 3D versus for 2D. These differences will affect everything you do, including what you design and the tools you will need.
VR vs. AR
The most basic distinction to draw here is between VR and AR. VR typically puts us in a synthetic, fully immersive world, using a headset that tracks head movements and blocks out the real world. By contrast, AR uses smartglasses that preserve our view of the real world, enhancing it with 3D, contextually aware graphics. Where VR tends to be a private experience in which the user is usually tethered, AR is often designed to be used in public and encourages movement.
It's worth noting that not all AR/VR is created equal. Some devices offer a full range of motion with 6DOF (elevating, strafing, surging, rolling, pitching, and yawing), while others offer a lesser range with just 3DOF (rolling, pitching, and yawing). And a word of caution regarding 3D file types: they're a muddle. No matter what 3D format you are designing for, you'll run into a headache-inducing lack of file standards. Finding and creating 3D assets - a lion, a helicopter, or a piece of furniture, for example - is painful and expensive (most designers buy them from TurboSquid, as we reported in our white paper), although standards are emerging. 2D assets remain a staple of VR and AR, but they play an especially key role in mobile AR.
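To make the 3DOF/6DOF distinction concrete, the two kinds of pose can be sketched as plain data structures. This is an illustrative sketch, not ARCore or ARKit API code; the field names and units are assumptions for the example:

```python
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    # 3DOF: orientation only - the device can look around but not move.
    roll: float   # rotation about the forward axis, in radians
    pitch: float  # rotation about the side-to-side axis
    yaw: float    # rotation about the vertical axis

@dataclass
class Pose6DOF(Pose3DOF):
    # 6DOF adds translation: the device can also move through space.
    x: float  # strafe left/right, in metres
    y: float  # elevate up/down
    z: float  # surge forward/back

# A 3DOF headset reports only where you are looking...
looking = Pose3DOF(roll=0.0, pitch=-0.2, yaw=1.57)

# ...while a 6DOF device also reports where you have walked.
walked = Pose6DOF(roll=0.0, pitch=-0.2, yaw=1.57, x=0.5, y=1.6, z=-2.0)
```

The practical consequence for designers: a 3DOF experience can only react to the user turning their head, while a 6DOF experience can react to the user physically walking around and toward content.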
Common dedicated VR devices include the Oculus Rift, HTC Vive, and Sony PlayStation VR. Dedicated AR devices include Microsoft HoloLens, Meta 2, and the upcoming Magic Leap One. Hybrid mobile VR devices that combine a headset with your smartphone include Google Daydream View, Samsung Gear VR, and Panasonic 3D VR. According to research firm CCS Insight, sales of VR and AR headsets and glasses will hit 22 million in 2018, with mobile VR accounting for the bulk of those sales. That number is expected to increase fivefold to 121 million units in 2022. These numbers will be dwarfed by the quantity of AR-compatible smartphones, which ARtillery Intelligence estimates will be close to 2 billion this year and will top 4 billion by 2020.
What is mobile AR?
Mobile AR is a hybrid of 2D and 3D characterized by "screen-locked" elements - UI fixed to the display itself rather than anchored in the world. With mobile AR, any object in 2D space can interact with, or provide a UI to interact with, any object in 3D space. Currently, the traditional mobile UI has a prominent place in mobile AR apps, but this will lessen over time as more functionality moves into 3D. That said, the screen as the locus of user interaction will not change in mobile AR.
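The screen-locked versus world-locked distinction can be sketched with a toy pinhole projection. A world-locked object's pixel position changes as the device moves, while a screen-locked element is simply drawn at fixed pixel coordinates. This is a minimal sketch with assumed screen dimensions and focal length, not engine code:

```python
import numpy as np

def project_to_screen(world_point, camera_pos, focal_px=1000.0,
                      screen_w=1080, screen_h=1920):
    """Project a world-space point into pixel coordinates using a
    simple pinhole model (camera looks down -z, no rotation)."""
    rel = np.asarray(world_point, float) - np.asarray(camera_pos, float)
    if rel[2] >= 0:  # point is behind the camera: not visible
        return None
    u = screen_w / 2 + focal_px * rel[0] / -rel[2]
    v = screen_h / 2 - focal_px * rel[1] / -rel[2]
    return (u, v)

# A world-locked object (e.g. a virtual chair 2 m away) shifts on
# screen when the device moves half a metre to the right...
chair = (0.0, 0.0, -2.0)
screen_xy = project_to_screen(chair, camera_pos=(0.0, 0.0, 0.0))  # (540.0, 960.0)
moved_xy = project_to_screen(chair, camera_pos=(0.5, 0.0, 0.0))   # (290.0, 960.0)

# ...while a screen-locked element (e.g. a close button) stays at
# the same pixel coordinates regardless of device pose.
close_button_xy = (screen_w_pos, screen_h_pos) = (1000, 100)
```

The design takeaway is that screen-locked UI is always reachable and legible, while world-locked content must be re-found by moving the device.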
Movement is a key component of mobile AR - even more so than with many AR headsets. This is something that's underscored in Google's recently released Augmented Reality Design Guidelines. It's one reason mobile AR puts the camera front and center of the experience for both users and designers. Users of mobile AR apps no longer simply cradle a device; they constantly reposition it, and the device can be in motion even over long distances. As with AR, mobile AR makes the real-world environment a key element of the experience. Lego AR Studio is a great example of an app that embraces the promise of mobile AR. It offers excellent on-screen controls for both the app and 3D objects. It incorporates multiple objects in the scene, specifically objects made of Lego, and it encourages interaction with the real world.
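Because the camera is the center of the experience, the fundamental mobile AR interaction is tap-to-place: a tap on the screen becomes a ray from the camera, and the app intersects that ray with a detected surface. Platform SDKs handle this via hit tests, but the underlying geometry can be sketched as a simple ray-plane intersection (illustrative code with assumed coordinates, not an SDK API):

```python
import numpy as np

def ray_plane_hit(ray_origin, ray_dir, plane_y=0.0):
    """Intersect a camera ray with a horizontal plane at height
    plane_y - the core of tap-to-place. Returns the hit point,
    or None if the ray never reaches the plane."""
    o = np.asarray(ray_origin, float)
    d = np.asarray(ray_dir, float)
    if abs(d[1]) < 1e-9:  # ray parallel to the plane
        return None
    t = (plane_y - o[1]) / d[1]
    if t <= 0:  # plane is behind the camera
        return None
    return o + t * d

# Device held at roughly eye height, tilted down toward the floor:
hit = ray_plane_hit(ray_origin=(0.0, 1.5, 0.0), ray_dir=(0.0, -1.0, -1.0))
# hit is the floor point in front of the user: [0.0, 0.0, -1.5]
```

In a real app, the detected plane's height and the ray direction come from the tracking system; the point returned here is where a placed 3D object would be anchored.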
Mobile AR also tends to have a strong social component. It not only allows you to share your screen but also enables people to gather around a common point in space. This is something designers should keep in mind.
It's also important to remember that loading big 3D models onto mobile devices is computationally expensive, which leads to rapid battery drain and a hot device in your hand.
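One practical way to keep that cost in check is to budget asset memory before loading a scene. The per-vertex and per-pixel sizes below are rough assumptions for uncompressed assets, and the budget figure is arbitrary - a back-of-the-envelope sketch, not a profiling tool:

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough GPU memory for an uncompressed RGBA texture; a full
    mipmap chain adds about one third again."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

def mesh_bytes(vertex_count, bytes_per_vertex=32):
    # position + normal + UV at float precision is roughly 32 B/vertex
    return vertex_count * bytes_per_vertex

# Example: cap a mobile AR scene at 64 MB of asset memory.
budget = 64 * 1024 * 1024
cost = texture_bytes(2048, 2048) + mesh_bytes(150_000)
within_budget = cost <= budget  # True for this scene
```

When a scene blows the budget, the usual levers are compressed texture formats, smaller texture sizes, and lower-polygon versions of models.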
A word on mobile AR apps
Current mobile AR apps tend to be stickers (such as Google’s Stranger Things characters) or single-object, single-scene apps. However, designers are evolving toward more complex experiences. A good example of a single-object, single-scene app is Curate by Sotheby’s, which has excellent integration with 2D. Designed to help you visualize your future home, the app lets you position and view 3D objects such as sofas, tables, and chairs. If you see something you like, you simply select the 3D object, which pulls up web content in a screen-locked 2D UI where you can complete the order.
A good example of a multi-object, single-scene app is the IKEA Place app, which lets you virtually place IKEA products in your space. Multi-object, multi-scene apps will allow designers to build compelling apps with complex interactions and information architectures. They’re still over the horizon but will be here soon.
There are several different app types. Table-top apps are those where all interactions are contained within a bounded space, often to scale. Typical examples are educational apps, such as the WWF Free Rivers app, which helps students understand the ecology of rivers.
And then there are those very rare apps that offer a fully immersive experience. In a sense, these can blur the lines between AR and VR - for instance, a model of a room at scale in mobile AR looks very much like VR or AR on a headset. However, the stereo display that’s characteristic of VR gives the user a stronger sense of scale, depth, and spatial relationships. And with mobile AR the entire experience is always mediated through a screen - a window through which to view the experience.
This content is made possible by a guest author, or sponsor; it is not written by and does not necessarily reflect the views of App Developer Magazine's editorial staff.