Di Dang leads the Emerging Tech group at POP, a Seattle-based digital agency that develops technology solutions for enterprise clients. She’s worked there for two years and has designed the UX for a variety of VR and AR experiences. She also teaches at the School for Visual Concepts. We sat down with her to talk about UX design for VR, AR, and MR, and about using haptics.
HaptX (Greg): Thanks for taking the time to sit down and discuss what you’ve been up to over at POP. Before we jump in, can you tell me a little about your background? How did you start in UX design?
Di Dang: I started in UX almost 5 years ago. At the time, I was in Berlin working for early-stage tech startups. I started out in sales and account management, and as much as I learned from that day-to-day work, I realized I was missing working with the end-user. I wanted to work directly in shaping products and services. I started leveraging my network in the startup community to create opportunities for UX work.
Getting Started with UX Design for VR and AR
HaptX (Greg): How did your interest in VR/AR/MR evolve? Was there a decisive moment in driving you to the field?
Di Dang: I really started looking into VR in early 2016. As a UX designer, it’s important to me to be connected to the local user experience design community. I noticed more and more AR/VR meetups cropping up, like Seattle VR and VR/AR Collective. I had initially dismissed the VR hype, but as I got more involved, I began to realize the potential and applications of VR/AR beyond what was already being explored in gaming and entertainment. It made me realize how exciting it was to be part of this new frontier, helping define what the standards would be for the platform.
HaptX (Greg): POP is among the agencies leading the charge in developing VR/AR/MR experiences for clients. Can you tell me a little bit about how that evolved? Was it driven by client interest or an internal effort?
Di Dang: POP has been around for 21 years, and over the course of our lifetime, we’ve gone from doing chiefly broadcast and print, to web and mobile, and now to work with VR/AR, conversational UI, and machine learning as well. It’s part of our DNA to invest in emergent technologies that have the potential to solve business problems for our enterprise clients. Our work with VR/AR started about two years ago with POPsters who were personally interested. They saw the potential and then grew it into a larger initiative. Dave Curry, our former VP of Emerging Tech and Trends, first led the charge. We were one of Microsoft HoloLens’s first partners in its Mixed Reality Partner program, meaning we were among the select few welcomed into the fold from the outset.
Anytime we’re consulting with our clients, we make sure we’re solving their business problem with the best-suited creative and technological solution. Their business problem may best be solved by a campaign, application, VR/AR, or another platform altogether. The key is that we remain agnostic in the beginning phase. I have a personal passion for VR/AR, but it’s most certainly not a hammer for all nails.
Haptics in UX Design for VR
HaptX (Greg): In the General Assembly panel you recently spoke at, you also brought up haptic technology. Can you speak a little to the importance of haptic feedback in user experiences?
Di Dang: We’ve practically nailed visual and audio displays, so I think haptics represents the final frontier of immersion and presence for a user. Sure, there are display issues with fidelity, field of view (FOV), and the screen-door effect, but sight alone is still enough to make the user feel present. Haptics will enable you to reach out and physically interact with or affect the environment or UI, and feel it respond in turn. That’s what’ll make VR and AR feel like they’re truly here.
HaptX (Greg): Have you worked with haptics much?
Di Dang: Any work we’ve done with haptics has been in the realm of controllers. Haptics, as most consumers understand it, is vibrational. It’s not natural or realistic. Vibrational haptics can help draw a user’s attention to a particular event or UI element, and our client needs haven’t necessitated a more realistic haptic experience.
HaptX (Greg): Do you think there’s a future without a mouse and keyboard? Or alternatively, do you think there’s a killer interface or set of gestures for VR similar to what’s now ubiquitous in our mobile devices?
Di Dang: There’s got to be a future without a mouse and keyboard, especially as we move into this new medium. I’d be disappointed if there wasn’t. We’re already doing away with the mouse, in part, because a lot of experiences rely on gaze or raycasting, especially with mobile VR, where you sometimes don’t have a controller.
The core reason we have a keyboard is for text input. It’ll be interesting to see how that’s solved, because we need a way to input text and to login or authenticate ourselves. If you’ve tried typing in VR, you know it is incredibly painful. The useful thing about a keyboard is that it gives you haptic feedback to tell you that it has successfully read your input. There’s not really any controller or peripheral that has that same haptic satisfaction.
In terms of a set of gestures, this is where UX design and research come in. It’ll be interesting to see what designers can do to define a set of gestures. It’ll also depend on the experience or application. Visual cues in VR or AR don’t need to be photorealistic, but I think users will expect more realism out of touch. It’ll also depend on the motivations or expectations of a user. For example, I expect different things when I’m reaching out in VR to touch an elephant in Africa versus watching a 360° video on my couch. We might see a common unified set of button activations, as Angela Sharer from HTC Creative Labs put it. It might be a combination of utility-driven haptics and experiential haptics.
The Future of UX Design for VR
HaptX (Greg): Have you given much thought to ARKit?
Di Dang: We are definitely interested in ARKit and have been working on a proof of concept, which we’ll eventually share with clients. It’s an exciting time for us as VR/AR creators, who now have the framework to more easily build third-party applications. And with augmented reality hitting a mainstream audience, on the scale of 500 million iPhones in the next year, AR will have reach unlike anything we’ve seen before in the industry.
HaptX (Greg): What are a few of your biggest surprises in your research of UX for VR/AR/MR?
Di Dang: In terms of the biggest surprises, I think early on was focal depth and understanding where certain UI or visual elements should be placed relative to a user. 2D and flat user interfaces don’t require you to think about that. I spent a lot of time prototyping things in Unity, creating a build, and then putting it on people’s heads to understand what they’re experiencing in terms of comfort and usability. It required an understanding of stereoscopic and visual cues. I had to dive deeper into how vision worked. I definitely wish I’d paid more attention to human biology in high school. We also discovered certain guidelines around UI elements. For example, an interface element might need to be 1-2 Unity units away, or if it’s a background, 6 units.
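The comfort-distance guideline Di mentions can be encoded as a simple placement check. This is a hedged sketch, not POP's actual tooling: the 1–2 unit band for interactive UI and the 6-unit background distance come from the interview, while the function names and the convention that one Unity unit equals one meter are illustrative assumptions.

```python
# Sketch: encoding the UI placement guideline from the interview as a check.
# Assumes the common Unity convention of 1 unit = 1 meter (an assumption,
# not stated in the interview).

UI_MIN, UI_MAX = 1.0, 2.0   # comfortable band for interactive UI elements
BACKGROUND_DIST = 6.0       # suggested distance for background elements

def comfortable_ui_distance(distance: float) -> bool:
    """Return True if an interactive element sits within the comfort band."""
    return UI_MIN <= distance <= UI_MAX

print(comfortable_ui_distance(1.5))  # True: inside the 1-2 unit band
print(comfortable_ui_distance(0.3))  # False: too close, strains focal depth
```

A check like this would typically run at design time or in an editor tool, flagging panels placed too close to the camera before a build ever reaches a headset.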
Another big surprise was understanding the fine line between attention and action. For example, in a mobile VR experience, we have to user test which interactions should require explicit action from the user. We were worried about how much you had to make a user work to do something. We didn’t want to create too much friction. We tried to think about how we might use gaze or raycasting to implicitly select something versus explicitly selecting a control. We did some early exploration about where we placed UI elements. Initially, we had panels associated with the peripherals, like controllers. We assumed that if someone looked at their peripheral, they meant to engage with it. I thought that would be successful because it would create less work for a user and would anticipate their intent. In user testing, though, we found users actually hated it because they’d look at their controls accidentally, or it would occlude what they wanted to see. Trying to anticipate user intent was much more challenging in VR and AR.
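One common way to turn "looking at" into "selecting" without triggering on accidental glances, as the user testing above warned against, is a dwell timer: gaze must be held for a threshold before anything activates, and looking away cancels the pending selection. The sketch below is illustrative; the class names and the 1.5-second threshold are assumptions, not details from the interview.

```python
# Hedged sketch of gaze-based implicit selection with a dwell timer.
# Threshold and names are illustrative assumptions.

DWELL_THRESHOLD_S = 1.5  # how long gaze must be held before activation

class GazeTarget:
    def __init__(self, name: str):
        self.name = name
        self.gaze_time = 0.0
        self.activated = False

    def update(self, is_gazed_at: bool, dt: float) -> None:
        """Accumulate gaze time each frame; reset on look-away."""
        if is_gazed_at:
            self.gaze_time += dt
            if self.gaze_time >= DWELL_THRESHOLD_S:
                self.activated = True
        else:
            self.gaze_time = 0.0  # an accidental glance never accumulates

# Simulate two seconds of sustained gaze at 60 fps
button = GazeTarget("play")
for _ in range(120):
    button.update(True, 1 / 60)
print(button.activated)  # True after 2 s of dwell
```

The reset-on-look-away behavior is the key design choice: it directly addresses the finding that users hated panels reacting to stray glances, at the cost of adding deliberate friction back into selection.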
HaptX (Greg): When you think about UX design or VR/AR 5 to 10 years down the line, what most excites you?
Di Dang: It’s difficult to envision the UX of VR or AR so far out, given how fast it’s changing on even a monthly basis. How will things change with light-field imaging and neural interfaces? Not to mention the advances happening with artificial intelligence and machine learning—especially with development of more emergent, less authored VR/AR experiences. I imagine we’ll see a lot of streamlining or removing of UI, as well as voice becoming a much more dominant method of input.
I’m excited for when the technology becomes more mainstream and the price point becomes more affordable so that every household can have a device. It’s really exciting for designers when the technology is no longer a novelty and has greater ease of access.
Even within just the next five years, I’m excited for the evolution of interfaces; hopefully we move beyond porting over web, or flat, design elements into VR/AR, or 3D, interfaces. We’re seeing a return to skeuomorphism now. I’m excited to see users get more comfortable with gestures in VR and AR, since most of them feel awkward and unintuitive right now.
UX Design Resources for AR and VR
HaptX (Greg): Finally, are there any resources you’d suggest for UX designers looking to dive into VR/AR/MR?
Di Dang: Oh, there are so many. Here are just a few for starters:
Alex Chu’s introduction on transitioning from a 2D to 3D design paradigm: https://youtu.be/XjnHr_6WSqo
Make It So: Interaction Design Lessons from Science Fiction. This is so good. In addition to learning from what doesn’t work through user testing, Noessel and Shedroff show what we can learn from what doesn’t even exist yet in the UI of speculative fiction.
In particular, the haptics chapter: http://rosenfeldmedia.com/books/make-it-so/
I also want to mention Adrienne Hunter’s writing is key for people getting into the field: