A colleague once told me, “Talking about VR is like dancing about architecture.” It’s hard to convey the experience of virtual reality through words and pictures. It’s even harder when you add touch to the mix.

Outside of experiencing HaptX Gloves first-hand, we’ve found that video is the best medium to communicate how they feel, what they enable, and why they’re special. When we announced the HaptX Gloves Development Kit last month, a video trailer was a key ingredient to spreading the word.

We didn’t expect producing this video would be such a fascinating undertaking. In this blog, we’ll take you behind the scenes of how we overcame unfamiliar software challenges, smashed UFOs in our office, demoed to celebrities, and edited together 62 seconds of footage that we’re proud of.

Choosing an approach, building a team

We looked to the lessons from our 2017 launch video, which featured our HaptX Glove prototype. The key goal for that video was to emphasize the realism of the touch sensation. To achieve this, we intercut footage of our HaptX Glove prototype with real-world touch interactions. For example, a gloved finger presses down, then an ungloved finger hits a real piano key.

While this was successful in communicating the realism of the touch sensation, this time it was crucial to show the gloves interacting with real VR environments. After careful evaluation, we decided the best way to show the HaptX Gloves Development Kit in action was through a technique called “mixed reality” filmmaking.

Mixed reality (MR) filmmaking (not to be confused with mixed reality headsets) is a technique that superimposes a real-world VR user into the virtual environment, creating an eye-catching blend of the physical and digital. Companies like HTC have employed it in their promotional materials, and from what we’ve seen, this production style is the best way to convey VR from a third-person view.
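At its core, this style of compositing is a chroma-key operation: pixels that match the green screen in the live camera feed are replaced with the rendered game-engine frame. A minimal sketch in Python with NumPy (the function name and threshold are illustrative, not LIV’s actual pipeline):

```python
import numpy as np

def chroma_key_composite(camera_frame, game_frame, green_threshold=90):
    """Replace green-screen pixels in the live camera frame with the
    rendered game-engine frame. Both frames are HxWx3 uint8 RGB arrays."""
    r = camera_frame[..., 0].astype(int)
    g = camera_frame[..., 1].astype(int)
    b = camera_frame[..., 2].astype(int)
    # A pixel counts as "green screen" when green strongly dominates
    # both other channels.
    mask = (g - np.maximum(r, b)) > green_threshold
    composite = camera_frame.copy()
    composite[mask] = game_frame[mask]
    return composite
```

Real tools refine this with soft mattes, spill suppression, and per-pixel depth so virtual objects can pass in front of the user, but the keying idea is the same.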

Our HaptX Gloves announcement video from 2017, featuring the prototype HaptX Glove.

Producing a professional MR video requires a rare combination of technical savvy in VR software and traditional filmmaking chops. Not many people are experienced with this production technique. Then again, not many people are like Az Balabanian.

Az is a renaissance man in today’s experience age. He fuses art and science into each of his creative undertakings (of which there are many). Az produces photogrammetry scans, captures aerial video with his DJI drone, and hosts the Research VR podcast* – all in a day’s work. When we contacted Az about directing our HaptX Gloves DK video, we were thrilled that he was up for the task.

Compositing live with LIV

As we planned the production with Az, we were presented with another creative choice: do we composite the video in post-production, or in real time? Many high-end MR videos use meticulous editing, rotoscoping, and CGI compositing in post-production to perfect the final result. While this can look more polished, we felt compositing after the fact could come off as dishonest: the audience wouldn’t know the extent of the CGI, or how much of it was faked.

To maintain the integrity of the video, we decided that all compositing would be handled in real time as we filmed. Aside from adding on-screen text, we did not composite footage in post-production. Additionally, we would film using a functioning HaptX Gloves Development Kit (no props) and feature VR environments developed with the HaptX SDK in both Unity and Unreal. Thankfully, we found the right tool for the job: LIV.

Early test footage shot in our San Luis Obispo lab.

Early test footage of live compositing with LIV and our HaptX Gloves Development Kit.

LIV is a tool that merges game-engine environments with green-screen footage from a live camera. Mounting an HTC Vive Tracker onto our camera allowed us to capture and sync camera movements between physical space and the virtual world.
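The tracker-to-camera sync amounts to applying a fixed calibration offset (from the tracker mount to the camera lens) to every tracked pose, so the virtual camera moves in lockstep with the physical one. A sketch using 4x4 homogeneous matrices (function names and values are illustrative; LIV performs this calibration internally):

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def virtual_camera_pose(tracker_pose, tracker_to_lens):
    """Given the Vive Tracker's world-space pose (4x4 matrix) and a fixed
    calibration transform from the tracker mount to the camera lens,
    return the virtual camera's world-space pose."""
    return tracker_pose @ tracker_to_lens
```

In practice the tracker-to-lens transform is measured once during calibration, then reapplied to every tracked frame as the camera moves.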

We enlisted Tyler Hushing, a software engineer on our team, to oversee the technical side of the production. “Once we decided to use LIV for the compositing, I started talking to their team through their Discord channel,” said Tyler. “LIV’s support was astounding. Their team is based all over the world, so anytime I needed to ask a question, there was always someone to talk to. We had both Unity and Unreal projects featured in the video, and they were incredibly helpful with integrating their plugins for both engines.”

Let the testing begin

As we planned the video, it dawned on us that we were doing something that had never been done before. We were introducing brand-new hardware to an already complex production process. Would any of this work? If it did work, would it look cool?


Keenan, one of HaptX’s software engineers, smashes UFOs in our office and handles a virtual spider with his hands. 

To find out, Az stopped by our San Luis Obispo lab to test the live compositing setup with his camera rig. This was the moment of truth: if this didn’t work, we’d have to go back to the drawing board and produce an entirely different video.

Fortunately, luck was on our side: not only did everything work, it looked awesome. We were ready to shoot!

The film shoot itself took place in Los Angeles at VR Scout’s green screen studio. As we set up equipment, a few of our friends swung by to get an early look at HaptX Gloves. Among them: Reggie Watts, the amazing comedian, musician, and technologist.

“The future of cinematography”

“Figuring out how to visually show a sense of touch was quite a creative challenge,” said Az Balabanian. “By using Mixed Reality camera tools like LIV, we were able to test shots and ideas on site, and since everything was composited in real time, we were able to tweak camera movements, lighting, and the blocking to get the shot we wanted. It really felt like we were working on the future of cinematography.”

The video features our HaptX Gloves interacting with real VR content that our team created. We showed three experiences: a fire pump panel training scenario, an automobile that emphasizes design applications, and a farm environment that shows the entertainment potential of HaptX Gloves.

If you pay close attention, you can get a glimpse of how HaptX Gloves work. When a hand squeezes the wrench, blue beams indicate the force feedback applied to the fingers. When the hands rest against the grass, you see the 260 tactile actuators across the two gloves apply pressure to the user’s palms and fingertips. When the index finger taps the edge of a star, you see our magnetic motion tracking system capture hand movement with sub-millimeter precision.

While this video is only a minute long, we packed it full of content featuring VR experiences we made in-house. Looking ahead, as our customers develop with HaptX Gloves, we hope to show off their amazing work in videos to come.

A big thank-you to Az Balabanian (Director), Austin Smagalski (Editor), and Jonathan Nafarrete (VR Scout Editor and Studio Manager) for making this video possible. Subscribe to our YouTube channel to keep up with the latest videos from HaptX.

*The Research VR Podcast recently featured Jake Rubin, HaptX Founder and CEO. Click here to listen to the conversation.