Face-time with Grace

Realizing a living, breathing photoreal digital avatar is a tough task. But what if that avatar also needs to be something you can interact with, in real time, in VR?

Well, that’s exactly what John MacInnes and his collaborators took on to make ‘Grace’, a real-time cyborg character created as a demo project that spanned not only VR, but also AR and 4K video.

The project was designed as both an R&D effort and a proof of concept for a VR music video. In it, the character Grace, covered in jewels and a mix of real skin and cyborg metal, dances around a beam of light.

Crafting Grace’s realistic facial animation, a challenging aspect of any real-time avatar, was made possible by a unique combination of Faceware Live (now Studio) and Glassbox’s Live Client for Unreal Engine.

These specialist virtual production tools allowed the facial animation to be streamed live and translated instantly onto the digital model of Grace.

Making Grace dance (and sing)

MacInnes, who has been working on real-time humans for several years, started the Grace project in 2015.

“I self-funded the demo to show what a real-time photorealistic digital human character could look and perform like in 6DOF room-scale VR, something that would have been impossible with a traditional VFX pipeline. It was a music/dance performance I thought would tick most of the boxes.”

Several collaborators were involved, including producer Scott Gagain, modelers Lukas Hajka and Salim Ljabli, and rigger Tim Coleman, who rigged Grace’s body; the head rig came from Snappers.

Simone Lombardo and Richard Hirata handled optimization and Unreal Engine integration. Vince Argentine at Rouge Mocap dealt with motion capture, while Crispin Broadhurst was responsible for clean-up and additional animation.

“Meanwhile,” advises MacInnes, “Grace Holiday was the choreographer and dancer and I named the character after her in recognition of her amazing performance. However, Grace wasn’t just dancing, she also needed to sing.”

That’s where Faceware and Glassbox’s Live Client for Unreal Engine came in. First, MacInnes recruited Katie-Jo Turk to capture Grace’s facial performance. It was important that audiences could establish an emotional connection with Grace in the VR experience, so a realistic performance was key.

Turk relied on a Faceware ProHD Headcam to record her face. The first step was a quick neutral-pose calibration in the Faceware software; the tracked performance was then streamed to Unreal Engine through Live Client.

The result: Turk’s face was automatically tracked from the video and streamed as real-time animation onto the CG model of Grace, directly in Unreal Engine, where the experience was built.
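The plugin handles the capture-to-engine plumbing internally, but the final step, driving a character’s blendshapes from streamed values, maps onto standard Unreal Engine C++. The following is a minimal sketch of that idea, not the actual Grace setup: FFacialFrame and GetLatestFrame are hypothetical stand-ins for whatever the streaming layer delivers, while SetMorphTarget is regular engine API.

```cpp
// Hypothetical sketch: applying streamed facial-capture values to a character
// in Unreal Engine. FFacialFrame and GetLatestFrame() are stand-ins for the
// data a streaming plugin such as Live Client delivers; they are not its API.

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "FacialDriverComponent.generated.h"

// One frame of tracked facial data: morph target names mapped to 0..1 weights.
USTRUCT()
struct FFacialFrame
{
    GENERATED_BODY()

    UPROPERTY()
    TMap<FName, float> Weights;
};

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UFacialDriverComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UFacialDriverComponent()
    {
        PrimaryComponentTick.bCanEverTick = true; // Poll for new frames every tick.
    }

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

        USkeletalMeshComponent* Mesh =
            GetOwner()->FindComponentByClass<USkeletalMeshComponent>();

        FFacialFrame Frame;
        if (!Mesh || !GetLatestFrame(Frame))
        {
            return; // No mesh to drive, or no new capture data this tick.
        }

        // Drive the head's blendshapes; SetMorphTarget is standard engine API.
        for (const TPair<FName, float>& Pair : Frame.Weights)
        {
            Mesh->SetMorphTarget(Pair.Key, FMath::Clamp(Pair.Value, 0.f, 1.f));
        }
    }

private:
    // Hypothetical accessor; a real capture plugin would supply this data.
    bool GetLatestFrame(FFacialFrame& OutFrame) { return false; }
};
```

A production head rig like Grace’s would typically be driven by more than raw morph targets, but the per-tick flow, new tracked values arriving and being applied immediately, is the same.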

By establishing the facial animation set-up in this way, the team could easily ‘tinker’ with the performance and make adjustments as necessary, since they could view it in real-time and in near-final photoreal form.

Grace was first revealed at VRLA 2016, where audience members could view the experience in HTC Vive headsets.

“There was a lot of buzz around VR back then but there wasn’t anything else like Grace at the exhibition,” attests MacInnes. “I was amazed that people waited for over an hour to have 4 mins with Grace, and it was extremely gratifying to see their faces when they took off the HMD.”

“I intended Grace to be a calling card rather than a final product,” continues MacInnes. “At one point I was thinking of releasing the experience on Steam but by then we had moved on to other projects. However, HTC Vive, AMD and Nvidia all showcased Grace around the world to demonstrate their products and promote VR.”

Connecting with Grace

The success of the project inspired a number of people to explore digital human VR experiences more widely, including MacInnes himself.

“Looking back,” he says, “I’m amazed we even attempted it, let alone succeeded! The technology was lining up, and at a price that was somewhat consumer friendly, but we were attempting to make something with a pipeline that we were working out as we went, in a medium that didn’t really exist before, and to a level that nobody had ever seen.”

Luckily, the Grace experience had been specifically designed to work within the limitations of the time. For example, as a cyborg, the character had no cloth sims, just shiny hard surfaces. Her environment did not require natural photorealism, and as a musical piece, her dancing could be somewhat otherworldly.

However, Grace had to remain expressive when she was singing, and that’s where MacInnes and his team were able to capitalize on the combination of Faceware and Live Client for Unreal Engine.

While there were many things to solve technically, one reason this particular facial animation workflow was chosen, says MacInnes, was that it was accessible, fast to set up, and the easiest solution to use. “Faceware had the best solution for what we were trying to do, and Live Client for Unreal made it easier to tweak the rig motion logic live in the engine.”
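To give a flavor of what tweaking that logic live might look like, here is a hypothetical sketch, again not the Glassbox API: a per-channel gain and offset exposed as editable properties, so an animator can rebalance how a tracked channel drives the rig while the stream is running.

```cpp
// Hypothetical sketch of live-tunable remapping for one streamed facial
// channel. FChannelRemap is illustrative and not part of any plugin API.

#include "CoreMinimal.h"
#include "FacialChannelRemap.generated.h"

USTRUCT(BlueprintType)
struct FChannelRemap
{
    GENERATED_BODY()

    // Editable while the engine runs, so the mapping from the performer's
    // face to the character's rig can be rebalanced without restarting.
    UPROPERTY(EditAnywhere, meta=(ClampMin="0.0", ClampMax="4.0"))
    float Gain = 1.f;

    UPROPERTY(EditAnywhere, meta=(ClampMin="-1.0", ClampMax="1.0"))
    float Offset = 0.f;

    float Apply(float RawWeight) const
    {
        // Scale and bias the tracked value, keeping it a valid morph weight.
        return FMath::Clamp(RawWeight * Gain + Offset, 0.f, 1.f);
    }
};
```

A channel that reads too hot on the performer, say a jaw-open value, could then be dialed down mid-session without recalibrating the capture.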

“Glassbox’s Live Client for Unreal is like all great technology—deceptively simple,” adds MacInnes, who now runs MacInnes Studios, specializing in virtual beings, digital avatars and virtual production. “It turns something that until now has been perilously difficult into something intuitive. It’s a credit to the team at Glassbox that the problems they’ve solved will be invisible to Live Client users.”