For all the appeal virtual reality has as an entertainment medium, one big factor has been lacking: avatars that look truly human, from their overall appearance down to their facial twitches.

That’s no longer the case.

Virtual reality is getting more human through the “Meet Mike” project, named in part for one of the researchers. Developers at Epic Games, which is based in Cary, N.C., are unveiling technology at a conference this week that brings what they call “photorealistic interaction” to virtual reality. Working with Mike Seymour, a researcher at the University of Sydney in Australia, Epic’s Chris Evans and Kim Libreri have published a paper about the transformation of VR avatars into more human representations.

The research has produced “one of the most realistic interactive real-time facial experiences yet seen,” the researchers say.

The results are being called “stunning” by some in the entertainment media as the work is demonstrated at the SIGGRAPH 2017 conference.

“Meet Mike uses the latest techniques in advanced motion capture to drive complex facial rigs to enable detailed interaction in VR,” the three write in the paper.

“This allows participants to meet, talk in VR and experience new levels of photorealistic interaction. The installation uses new advances in real time rigs, skin shaders, facial capture, deep learning and real-time rendering in UE4.” [UE4 refers to Unreal Engine 4, the game development engine made by Epic.]

  • VIDEO: Watch a demo of “Meet Mike” at https://www.youtube.com/watch?v=6MIkoLBWRv0
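To make the idea of motion capture driving a FACS-based rig concrete, here is a minimal sketch in C++. It is illustrative only and not Epic’s code: it assumes each FACS action unit stores per-vertex offsets from a neutral face mesh, and that the tracked weight for each unit simply scales and sums those offsets onto the neutral pose every frame.

```cpp
// Minimal sketch, not Epic's actual code: captured FACS weights driving a face mesh.
// Each FACS action unit (e.g. a "lip corner puller") stores per-vertex offsets from
// the neutral pose; the tracked weight for that unit, in [0, 1], scales those offsets,
// and the results are summed onto the neutral mesh each frame.
#include <cstdio>
#include <string>
#include <unordered_map>
#include <vector>

struct Vec3 { float x, y, z; };

struct ActionUnit {
    std::string name;          // e.g. "AU12_LipCornerPuller" (illustrative naming)
    std::vector<Vec3> deltas;  // per-vertex offset from the neutral mesh
};

// posed = neutral + sum_i(weight_i * deltas_i)
std::vector<Vec3> ApplyFacsWeights(const std::vector<Vec3>& neutral,
                                   const std::vector<ActionUnit>& rig,
                                   const std::unordered_map<std::string, float>& weights)
{
    std::vector<Vec3> out = neutral;
    for (const ActionUnit& au : rig) {
        auto it = weights.find(au.name);
        if (it == weights.end() || it->second == 0.f) continue;
        const float w = it->second;
        for (size_t v = 0; v < out.size() && v < au.deltas.size(); ++v) {
            out[v].x += w * au.deltas[v].x;
            out[v].y += w * au.deltas[v].y;
            out[v].z += w * au.deltas[v].z;
        }
    }
    return out;
}

int main() {
    // A tiny two-vertex "mesh" and a single action unit, just to show the blend.
    std::vector<Vec3> neutral = { {0.f, 0.f, 0.f}, {1.f, 0.f, 0.f} };
    std::vector<ActionUnit> rig = {
        { "AU12_LipCornerPuller", { {0.f, 0.1f, 0.f}, {0.f, 0.1f, 0.f} } }
    };
    std::unordered_map<std::string, float> captured = { { "AU12_LipCornerPuller", 0.5f } };

    std::vector<Vec3> posed = ApplyFacsWeights(neutral, rig, captured);
    std::printf("vertex 0 y-offset after blend: %.2f\n", posed[0].y);  // prints 0.05
    return 0;
}
```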

In a photo, Seymour wears production gear that makes him look like a Borg from “Star Trek: The Next Generation.”

“Algorithmic advances combined with extremely high resolution scanning will allow for the first time, participants … to meet in a virtual world with highly interactive and detailed life-like Avatars,” the team wrote.

Using the Unreal engine, one of the most advanced and widely used across the global video game industry, the team says it has cracked the technical challenges of making avatars look more human.

“Due to technical constraints, the research has been limited to the exploration of realistic face to face expressive emotional communication,” they write.

The Unreal solution

So they called on Unreal.

“The UE4 engine includes high-quality lighting features, new skin shaders and stereo rendering which produces crisp, detailed VR images,” they explain. “This is coupled with extremely complex and detailed facial and eye rigs, based on extensive input scanning from the USA and Europe. Together they provide one of the most realistic interactive real-time facial experiences yet seen.”

Building on Seymour’s research that earned a “Best of SIGGRAPH” award a year ago, the team says Meet Mike “allows users to experience the subtlest performances and renderings of a real-time character yet seen, with eye contact and complex nuanced emotional feedback loops. A group of people will be able to simultaneously meet in a VR space and experience a conversation between the advanced real time digital avatar.”
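The eye contact the team describes can be thought of as a simple geometric test, sketched below. This is a hedged illustration, not the paper’s method: it assumes an avatar is “making eye contact” when its gaze direction lies within a small angle of the line from its eyes to the viewer’s headset.

```cpp
// Illustrative only, not from the paper: one way eye-contact detection can work
// in a shared VR scene. If the angle between the avatar's gaze direction and the
// vector toward the viewer's headset falls under a threshold, the avatar is
// treated as making eye contact.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(Dot(v, v));
    return len > 0.f ? Vec3{ v.x / len, v.y / len, v.z / len } : Vec3{ 0.f, 0.f, 0.f };
}

// Returns true when the avatar's gaze points at the viewer within maxAngleDeg.
bool IsMakingEyeContact(Vec3 avatarEyePos, Vec3 avatarGazeDir,
                        Vec3 viewerHeadPos, float maxAngleDeg)
{
    Vec3 toViewer = Normalize(Sub(viewerHeadPos, avatarEyePos));
    float cosAngle = Dot(Normalize(avatarGazeDir), toViewer);
    return cosAngle >= std::cos(maxAngleDeg * 3.14159265f / 180.f);
}

int main() {
    Vec3 eyePos{ 0.f, 1.7f, 0.f };        // avatar eye height in meters (assumed)
    Vec3 gaze{ 0.f, 0.f, 1.f };           // avatar looking straight ahead
    Vec3 viewer{ 0.1f, 1.65f, 2.f };      // viewer's headset roughly in front
    std::printf("eye contact: %s\n",
                IsMakingEyeContact(eyePos, gaze, viewer, 10.f) ? "yes" : "no");
    return 0;
}
```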

Creating a realistic human avatar has not been a task for the meek.

“The project highlights the latest advances in VR Stereo rendering, skin shaders and complex FACS based rig models inside the Unreal Engine,” they write.

“The system runs on a network of VIVE VR stations and provides a look at unparalleled engagement and virtual presence, while illustrating multiple levels of facial tracking and rendering.”

Researchers from numerous other companies, including Disney, as well as from the University of Southern California, also participated in the project.

Highlighting the challenges

A series of bullet points included in the paper highlights the complexity of the project:

• FACS real time facial motion capture and solving

• Models built with the Light Stage scanning at USC-ICT

• Advanced real time VR rendering

• Advanced eye scanning and reconstruction

• New eye contact interaction / VR simulation

• Interaction of multiple avatars

• Character interaction in VR at suitably high frame rates

• Shared VR environments

• Lip Sync and unscripted conversational dialogue

• Facial modeling and AI assisted expression analysis.

Read the full report at:

http://dl.acm.org/citation.cfm?id=3089269.3089276