Batman Mixed Reality Demo

At this year’s Mobile World Congress (MWC) it seemed every mobile network operator on the planet was hawking the next hotness in data connectivity, the magical alphanumeric ‘5G’. It’s true 5G is slated to usher in plenty of changes in how users consume mobile content thanks to a dramatic increase in bandwidth and lower latency, but if you’re salivating over what that means for VR gaming in the near future, you may want to step back a bit.

Cloud gaming isn’t a new concept, at least in the world of traditional flatscreen games. Nvidia has GeForce Now, Sony has PlayStation Now, and both Google and Microsoft have their own cloud gaming projects in the works too. While the infrastructure around gaming-focused edge computing is still very much in its early stages, requiring companies to maintain servers as close to the end user as humanly possible, the hypothetical benefit to gamers is obvious: extremely low-powered computers can stream games previously available only on top-of-the-line rigs.

Mobile network operators like Verizon, Sprint, AT&T, T-Mobile, Vodafone and many others are signing on to paint the world with 5G starting this year. An increasing number of users will soon have access to data faster than even at-home fiber optic cables can provide, making the migration from wired to wireless almost a foregone conclusion in the minds of many. To wit, some of these companies publicly showed off VR cloud gaming actually working for the first time. That's one of the most difficult problems in the space, since VR games need to keep latency at or below roughly the 20ms threshold, above which users start to notice lag.
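To see why that 20ms ceiling is so punishing for cloud rendering, it helps to sketch the latency budget. The stage timings below are illustrative round numbers I've assumed for the sake of the arithmetic, not measured figures from any of these demos:

```python
# Rough motion-to-photon latency budget for cloud-rendered VR.
# All per-stage timings are hypothetical, for illustration only.
THRESHOLD_MS = 20  # commonly cited comfort threshold for VR

# Stages a local (headset-tethered) pipeline already pays for
local_stages = {
    "tracking + render": 8,
    "display scan-out": 5,
}

# Extra stages cloud rendering adds on top
cloud_extra_stages = {
    "video encode": 4,
    "network round trip (5G edge)": 5,
    "video decode": 3,
}

local_total = sum(local_stages.values())
cloud_total = local_total + sum(cloud_extra_stages.values())

print(f"local-only pipeline:     {local_total} ms")
print(f"cloud-rendered pipeline: {cloud_total} ms")
print(f"within {THRESHOLD_MS} ms budget: {cloud_total <= THRESHOLD_MS}")
```

Even with generous assumptions, the added encode/transport/decode stages eat most or all of the headroom a local pipeline leaves, which is why techniques like timewarp get leaned on so heavily.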

Stepping into Ericsson’s enormous MWC booth, second in size only to Huawei’s at over 6,000 m² (~65,000 ft²), I cautiously ambled over to an abandoned station outfitted with a Vive Pro and Vive Wireless Adapter. The booth attendant claimed the “mixed reality” Batman experience, built in partnership by AT&T, Ericsson, Warner Bros. and Intel, was delivering a total latency of 4–6 ms through their mock-up 5G network. That’s about as low as you can reasonably expect, so I was excited to pop in and see for myself.

Although the real-time rendered experience didn’t suffer any discernible latency, it was an absolute failure at demonstrating why VR users want cloud gaming in the first place. In short: it was hot garbage.

With video from the Vive Pro’s passthrough cameras placed as a backdrop behind my head (that’s totally “mixed reality,” right?), what I experienced was essentially a 180-degree mess. I was treated to extremely low-poly graphics that looked about on par with what can be accomplished on a mobile VR headset like Gear VR or Oculus Go. Adding insult to injury, the two-minute experience, which featured Batman stopping the Scarecrow from—no joke—using 5G for evil, was presented to me in 3DOF rather than the full positional tracking the Vive Pro is capable of. I was also told there was an interactive bit using a single Vive controller, but the booth attendants removed it because “nobody understood what to do.”

After seeing it in 3DOF and without any interactivity, I was pretty skeptical about whether it was actually a real-time rendered experience or just a 180-degree stereoscopic video. I was assured it was all real-time.

I imagine this was done for the singular reason of showing off the setup’s lowest possible latency. It’s not an insubstantial achievement from a technical standpoint either, but low latency is as good as useless if this is the sort of toothless VR content AT&T, Intel, Ericsson and Warner Bros. think will fit a real-world use case. Cutting literally every possible corner on content to get latency down to something you can proudly advertise as ostensibly solved borders on willful deception.

There was a company at MWC pushing a more realistic version of VR cloud gaming though, warts and all. Two days earlier I got a chance to visit HTC’s booth where they were showing a similar setup streaming Superhot VR (2017) to a Vive Focus Plus over a mock-up 5G network. Although the implementation was far from perfect, it at least showed real SteamVR content running in the cloud, and delivered in 6DOF like you’d expect.

HTC’s streaming latency was well above 20ms, and it seemed to be heavily relying on time warp to keep things smooth. To me, it further drives home the fact that even in controlled environments with purpose-built networks completely dedicated to the task of remotely rendering VR games, there’s still a long way to go before we get plug-and-play VR cloud gaming.

While Ericsson’s demo failed to sell the core idea behind the technology, it did manage to unwittingly reveal that VR cloud gaming is going to be an extreme balancing act whenever it eventually arrives.
