VR 2017: Technology Hurdles and Industry Opportunities
by Chris Wren

As we march into 2017, let’s take stock of what happened this past year, “2016: The Year of VR,” look ahead at how the market will develop, and then drill down to see what this means for employment and business prospects within the sector. The dawn of consumer VR arrived last year: three major premium headsets launched, and we are now talking about millions of premium VR customers for the first time.

Oculus Rift & HTC Vive

In the spring of 2016, we had two major VR headset releases. It began at CES this time last year, when Oculus announced pre-orders for the Rift and revealed that it would ship with an Xbox 360 controller and basic positional tracking, good for up to nine feet with a single camera. Around the time the Rift should have shipped, HTC announced that the Vive was shipping. The race was on. Part shortages and manufacturing delays at Oculus left some customers almost two months past their shipping window. Oculus was kind enough to refund shipping costs for those who had to wait, but the delay was costly in terms of public sentiment. What arrived by early summer was a lightweight, high-definition VR headset with built-in headphones and a modest selection of games, including two titles bundled for pre-order customers (EVE: Valkyrie and Lucky’s Tale).

[Tweet “The dawn of consumer VR happened last year, with three major premium headsets launched.”]

HTC picked up the slack in the Oculus supply chain and had plenty of Vives to deliver within three weeks of ordering. Unlike the Rift, the Vive came with robust positional tracking in the form of the Lighthouse system, complete with a “chaperone” boundary to keep players from running into walls. This enabled impressive walk-around VR experiences that were hard to match on the Rift. The Vive also shipped with two positionally tracked six-DOF (degrees of freedom) controllers at launch, something Oculus had promised but did not deliver until late last year. There were some nice exclusives available with Vive pre-orders as well (Fantastic Contraption, Tilt Brush, and Job Simulator).

On the platform side, one difference between Oculus and HTC Vive was the nature of their digital distribution platforms: the Oculus Store and Steam. Steam has made itself headset agnostic, happily supporting both Vive and Rift versions of games on its online store, supporting both headsets in its Steamworks SDK, and even allowing the Oculus Touch controllers to be tracked through SteamVR. Oculus, on the other hand, has kept its store exclusive to Oculus-only versions of games and apps.

Sony Enters VR

The PlayStation VR launched in the US in October to largely positive results. It has the advantage of being a much cheaper solution than the premium HTC Vive or Oculus Rift headsets: an install base of over forty million PlayStation 4 owners only needs an additional $400-$500 investment to gain a very robust VR experience. Furthermore, Sony brought the games. The Sony library of complete game experiences rivaled both HTC Vive and Oculus right out of the gate. It is still very early in this product’s release, but it looks promising, and while it is not the premium PC VR experience, it is a major step up from the mobile headsets and very accessible to a large market.

[Tweet “The Sony library of complete game experiences rivaled both HTC Vive and Oculus out of the gate.”]

Microsoft VR for Windows 10

Just recently, Microsoft announced a “line” of six-DOF VR headsets for Windows 10 starting at $299 from manufacturers like HP, Lenovo, Dell, Acer, and Asus. That is half the cost of the Oculus and Vive headsets (minus the controllers). These headsets are tethered to a PC, but they do not use an external tracking system like the Vive’s Lighthouse or the Rift’s camera-based positional tracking; instead they use “inside-out” tracking like their high-end AR counterpart, the HoloLens. Tracking in the demo seemed passable at the time, but whether it can compete with the accuracy of the Lighthouse system remains to be seen.

Chinese Tethered VR

There are dozens of PC-based Chinese headsets coming onto the market. Whatever interest the US and Europe may have in VR, China has it times ten. Several of these PC-tethered headsets have very impressive specifications, outpacing the FOV (field of view) and resolution of the Oculus Rift and HTC Vive. The inclusion of headphones, controllers, and tracking setups varies among them. Some of the more notable examples are Pimax, Pico, DeePoon, and 3Glasses.

Social VR

Social VR took a few steps forward in 2016, but don’t cancel your Snapchat account just yet. Notably, there were really nice initial offerings from vTime, Rec Room, and BigScreen VR, all of which let players take on an avatar persona and chat online with other people in VR spaces. Very recently we got a nice glimpse into the future of social VR with the Oculus demo at OC3, where Mark Zuckerberg donned a Rift in front of an audience and connected with some of his co-workers in a virtual chat room, immersed in a 360 video surrounding of his home. There they played games, brought in a video call from his wife at work, and took a VR photo of Zuckerberg, his wife on the video call, and his dog, who was sitting on the couch in the live 360 feed at the Zuckerberg estate, and he then posted it to Facebook.

VR Development

Both the Unity and Unreal game engines tried their hands at creating a development environment inside VR. Between the two, the Unreal toolkit is a little further along, but both need a lot of design work and improvement, and neither seems to have surpassed the productivity of a traditional development environment. Nonetheless, they both hold some promise for a time when we will prefer to develop with the headsets on.

CryEngine is also worth noting in the game engine space. While its toolkit is arguably less friendly than some of its competitors’, it has put forth three solid VR offerings that are competitive and visually compelling: The Climb, Robinson: The Journey, and Ark.

Looking Ahead

Cutting The Wires

We already have mobile wireless VR with Cardboard, Gear VR, and a hundred headsets in between, all of which use your phone as the screen and the computer driving the experience. These have been successful in getting a basic VR experience out to a large audience (on the order of twenty million Cardboard-type headsets and over two million Gear VRs). However, for anyone who has tried a premium tethered experience like the Vive or Rift, the current mobile experience falls short in a couple of key areas. Notably, it does not yet have 6DOF (six degrees of freedom) tracking, the ability to follow both the positional and rotational movement of the user. Phones have sensors like gyros and accelerometers that make rotational tracking possible, but the push lately has been to figure out positional tracking for mobile VR.
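To make the 3DOF/6DOF distinction concrete, here is a minimal sketch (not any vendor’s actual SDK) of the rotation-only tracking a phone can manage from its gyroscope alone: integrating angular-velocity samples into an orientation quaternion. Real mobile headsets fuse in accelerometer and magnetometer data to correct drift, which this sketch omits, and it produces no positional data at all, which is exactly the gap the industry is trying to close.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(orientation, angular_velocity, dt):
    """Advance a (w, x, y, z) orientation quaternion by one gyro sample.

    angular_velocity is (wx, wy, wz) in rad/s; dt is the sample period in
    seconds. Uses the first-order update q <- q + 0.5 * dt * (q * omega),
    then renormalizes to stay a unit quaternion.
    """
    omega = np.array([0.0, *angular_velocity])
    q_dot = 0.5 * quat_multiply(orientation, omega)
    orientation = orientation + dt * q_dot
    return orientation / np.linalg.norm(orientation)

# Hypothetical 200 Hz gyro stream: a slow yaw of 0.5 rad/s held for one second.
q = np.array([1.0, 0.0, 0.0, 0.0])        # identity: headset facing forward
for _ in range(200):
    q = integrate_gyro(q, (0.0, 0.5, 0.0), dt=1.0 / 200)
print(q)  # rotation only -- no position, which is why this is 3DOF, not 6DOF
```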

There are several solutions on the market already. We’ve seen offerings from Intel and Occipital in the past year, and Oculus just demonstrated its “Santa Cruz” headset to reporters at OC3, which uses “inside-out” tracking similar to Microsoft’s HoloLens. Graphics processing is the other major drawback of current mobile experiences: they simply cannot compete with their PC-tethered counterparts. Frame rate is so critical to VR that hitting it on mobile requires many sacrifices to lighting, textures, and geometry just to deliver a passable experience. Mobile processors are getting better all the time, but there is still a large gap to close before the PC and mobile experiences converge in fidelity.

[Tweet “Graphics processing is the other major drawback of current mobile experiences.”]
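The frame-rate constraint is easy to quantify. A rough back-of-the-envelope comparison, assuming the published 2160x1200, 90 Hz panels of the Rift and Vive against a hypothetical 2560x1440, 60 Hz phone display, shows that the pixel workload is comparable even though the mobile chip has only a fraction of a desktop GPU’s compute and thermal budget:

```python
def per_frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def pixel_throughput(width, height, refresh_hz):
    """Pixels that must be shaded per second to hold the refresh rate."""
    return width * height * refresh_hz

# Premium PC headsets (Rift / Vive): 2160x1200 combined, 90 Hz.
print(per_frame_budget_ms(90))              # ~11.1 ms per frame
print(pixel_throughput(2160, 1200, 90))     # ~233 million pixels per second

# A hypothetical phone-based headset: 2560x1440 panel at 60 Hz.
print(per_frame_budget_ms(60))              # ~16.7 ms per frame
print(pixel_throughput(2560, 1440, 60))     # ~221 million pixels per second
```

Similar pixel loads on vastly weaker hardware is precisely why mobile titles cut lighting, textures, and geometry.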

Wireless may hold the promise of solving both of these problems, but that remains to be seen. There are millimeter-wave solutions to tracking that may allow very accurate tracking of high-end VR experiences in large open spaces. We may also soon bridge the gap in wireless transmission, allowing a 90 FPS (frames per second) experience to stream from a nearby computer so the user can be untethered while a high-end GPU (graphics processing unit) still drives the experience. Several companies are working on both of these problems right now, and this year will see many of them come to market.
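A quick estimate of the raw signal involved shows why the transmission side is hard, and why millimeter-wave links and aggressive low-latency compression dominate the proposed solutions (again using the Rift/Vive panel specs as an illustrative baseline):

```python
def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw video bandwidth in gigabits per second, before any compression."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Rift/Vive-class signal: 2160x1200 combined, 90 Hz, 24-bit color.
print(uncompressed_gbps(2160, 1200, 90))   # ~5.6 Gbps, beyond typical Wi-Fi
```

Uncompressed, that stream is several times what conventional Wi-Fi carries, which is why purpose-built 60 GHz radios or on-the-fly video compression appear in most of the wireless VR efforts.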

The Move Toward Standalone Mid Tier VR

Many companies are currently working on standalone wireless VR headsets. These are a mid-tier solution, on par with Samsung’s Gear VR in fidelity but not requiring a phone. Google Daydream is probably the biggest validation of this class of headset: Google’s development of a phone, a headset, and a platform is a large investment aimed at upgrading the Cardboard class to a new level. Included with the Daydream headset and SDK is a 3DOF remote-like controller that allows for point-and-click in VR. There is no plan for Daydream to support positional tracking at this time.
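For a sense of how point-and-click works with orientation-only input, here is a small sketch of the usual approach: cast a laser-pointer ray whose direction comes from the controller’s rotation, while its origin is simply assumed from an “arm model,” since a 3DOF controller reports no position. This is generic geometry, not Daydream’s actual SDK, and the hand offset below is a made-up placeholder.

```python
import numpy as np

def rotate_vector(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    v = np.array(v, dtype=float)
    # Standard expansion of q * v * q^-1 for a pure-vector quaternion.
    return 2.0 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2.0 * w * np.cross(u, v)

def pointer_ray(controller_orientation, hand_offset=(0.2, -0.3, -0.3)):
    """Return (origin, direction) for a laser-pointer ray from a 3DOF controller.

    With no positional tracking, the origin is a fixed guess for where the hand
    rests relative to the head (the "arm model"); only the direction comes from
    the controller's orientation. Forward is -Z, a common graphics convention.
    """
    direction = rotate_vector(controller_orientation, (0.0, 0.0, -1.0))
    return np.array(hand_offset), direction / np.linalg.norm(direction)

# Controller held level and pointing straight ahead: identity orientation.
origin, direction = pointer_ray(np.array([1.0, 0.0, 0.0, 0.0]))
print(origin, direction)   # ray starts at the assumed hand position, points along -Z
```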

Similarly, Intel’s “Project Alloy” is a standalone mixed reality headset. Alloy is like the Microsoft HoloLens, but instead of an optical see-through display it uses a stereoscopic camera feed to show the real world from inside a VR environment. Intel has said the design will be open sourced early this year, allowing other manufacturers to create versions of their own. No price has been mentioned, but with the addition of AR and tracking it will likely land above the average mid-tier class, which should be in the $300-$400 range. Tracking on this headset will use an “inside-out” solution.

[Tweet “Google Daydream is probably the biggest validation of this class of headset.”]

Tap The Senses

There are a few areas of sensory involvement in VR that saw a lot of activity last year. Audio and visuals, with some basic haptics, have been a mainstay of videogames for decades. VR developers are now trying to figure out how to reach the other senses and how to enhance the ones we already have access to. We’ll see a number of first-wave products this week at CES 2017.

Lightfield Synthesis

On the visual side, companies like Magic Leap and OTOY are working to bring lightfield synthesis into headsets. The lightfield is the “other” visual data we have largely ignored in PC experiences so far: a capture of how light interacts with an environment, how it bounces off objects, and at what angles the photons arrive at the eye. We haven’t missed it because we have never experienced it outside the real world, but it is critical to creating a lifelike, believable scene, one your brain accepts as real. A company called Lytro is at the forefront of building a 360-degree camera that captures and stitches all of this complex data. While the promise of including this in future VR experiences is exciting, there are still hills to climb to optimize such a massive visual data set to run in a real-time environment at an acceptable frame rate.
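For reference, the way the graphics literature usually formalizes this extra visual data is the plenoptic function, radiance as a function of position and direction; this is a standard textbook definition rather than anything specific to Magic Leap, OTOY, or Lytro:

```latex
% The 5D plenoptic function: radiance L arriving at position (x, y, z)
% from direction (\theta, \phi). A conventional rendered image samples
% only one viewpoint; a lightfield capture samples many.
L = L(x, y, z, \theta, \phi)

% In free space (no occluders between the capture surface and the viewer),
% radiance is constant along a ray, so this reduces to the 4D lightfield
% commonly parameterized by two planes:
L = L(u, v, s, t)
```

Sampling and storing even the reduced 4D version for a whole scene is what produces the massive data sets described above.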

Touch

There were many attempts to improve our sense of touch in VR this year. Where 2015 gave us the “Teslasuit” and “UltraHaptics,” last year we got “Skinterface” and “SubPac,” solutions that use sound to simulate physical interactions. Several companies are working on gloves and other localized haptic solutions to enhance the sense of touch. Both the HTC Vive controllers and the recently released Oculus Touch controllers have basic haptic vibration built in; while this is not new, it does help with the sense of presence in VR. A company called Tactical Haptics has put together a very convincing set of controller enhancements that use “shear” haptics to convey weight and tension in VR interactions.

[Editor’s Note: See the recently released sensor shirt from Xenoma]

Smell

The offerings this year for smell in VR have not really gone much beyond the “Smell-O-Vision” experiments of the early 1960s. The “Nosulus Rift” is a real thing, and thanks to the creators of South Park, we can now smell farts in VR. The FeelReal “Nirvana” is another peripheral that adds smell, as well as wind and heat, to your VR experience. These efforts, while interesting, make it clear that smell has a very long way to go before it finds a home in the consumer VR market. That said, the olfactory system is one of the oldest parts of the brain, and smell is tied to memory and experience more powerfully than perhaps any other sense, so this is a goal worth pursuing.

Movement

There are three major categories of effort in simulating movement in VR: treadmills, vestibular stimulation, and mixed reality.

The first is the treadmill solution. Products like the Omni and Infinitrak create tethered rigs that let you walk and run in any direction with a headset on while remaining in a very small area. These work, and in some cases are not too bulky or expensive, but all of these solutions have limitations that reveal the tethering, and there are some safety issues even with the devices that include harnesses to keep you contained.

The second solution for movement is vestibular stimulation. This technique originated with electrode stimulation just behind the ear, which fools your inner ear into perceiving movement; Samsung has since demonstrated a way of doing it using only sound waves and a pair of headphones. Imagine simulating falling, flying, driving, or earthquakes in VR without the player needing a lot of space or special contraptions. The promise for immersion is huge, but the technology is still early, and it will be exciting to see what this year brings in this space.

The third solution is actually having you move around in a mixed reality setting: you walk, drive, or ride somewhere, all while staying inside your virtual environment. Getting this to a viable state requires very accurate tracking in open environments, using techniques similar to driverless-car technology, and will also require advances in scanning and processing. This is probably the furthest technology out because of the number of dependent solutions needed to get there, but it seems the inevitable path AR and VR will take in the years ahead.

Scanning 

Google Tango, SoftKinetic, and Occipital brought depth sensing to AR and VR experiences over the past two years, and last year we started seeing the first phones with it built in. The technology used here is called “time of flight,” which uses pulses of light to measure the distance of objects in view; LIDAR is an example of how it has been used in the past. Now it is being used in real time to scan environments, objects, and people, capturing not only the shape of things but also their colors, texturing them very realistically. This technology has the power to recreate what it sees.
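The physics behind a time-of-flight sensor is just the round-trip travel time of a light pulse, a textbook relation rather than anything proprietary to Tango or the other products named here:

```latex
% Depth from time of flight: a light pulse leaves the emitter, reflects
% off the scene, and returns after time \Delta t. Because the pulse covers
% the distance twice, the depth is half the round trip at light speed c.
d = \frac{c \, \Delta t}{2}, \qquad c \approx 3 \times 10^{8}\ \text{m/s}

% Example: a return delay of \Delta t = 10\ \text{ns} gives
% d \approx \frac{(3 \times 10^{8})(10 \times 10^{-9})}{2} = 1.5\ \text{m}.
```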

The value of scanning the shapes of things is that we can interact with them in VR. For example, scanning in a keyboard or a pen would let someone pick up and interact with objects in their environment without ever seeing the “real” world or leaving VR. Elements within a VR experience can also interact with the environment: Microsoft’s HoloLens Project X-Ray demonstrated how gaming and software elements can interact with furniture and architecture. It will be a few years before we can scan, optimize, and render everything in real time, but the goal is there and many companies are working toward it. Michael Abrash used the term “Augmented Virtual Reality” in his keynote at OC3 to describe this process.

VR clearly has the ability to engage us, and it is not just the entertainment sector that is benefiting. Games and movies will of course advance heavily in VR in 2017, but the benefits reach beyond entertainment into fields such as health, construction, training, and shopping.

Hollywood

Directors like Steven Spielberg (Ready Player One) and Jon Favreau (Gnomes and Goblins) are dipping their toes into full VR movie experiences and validating the medium. Hollywood is taking this seriously. They recognize it as a new medium, one that evolves their current storytelling, ventures into areas where the rules are not yet defined, and is rich with new experiences to be explored.

Construction

Architects, city planners, and anyone building anything can benefit from VR visualization of their proposed developments. Stakeholders can walk around imagined spaces alongside their architects and builders, experiencing large developments long before the first shovel hits the dirt. This allows for very realistic presence in those environments and a real sense of what the final product will be, a far cry from the conceptual drawings and physical models that have been the mainstay of visualization in this sector.

Real Estate

Visualization of existing spaces like homes and commercial properties is greatly served by VR. High-resolution 360 photography and video allow buyers to walk around homes that may be thousands of miles away and try before they buy. This is really taking off, and we will see it evolve into “the” method for home buying over the next few years.

[Tweet “This is really taking off and we will see this evolve into ‘the’ method for home buying.”]

Music, Sports and Politics

This year we saw major sporting events in VR, we could attend music concerts from the front row and backstage, and we could be on stage with political candidates during a live debate. While all of these experiences had some shortcomings, they were firsts and experiments, and they all promise a future where we control how we want to experience an event. The future here is social interaction at events taking place far away: go see your favorite band from the comfort of your home, sitting right next to your friends, dancing and enjoying the show, with virtual tickets to the front row, like having a private sky box for every event.

Health

The health field has several areas that will benefit from VR in the very near term: diagnostics, pain management, and fear management. We have yet to see much in terms of distraction during diagnostic procedures, but it is a big opportunity for VR; in health, a good distraction can be more effective than a sedative or painkiller. VR has already been used during chemotherapy sessions and skin stretching for burn victims, proving as effective as, or more effective than, painkillers. For fear management, the simulation and presence of VR produce very real human reactions, allowing people to approach their fears in a safe, controlled environment. This year we saw surgery live streamed in VR, and VR used to train surgeons with interactive elements. MindMaze brought us a means of helping stroke victims recover in VR, and VR simulation is also being used to treat PTSD, its presence and realism providing a safe environment in which to work through the trauma of warfare. The health benefits of VR are really just starting to be tapped, and the future here is very bright.

Training

One of the largest umbrellas of VR potential is training. Simulating scenarios, machines, tools, environments, and people puts VR ahead of traditional in-person or video-based training solutions. While the fidelity of these experiences and the quality of the instructional design vary by developer, this is one area VR can safely claim for the future. Whether you want to fix a car, repair a furnace, solve a personnel problem, or learn a culture, VR can engage learners at a level never before possible. The training experiences to date have been lackluster, but few doubt this will evolve into the medium for training and instruction in the future.

So with all of this amazing progress in the VR space, where are the jobs? In the next few years, most of the jobs will be leveraged from the gaming industry. Many of the skills needed to make VR experiences are already present in video game development: modeling, rigging, animation, interface design, graphics, real-time processing, controls, and networking are among the core competencies of game development that translate immediately into VR. Content creation in this space will be an order of magnitude larger than anything we have seen in the video game industry, so while game developers are a good starting point, the demand for content across all sectors will require a new generation of developers to learn these skills beyond gaming in order to build the environments needed for training, simulation, gaming, movies, health, and every other conceivable VR use case.

[Tweet “Jobs in this sector will make current Hollywood blockbuster credits look short by comparison.”]

VR Movies

Movies are the other area of massive VR job growth. The skills of Hollywood, from cinematographers and directors to actors and special effects artists, are all needed to create compelling stories in VR. There are efforts at a direct translation of Hollywood methods, such as those by Uncorporeal, whose aim is to bring 3D-filmed actors into CG environments; their “Fluffy” and “Alcatraz Island Lofts” pieces show the emotional value of real human actors in VR environments. As experimentation progresses, we will see more and more synthesis of traditional Hollywood techniques and their CG counterparts. One company, Baobab Studios, founded by former Pixar and Hollywood talent, put together a great short, “Invasion!”, showcasing what seated VR storytelling can be, and 20th Century Fox took notice and invested heavily in the studio. This is just one example; there are dozens of studios and teams working on the next generation of VR movies. The jobs in this sector will make current Hollywood blockbuster credits look short by comparison.

Capturing 360-degree video at 8K resolution per camera is coming. Current offerings like GoPro’s 18-camera rigs and the all-in-one Nokia OZO are creating a new breed of filmmaker for VR, and the rules have yet to be written on how to film, what to film, or even how to present it to a viewer. These skills will also draw heavily from Hollywood, but they require a new lens and a new generation to explore and show us what VR film can be for movies, news, sports, and music.

Distributed Development

As all of these areas of VR are explored, people all over the world will soon be able to work on projects from wherever they are. Collaborative VR development and distributed teams are something many developers are embracing. The Unity game engine, for example, has two services in this vein, “Collaborate” and “Opportunity,” which aim to get developers from opposite ends of the globe working on the same project in the same environment, while also offering Yelp-style job postings and developer databases so people can bid on freelance projects from anywhere in the world.

We are one year into consumer VR, and there have been many stumbles. But those have been overshadowed by the successes and, ultimately, by the fervor for what is to come. We know there will be better headsets and better experiences in the not-too-distant future. Anyone who has tried the premium offerings already has a good sense of where we are headed.

Chris Wren comes from a background of AAA game development, working as an artist and producer for 12+ years at companies like EA/Maxis, Namco, and MicroProse. Among the titles he has worked on are the award-winning F-16 combat flight simulator Falcon 4.0, the Sims franchise (Chris worked on seven of them), and Warhammer: Mark of Chaos for the PC and Xbox 360. Chris studied Cognitive Psychology at the University of San Francisco and earned his Master of Education in instructional technology at George Mason University. He has spent the last 8 years teaching game development at George Mason University in Virginia, and for over 5 years he has been researching VR and AR, specifically advanced interactivity in this new, emerging medium. Chris also founded and runs a small company focused on game consulting and virtual and augmented reality: WrenAR LLC.