A Detailed Look at LBE Company: Hyperverse
Virtual reality technology is developing today in two diametrically opposed directions. On one hand, the market for individual consumption is growing: VR is coming to ordinary households, and as device prices fall this trend will only gain momentum. On the other hand, the use of VR technologies in the location-based entertainment (LBE) market is growing at an explosive pace. So-called "free-roam VR" is moving steadily towards becoming a new mode of entertainment, set to compete with old-fashioned cinemas and escape rooms, to name two examples. The high start-up costs for small and medium-sized enterprises represent a serious barrier to the growth of free-roam VR's commercial market, and these costs are linked to a significant extent to the spatial tracking system the free-roam format requires. Nonetheless, new technological solutions have been emerging in this market in recent years with the potential to make free-roam VR far more accessible to operators.

Classic bestselling VR headsets from Oculus and HTC let users move around within the confines of a small room. This is the so-called "room-scale" format, where the built-in tracking can follow the player within an area of up to 9 m².
Spatial tracking in the free-roam VR format operates over a fundamentally larger range (an area several times the size of a normal room) and must follow multiple players simultaneously. Enabling the free movement of several players across a wide space requires a much more complex system of server hardware, sensors, and markers, one able to detect and synchronize the real-time positions of all players present to a high degree of precision.
With roots in the animation and film industries, motion capture technology was for a long time the basis of tracking in free-roam VR. Motion capture was created to speed up the production of computer-graphics scenes: instead of the painstaking process of animating characters from scratch, real scenes are filmed in a special studio and transformed into computer animation in real time. Markers tracked by special cameras are attached to the actors, and data about their movements is fed into a computer and reduced to a single three-dimensional format, which then becomes the animation. This significantly speeds up scene production and thus reduces the overall cost of making a film.
When motion capture migrated from CGI to the VR industry, its basic principles were preserved and became the crux of so-called "outside-in tracking." Markers are attached to ordinary visitors at VR parks, and cameras record all their movements. This data is transmitted to a server, which uses the player's key body points to construct a "skeleton."
The problem is that, just as only large studios can easily afford to run and maintain these extremely expensive motion capture systems in the financially outsized film industry, only large players can afford to implement such a spatial tracking system in the VR industry. Cases in point are the Zero Latency and Void projects' use of Optitrack cameras, which cost at least $1.2 million to install in a single location. The figures for Optitrack's competitor Vicon still look stratospheric for SMEs: installing this kind of system costs north of $50,000 and usually at least $100,000. Even the shallowest analysis indicates that an average sum of $30,000 is required to kit out a small location.

This brings us to the crucial fact that although free-roam VR can boast the status of an extremely attractive and promising mode of entertainment, its start-up costs still place it beyond the reach of most business operators. As a result, only a select few get the chance to try free-roam VR, and it remains a rarely seen product, set up only in the entertainment centers of the biggest cities.
This way of developing the free-roam format essentially repeats the life cycle of all large-scale technology: during the first release stages, access to the product is priced at exorbitant rates to cover production costs, and only with the appearance of a "budget" option is the product sold to the mass market.

Of course, players in the free-roam market have tried to reduce the cost of implementing expensive outside-in tracking without compromising quality. However, without changing the basic principles of existing tracking systems, no significant breakthrough could be achieved. Such a breakthrough came from projects that proposed abandoning the legacy of traditional cinema technologies altogether and developing their own tracking systems based on the inside-out principle. This principle turns the approach to player tracking on its head: instead of actively tracking markers on a player's body with a system of cameras, it relies on stationary markers on the room's ceiling and a single camera mounted on the player's VR-ready backpack.
The Hyperverse project is developing and promoting exactly this kind of product. With its head office in New York and an R&D division in Moscow, the company is currently patenting an inside-out tracking solution for free-roam VR. We tested the technology at Amusement Expo International 2018 in Las Vegas, where the team deployed a demo zone in just a few hours. The quality of Hyperverse's tracking proved no worse than Optitrack's while being much cheaper to implement. So how does it work?
How It Works
As we've already outlined, Hyperverse rewrote the rules of VR tracking: all the expensive cameras and other finely tuned equipment are replaced with passive infrastructure. Instead of the traditional method of installing cameras around the venue's perimeter and placing sensors on the player's body, a single camera is mounted on top of the player's backpack, pointing upwards, and locates itself relative to the passive infrastructure, i.e. the "farm" of markers placed on the ceiling. So instead of following several key points that form the player's "skeleton" (as in outside-in tracking), the positions of the head, body, and hands are determined without placing any markers on the player. How, then, is the rest of the "skeleton" constructed? It is built from a model in which the key points are the player's hands and the Oculus Touch controllers, whose signals are picked up by the Leap Motion controller. Drawing on locomotion science (the laws of human movement through space) and osteology (the study of skeletal structure), these three key points are enough to identify the exact positioning of the user's body parts. At the same time, there are no problems with markers on the player's body occluding one another (a typical issue for outside-in tracking): knowing where both hands and the tracking point on the user's back are, the system creates an inch-perfect virtual reproduction of the top half of the user's body.
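To make the idea concrete, here is a minimal sketch of how an upper body might be reconstructed from just three tracked points: the backpack anchor and the two hand positions. This is not Hyperverse's implementation; the segment lengths, shoulder offsets, and elbow-bend heuristic are illustrative assumptions.

```python
# A simplified sketch (not Hyperverse's actual code) of reconstructing an
# upper-body "skeleton" from three tracked points: the backpack/back anchor
# and the two hand positions reported by the handheld controllers.
# All segment lengths and offsets below are illustrative guesses.

import numpy as np

# Rough anatomical proportions (metres) -- placeholder values.
SHOULDER_HALF_WIDTH = 0.20
UPPER_ARM_LEN = 0.30
FOREARM_LEN = 0.28

def reconstruct_upper_body(back_pos, back_forward, back_up, left_hand, right_hand):
    """Estimate shoulder and elbow positions from three tracked points.

    back_pos / back_forward / back_up : pose of the backpack tracking point
    left_hand / right_hand            : controller positions in world space
    Returns a dict of estimated joint positions.
    """
    right_axis = np.cross(back_forward, back_up)  # player's right direction

    # Shoulders sit at a fixed offset from the tracked back point.
    l_shoulder = back_pos - right_axis * SHOULDER_HALF_WIDTH
    r_shoulder = back_pos + right_axis * SHOULDER_HALF_WIDTH

    def solve_elbow(shoulder, hand):
        # Two-bone analytic IK: place the elbow relative to the shoulder-hand
        # axis, bent by the amount the segment lengths require.
        to_hand = hand - shoulder
        d = np.clip(np.linalg.norm(to_hand), 1e-6, UPPER_ARM_LEN + FOREARM_LEN)
        axis = to_hand / d
        # Law of cosines: distance along the axis to the elbow's projection,
        # then the perpendicular offset of the elbow itself.
        a = (UPPER_ARM_LEN**2 - FOREARM_LEN**2 + d**2) / (2 * d)
        h = np.sqrt(max(UPPER_ARM_LEN**2 - a**2, 0.0))
        # Bend the elbow "down and back" as a simple human-likeness prior.
        bend_dir = -back_up - back_forward * 0.5
        bend_dir -= axis * np.dot(bend_dir, axis)   # make it perpendicular
        bend_dir /= np.linalg.norm(bend_dir) or 1.0
        return shoulder + axis * a + bend_dir * h

    return {
        "left_shoulder": l_shoulder, "right_shoulder": r_shoulder,
        "left_elbow": solve_elbow(l_shoulder, left_hand),
        "right_elbow": solve_elbow(r_shoulder, right_hand),
        "left_hand": left_hand, "right_hand": right_hand,
    }
```

In practice the "human-likeness prior" for the elbow would come from the locomotion and osteology models mentioned above; the point of the sketch is simply that three well-chosen points constrain the rest of the upper body.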
To determine the player's position in the space accurately, a system of markers using conventional QR codes is responsible for calibrating the camera on each backpack. Such a system has proven far cheaper than installing expensive cameras with motion capture software: the markers cost next to nothing to produce, and the calibration (mapping) is only required on first installation. Hyperverse's tracking is first and foremost a software solution; the magic is in the information processing algorithm running across the system of fully immersive devices. Using its camera, the in-game backpack determines its location relative to the passive markers and, from the additional controllers, receives the information needed to reconstruct the position of the body and hands. The calculated result is visualized for the player first and then transferred to the server as a simple set of coordinates and movements. The server instantly shares this information with the other backpacks participating in the multiplayer game. The lag between one player's completed action and the visualization of that action on another player's device does not exceed 30 milliseconds, a gap practically imperceptible to the human eye. It is also worth noting that with inside-out tracking, 95% of the data processing takes place directly inside the backpack, which significantly reduces the load on the in-play software and communication channels.
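To illustrate the principle rather than the patented method itself, the sketch below shows how an upward-facing camera could be localised against a known ceiling-marker map using a standard perspective-n-point solve (here with OpenCV), and what a compact pose packet for the server might look like. The marker coordinates, camera intrinsics, and packet layout are all assumptions made for the example.

```python
# A hedged illustration (not the project's actual software) of the core idea:
# the upward-facing camera sees ceiling markers with known positions, a
# standard PnP solve recovers the backpack's pose, and the result is reduced
# to a small packet of coordinates that the server relays to other backpacks.

import struct
import cv2
import numpy as np

# Hypothetical map built once during calibration: marker ID -> 3D corner
# coordinates (metres) in the venue's world frame, on the ceiling plane.
MARKER_MAP = {
    17: np.array([[2.0, 4.0, 3.0], [2.2, 4.0, 3.0],
                  [2.2, 4.2, 3.0], [2.0, 4.2, 3.0]], dtype=np.float32),
}

# Illustrative camera intrinsics for the backpack's upward-facing camera.
CAMERA_MATRIX = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)

def locate_backpack(detections):
    """detections: {marker_id: 4x2 array of corner pixels seen this frame}."""
    obj_pts, img_pts = [], []
    for marker_id, corners_px in detections.items():
        if marker_id in MARKER_MAP:
            obj_pts.append(MARKER_MAP[marker_id])
            img_pts.append(np.asarray(corners_px, dtype=np.float32))
    if not obj_pts:
        return None  # no known markers in view this frame
    obj_pts = np.concatenate(obj_pts)
    img_pts = np.concatenate(img_pts)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, CAMERA_MATRIX, DIST_COEFFS)
    if not ok:
        return None
    # Camera position in world coordinates: -R^T * t
    rot, _ = cv2.Rodrigues(rvec)
    position = (-rot.T @ tvec).ravel()
    return position, rvec.ravel()

def make_pose_packet(player_id, position, rvec):
    """Pack one player's pose into a compact binary message for the server."""
    return struct.pack("<B3f3f", player_id, *position, *rvec)  # 25 bytes
```

Because each update boils down to a few dozen bytes of coordinates rather than raw camera data, relaying it to every other backpack well within a 30 ms budget is a modest networking task.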
As inside-out tracking develops further, we envisage two possible paths of progress for free-roam VR:
Path 1: Full release onto the mass market. Appropriately priced technical set-up would enable SMEs to launch gaming locations with minimal investment. As a result, free-roam VR would begin to appear widely in local markets via regional entertainment parks, filling a "premium" niche previously occupied by lower-quality "arcade" room-scale VR attractions, while also becoming a competitive alternative to the cinema and escape-room sectors.
Path 2: A more long-term, ambitious plan in which free-roam VR moves towards increasing both the number of equipped locations and the number of players taking part in multiplayer games. The time it takes the in-game computers inside the VR-ready backpacks to process and transmit data is minimal, so dozens of concurrent players can be brought into the same game at one location, and the tracking system scales simply by increasing the number of markers in use (see the rough estimate after this list). This may soon lead to attempts to bring free-roam VR into the eSports industry, and should that happen, it would be reasonable to expect a rebirth and renewed boom in that market.
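A back-of-the-envelope estimate suggests why player count is not the bottleneck when only compact pose packets cross the network. The packet size and update rate below are assumptions carried over from the earlier illustration, not measured figures.

```python
# Rough scaling estimate: how much pose traffic the location server must fan
# out as the number of players grows. Packet size and update rate are assumed.

PACKET_BYTES = 25      # 1-byte ID + position + rotation (see earlier sketch)
UPDATE_HZ = 90         # one pose update per rendered frame (assumed)

def relay_bandwidth(num_players):
    """Total bytes/second the server re-broadcasts for one location."""
    # Each player's packet is relayed to every other player.
    return PACKET_BYTES * UPDATE_HZ * num_players * (num_players - 1)

for n in (6, 12, 24, 48):
    kbps = relay_bandwidth(n) * 8 / 1000
    print(f"{n:2d} players -> ~{kbps:,.0f} kbit/s of pose traffic")
```

Even at 48 players this stays in the tens of megabits per second, well within ordinary local networking, which is what makes the eSports scenario at least plausible on the hardware side.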
So far, Hyperverse has entered the market with its standard small-scale offering: a 900-square-foot location and a six-player multiplayer game. Having completed testing with small multiplayer sessions, the project now has its sights set on testing large-scale location setups.
We may well be on the cusp of the next revolution in entertainment.