Next Gen Immersive Video Platform
By Anne McKinnon

@theboolean

Live Planet

Complexity is one of the greatest issues the VR industry faces in working toward mainstream adoption of immersive technology. Halsey Minor, CEO of Live Planet, proposes a solution to this complexity with the company’s end-to-end stereoscopic 360 video platform.

Cover image by the Miro Shot Collective

Optics and Empathy

The solution for this end-to-end system started with the optics of the Live Planet VR camera. Minor’s team tested a range of camera configurations to find the one that introduced the least warping in the video, then compressed that design into one tiny camera with 16 sensors. “The whole point of the camera and all the engineering that’s been done is what we call natural VR, which is perfectly stitched stereoscopic* running in real time,” said Minor.

Their goal: to capture the planet on camera in real time, so that you can drop in anywhere, anytime. “If I think of all the mediums, I don’t think there’s a medium that has ever had the impact that this one will have on global understanding,” said Minor. The ability to drop in on cataclysmic events that you currently watch only as a bystander on TV, or on sunsets over the Western Cape in South Africa, captures the moments that turn this medium into an empathy machine.

It will start with this way of traveling the world in VR, breaking down the barriers we currently face by living in our own corner of it, said Minor, referencing a quote by Mark Twain.

Getting Technical

During a demo with the Live Planet team at Digital Entertainment World, the camera was set up and prepped for live streaming within 15 minutes. I had a headset on and was looking at myself, in real time, in VR. Once I took the headset off, RJ Wafer, CRO at Live Planet, walked me through the live editing system, adjusting numerous settings (color grading, exposure, saturation, curves and more) on the live feed, all in a user interface designed to make the production process as simple as possible (and for my motion designer community readers out there, there is a Unity integration).
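To make that grading step concrete, here is a minimal sketch, in Python with NumPy, of how exposure and saturation adjustments like the ones RJ applied might look on a single frame. This is my own illustration, not Live Planet’s software; their system does this on the live stream, from a GUI.

```python
import numpy as np

def adjust_exposure(frame, stops):
    """Brighten or darken a frame; +1 stop doubles the light, -1 halves it."""
    return np.clip(frame * (2.0 ** stops), 0.0, 1.0)

def adjust_saturation(frame, amount):
    """Blend each pixel toward its luminance: 0 gives grayscale, 1 leaves the frame unchanged, >1 boosts color."""
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luminance weights
    return np.clip(luma[..., None] + (frame - luma[..., None]) * amount, 0.0, 1.0)

# One RGB frame with values in [0, 1]; a live pipeline would run this per frame.
frame = np.random.rand(1080, 1920, 3)
graded = adjust_saturation(adjust_exposure(frame, stops=0.5), amount=1.2)
```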

The idea is that if you have 30 of these cameras set up around the world, “you can control them from anywhere via the Live Planet VR cloud,” said Minor. Streaming and encoding are also built into the camera itself, in a format that can be stored in the cloud and uploaded directly to platforms such as YouTube, Samsung Gear VR, Oculus, and Google Daydream without leaving the Live Planet ecosystem. Stereoscopic capture is integrated into the camera as well, along with automatic stitching, which reduces the cost of end-to-end editing for VR video by about 70%, said Minor.
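As a thought experiment, that “thirty cameras, one dashboard” idea might reduce to something like the sketch below. The endpoint, fields, and camera IDs are all hypothetical (Live Planet has not published its cloud API); the point is simply that each camera becomes an addressable resource.

```python
import requests

CLOUD = "https://cloud.example.com/v1"   # placeholder endpoint, not a real URL
CAMERAS = ["cam-nyc-01", "cam-capetown-02", "cam-tokyo-03"]   # invented IDs

def start_stream(camera_id, token, bitrate_kbps=20000):
    """Ask a hypothetical cloud service to start a stereoscopic live stream."""
    resp = requests.post(
        f"{CLOUD}/cameras/{camera_id}/stream",
        headers={"Authorization": f"Bearer {token}"},
        json={"action": "start", "bitrate_kbps": bitrate_kbps, "stereo": True},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# One operator, many cameras, anywhere in the world.
for cam in CAMERAS:
    start_stream(cam, token="...")
```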

Essentially, Live Planet is not just building a live-streaming VR camera; it has created an ecosystem for VR to exist on: a centralized platform for content creation and distribution, with decentralized cloud storage that runs at a lower cost than competitors like AWS and Google Cloud.

Make It Rain: The Cloud

Minor was an early investor in Salesforce, among other companies (CNET, Google Voice, OpenDNS, Vignette), and compares where Live Planet is now to the structure that enabled Salesforce to become what it is today. At first, Salesforce was a basic CRM app; it then grew into a complete platform for building all sorts of software applications for the enterprise.

To do this with Live Planet, they’d need a way to stream and store content using their own system. Servers are expensive, though, so why not do for computation what Airbnb and Uber have done for spare rooms and idle cars? On Minor’s suggestion, I searched for “zombie servers” and found that a very high percentage of servers are actually “dead,” having seen no traffic in the past six months.

The only way to use pre-existing servers like this is with software that runs on those machines, tests their capacity, measures what they actually do, and includes a built-in payment system for the server space they provide. This is essentially what bitcoin miners do, except Live Planet will use its own currency, called “VideoCoin.”
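The mechanics Minor describes (probe a machine’s spare capacity, do useful video work, settle payment) can be sketched roughly as follows. This is a toy illustration of the idea, not the VideoCoin protocol itself:

```python
import os
import shutil
import time

def measure_capacity():
    """Report what an otherwise idle machine could contribute to the network."""
    return {
        "cpu_cores": os.cpu_count(),
        "free_disk_gb": round(shutil.disk_usage("/").free / 1e9, 1),
    }

def transcode(job):
    """Stand-in for the real video work (in practice, calling out to an encoder)."""
    time.sleep(0.1)

def mine_video_job(job, rate_per_second=0.001):
    """Do the work, then return the tokens earned. In a real network the payout
    would come from on-chain settlement, not a local calculation like this."""
    start = time.time()
    transcode(job)
    return (time.time() - start) * rate_per_second

print(measure_capacity())
print(mine_video_job({"source": "clip.mp4", "target": "hls"}))
```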

“We’re taking the same basic idea and moving it over to what we call video miners. We believe that by tapping into all of these computers that are today not being used, we can reduce the cost by 60% or even 80%, depending on your current deal with AWS,” said Minor. VideoCoin has just launched in private alpha and will become available in beta this summer. “We actually turn the fabric of the internet into a video processing utility,” said Minor.
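To put those percentages in context: a creator paying, say, $10,000 a month for cloud video processing would be looking at roughly $4,000 at the low end of that range and $2,000 at the high end. The dollar baseline is an illustration of mine; the 60% to 80% range is Minor’s.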

New Models for Monetization

On the Danny In The Valley podcast episode “People used to pay for things,” Danny and Patreon’s Jack Conte discuss how Patreon was born from the notion that creators are underpaid under the current revenue model, and that micro-payments can change the way the internet works. As Conte says, a platform that enables these payments and facilitates content storage and distribution will spark a digital era that recognizes the value of content creators, and with it their rise.

In addition to a cost-effective cloud, Live Planet has a monetization model in place for streamers. “We’ll have pay-per-view, and we’ve already built in the payment system. You’ll also have subscription. You put a bunch of content in your world and then you can charge for it,” said Minor.
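A platform-level entitlement check of the kind Minor describes might come down to something like this sketch. The data model and names are mine, not Live Planet’s; in practice the receipts and subscriptions would live in their payment system:

```python
from datetime import datetime, timezone

# Hypothetical records; a real platform would pull these from its payment system.
PURCHASES = {("alice", "sunset-cape-town-live")}                    # pay-per-view receipts
SUBSCRIPTIONS = {"bob": datetime(2099, 1, 1, tzinfo=timezone.utc)}  # subscription expiry dates

def can_watch(user, stream_id):
    """Grant access if the viewer bought this stream or holds an active subscription."""
    if (user, stream_id) in PURCHASES:
        return True
    expiry = SUBSCRIPTIONS.get(user)
    return expiry is not None and expiry > datetime.now(timezone.utc)

assert can_watch("alice", "sunset-cape-town-live")      # paid for this stream
assert can_watch("bob", "sunset-cape-town-live")        # active subscriber
assert not can_watch("carol", "sunset-cape-town-live")  # neither
```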

Pay-per-view

So will Live Planet become the next Amazon of end-to-end VR? That will come down to the company’s ability to communicate the value of its platform, and to provide all the necessary tools in house at a competitive level. At the time of the interview, the Live Planet team counted a community of 3,000 beta users. I’m interested in their feedback on the ROI they see after switching to the Live Planet ecosystem. Of course, the more people who use the system, the more valuable it becomes. It’s early days yet.

Much More Music

There’s no doubt that music is a key component of immersive experiences. For example, the rhythm game Beat Saber topped the Steam charts not just as the highest rated VR game, but as the highest rated game of any title. Then there are TheWaveVR, NextVR and MelodyVR, startups all looking to be the next platform that enables a new way of listening to and experiencing music.

“Where we’re focusing all of our energy is around live music,” said Minor. And he’s right to pay attention to this medium. While its format has stayed much the same, music is leading the way in new types of digital commerce.

Essentially, play good music on its own and people will love it. Play a great video with no music, and the emotion the characters bring to a scene will feel misplaced without the suspenseful riffs or mournful chords. As I recently wrote in the Culture, Art and Technology series, audiovisuals allow people to go to that “other place,” to share an emotional journey that leaves a lasting impression. Along with music, “when you add the visual presence of being there, to me, it’s the most immersive thing you can do,” said Minor.

Image by the Miro Shot Collective

Veni, Vidi, Vici

Minor worked with Jeff Bezos when Amazon was taking its first steps: “With Jeff, the big thing was, we would always say their margin is our opportunity. He now has created an opportunity,” said Minor. Even without chipping into Amazon’s market, video is growing at 25% per annum, said Minor, and there’s the opportunity for Live Planet to grow into a very large organization.

Despite its current complexity, there’s no doubt that VR is catching on as one of the most effective mediums out there. In an instant it can make people gasp, laugh, cry, scream and wonder. But as long as consumers perceive it as a complex system, Live Planet will have to work hard to move VR past that intermediate standing and become the global platform it envisions. That is, if betting on 360 video turns out to be a winner.

 

*Stereoscopic: stereoscopy is a technique for creating the illusion of depth in an image. In most cases, this means presenting two slightly different images independently to the viewer’s left and right eyes, which creates the perception of 3D depth, a critical step for the plausibility of any VR environment. “You get the right type of depth and IPD (interpupillary distance) and it’s the difference between vomiting and not,” said Minor.

*Motion sickness in VR: the result of a visually induced perception of motion when there is in fact no physical movement, which causes disorientation.
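For readers who want the stereoscopy footnote in concrete terms, it comes down to rendering the scene from two virtual cameras offset by half the IPD each. A minimal sketch, assuming an average IPD of 63 mm:

```python
import numpy as np

def eye_positions(head_position, right_vector, ipd_m=0.063):
    """Place left and right virtual cameras half an interpupillary distance apart.
    63 mm is a commonly cited average IPD; headsets let users fine-tune it."""
    head = np.asarray(head_position, dtype=float)
    offset = np.asarray(right_vector, dtype=float) * (ipd_m / 2.0)
    return head - offset, head + offset   # (left eye, right eye)

left, right = eye_positions(head_position=[0.0, 1.6, 0.0], right_vector=[1.0, 0.0, 0.0])
```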