An open source ecosystem for VR
Taking a technology to more people is hardly possible without an open source ecosystem. Open Source Virtual Reality (OSVR) is an open source ecosystem that aims to unify VR and AR technologies, so that headsets, accessories and apps can be used across platforms. The OSVR software platform drives this end-to-end compatibility through a single VR ecosystem. It seeks to overcome the problems developers face due to hardware and platform fragmentation, creating an environment in which they can focus on their work without worrying about hardware or content support. OSVR is backed by big names like Leap Motion, Intel and Gearbox Software.
Razer recently launched an OSVR Hacker Development Kit, which includes the complete designs and schematics required to build your own VR headset. The design is completely open and can be expanded, modified or upgraded. You can, for example, upgrade the faceplate or add a position tracker later to improve the equipment as and when you wish. This serves both sustainability and accessibility by enabling the reuse of components and cross-platform compatibility.
Of what use is an affordable headset?
However affordable it may be, a headset is worthless if enough VR content is not made available to the masses. It would not be wrong to say that the real VR revolution has been on the content side. In the recent past, there has been a surge in the number of VR apps available on Google Play and the Apple App Store, as well as in the number of 3D video channels on YouTube. The most pleasant surprise, however, is the VR content uploaded by The New York Times which, in its own words, puts you at the centre of the stories only it can tell!
All this is possible thanks to the VR development platforms and 360-degree cameras available today for VR content development. Google’s Jump, for example, is an entire ecosystem for VR content development. It consists of a camera rig, software that stitches the footage together seamlessly, and a player.
Google, in association with GoPro, has designed and developed a 360-degree, 16-camera rig for this purpose. However, the Jump ecosystem is not tied to GoPro cameras; theoretically, you can use any camera. You can even build your own camera rig using Google’s specifications, which cover everything from the size and shape of the rig to the placement of the cameras and their field of view (FOV).
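The geometry of such a rig is easy to reason about with back-of-the-envelope arithmetic. The sketch below is illustrative only: the 120-degree FOV is an assumed value, not a figure from Google’s specification.

```python
# Rough coverage check for a circular 16-camera rig.
# Assumption: each camera has a 120-degree horizontal FOV
# (an illustrative value, not Google's actual specification).
num_cameras = 16
fov_deg = 120.0

spacing_deg = 360.0 / num_cameras          # angular gap between adjacent cameras
overlap_deg = fov_deg - spacing_deg        # view shared by two neighbouring cameras
coverage = num_cameras * fov_deg / 360.0   # how many cameras see any given direction

print(f"camera spacing: {spacing_deg:.1f} degrees")   # 22.5 degrees
print(f"neighbour overlap: {overlap_deg:.1f} degrees")
print(f"each direction seen by ~{coverage:.1f} cameras")
```

The large overlap between neighbouring views is what gives the stitching software enough shared imagery to blend the feeds without visible seams.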
As the next step, the back-end software converts the 16 raw video feeds from the cameras into stereoscopic 3D VR video by combining computational photography and computer vision. Basically, it recreates the scene by seamlessly combining the camera footage with thousands of computer-generated intermediate viewpoints. As for the player, Google simply added VR content as an option on YouTube. What better way to reach millions of people every day!
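Google has not published Jump’s stitching code, but one core idea behind seamless stitching can be sketched as a toy cross-fade: in the region that two cameras both see, each output pixel is a weighted mix of the two views, with the weight ramping linearly from one camera to the other so no hard seam appears. The example below works on 1-D brightness strips as a stand-in for real frames.

```python
def blend_overlap(left, right, overlap):
    """Join two overlapping 1-D pixel strips with a linear cross-fade.

    `left` and `right` are lists of brightness values; the last
    `overlap` pixels of `left` view the same scene as the first
    `overlap` pixels of `right` (a toy stand-in for two camera feeds).
    """
    body_l = left[:-overlap]                 # part seen only by the left camera
    body_r = right[overlap:]                 # part seen only by the right camera
    seam = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)          # weight ramps towards the right view
        a = left[len(left) - overlap + i]
        b = right[i]
        seam.append((1 - w) * a + w * b)     # cross-fade the shared region
    return body_l + seam + body_r

# Two "camera" strips that overlap by 2 pixels:
print(blend_overlap([10, 10, 20, 20], [20, 20, 30, 30], 2))
```

Real pipelines also align the images geometrically and synthesise intermediate viewpoints, but the blending step follows this same weighted-average principle.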
Facebook has also designed a high-end video capture system called Surround 360, the reference designs of which will soon be released as an open source project. Surround 360 comprises a 17-camera array: 14 wide-angle cameras fixed around its flying-saucer-shaped body, one fish-eye camera on top and two more on the bottom. This arrangement allows the device to capture its surroundings without the pole that holds it up getting in the way.
Another innovation, according to the company, is that the cameras use a global shutter instead of a rolling one, which ensures the resulting footage does not display artefacts from the closing of individual shutters. The system also includes Web-based software that captures and renders the images automatically.
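The difference between the two shutter types is easy to see in a toy simulation (the numbers below are made up for illustration): a global shutter samples every sensor row at the same instant, while a rolling shutter reads the rows one after another, so a fast-moving vertical edge comes out slanted.

```python
def capture(rows, readout_per_row, speed):
    """Record the x-position of a vertical edge moving at `speed`
    pixels per time unit. Row i is sampled at time i * readout_per_row;
    a readout delay of 0 for every row models a global shutter."""
    return [speed * (i * readout_per_row) for i in range(rows)]

global_shutter = capture(rows=4, readout_per_row=0.0, speed=10)
rolling_shutter = capture(rows=4, readout_per_row=0.25, speed=10)

print(global_shutter)    # [0.0, 0.0, 0.0, 0.0] -> the edge stays vertical
print(rolling_shutter)   # [0.0, 2.5, 5.0, 7.5] -> the edge skews across rows
```

In a multi-camera 360-degree rig, such skew would differ from camera to camera and show up as stitching artefacts, which is why a global shutter matters here.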
There are several other premium 360-degree cameras such as Nokia’s Ozo and Jaunt’s Jaunt ONE. Cost-effective options are also available from companies like Nikon and GoPro.
If you want to take a really cheap shot at VR content, try something like Poppy, a purely optical device priced at around US$ 30 that works with iPhones. Poppy uses mirrors to capture two stereographic images with your iPhone’s camera. When you look through the viewfinder, Poppy’s lenses merge the two images into a single 3D view.
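Poppy itself simply presents one image to each eye, but a stereo pair like the one it captures encodes depth through the classic pinhole relation Z = f · B / d, where f is the focal length, B the baseline between the two viewpoints and d the disparity between matching pixels. The numbers below are assumed for illustration, not Poppy’s actual optics.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: depth Z = f * B / d.
    Poppy does not compute this -- it just shows each eye its own
    image -- but the captured pair encodes depth in exactly this way."""
    return focal_px * baseline_m / disparity_px

# Illustrative figures: 1000 px focal length, a 6 cm baseline
# between the mirrored viewpoints, and a 20 px disparity.
print(depth_from_disparity(1000, 0.06, 20))  # depth in metres
```

The farther an object is, the smaller the disparity between the two images, which is exactly the cue your brain uses when the viewfinder fuses the pair into a 3D view.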