IBC's haul of gadgets and prototypes is usually so eclectic as to defy classification on the exhibition floor. This year, though, all things Virtual Reality have cornered the Future Zone.
There's a claimed world's first solution for remotely moving and stabilising 360-degree cameras in the form of the 360° Evo from Motion Impossible. The 360° Evo is said to be almost invisible in the 360-degree spherical image and uses gyro stabilisation to eliminate roll, tilt and vibration. It can carry up to 5kg and attaches to the V-Con XL, a vertical-axis stabiliser. Marry the 360° Evo to a Mantis dolly, also from the British developer, and you have a solution for remotely moving and stabilising 360-degree and VR cameras on the ground, albeit one costing £13,845.
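Motion Impossible hasn't published its stabilisation maths, but gimbals of this kind typically fuse a fast-but-drifting gyro with a noisy-but-stable accelerometer, then drive the motors against the estimated error. A minimal single-axis sketch of that idea in Python (a generic complementary filter; names and the blend constant are illustrative, not the 360° Evo's firmware):

```python
import math

def complementary_filter(prev_angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate one tilt axis by blending gyro and accelerometer readings.

    The gyro integrates smoothly but drifts over time; the accelerometer's
    gravity vector is noisy but drift-free. A stabiliser drives its motor
    toward the negative of this estimate to cancel roll/tilt.
    """
    gyro_angle = prev_angle + gyro_rate * dt    # integrate angular rate
    accel_angle = math.atan2(accel_x, accel_z)  # tilt from gravity direction
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```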
The audio portion of a VR experience is as important as the visuals in helping viewers navigate and storytellers narrate, which is why b-com has devised a means of applying Higher Order Ambisonics audio to VR content.
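Ambisonics represents the sound field as spherical-harmonic channels that can be rotated to match headset orientation; higher orders simply add more channels for sharper localisation. As a flavour of what encoding involves, here is the standard first-order (B-format) maths in Python, not b-com's implementation:

```python
import numpy as np

def encode_first_order_ambisonics(mono, azimuth, elevation):
    """Encode a mono source into first-order B-format (W, X, Y, Z).

    azimuth/elevation are the source direction in radians. Higher Order
    Ambisonics extends this with additional spherical-harmonic channels;
    this sketch stops at order 1.
    """
    w = mono * (1.0 / np.sqrt(2.0))                 # omnidirectional component
    x = mono * np.cos(azimuth) * np.cos(elevation)  # front-back
    y = mono * np.sin(azimuth) * np.cos(elevation)  # left-right
    z = mono * np.sin(elevation)                    # up-down
    return np.stack([w, x, y, z])
```

Head tracking then reduces to applying a rotation matrix to these channels before decoding to the headphones, which is what makes the format attractive for VR.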
There are a number of VR experiences to try too. How about a rollercoaster sim? Brazilian software developer Rilix designed the original 'Rilix Coaster' for theme parks and has now adapted it for Samsung Gear VR. Thrill seekers can ride aboard a rollercoaster cart travelling through 18 scenes, from blizzards to beaches via tropical mountains and sci-fi backdrops, with an apparent sensation of speeds up to 140kph.
Aardman Animations' We Wait is an animated take on a migrant's perilous sea crossing from Turkey to Greece, created from BBC News footage. It's on the BBC R&D stand along with the fantasy VR film The Turning Forest and a VR walk-through of the EastEnders pub, the Queen Vic.
Perhaps the best VR experience in the Future Zone, though, is a 15-minute spacewalk set 250 miles above the Earth. Created in Unreal Engine 4 and delivered on HTC Vive by producer Rewind for the BBC, Home – A VR Spacewalk is inspired by NASA training programmes and can be experienced with a haptic feedback chair. It has a narrative, in that the spacewalk does not go according to plan, much as in the film Gravity. A full-body biometric system allows the 'astronaut' to hear and monitor their own heartbeat during the mission.
Stitching video from multiple cameras or sensors in software to create a 360-degree movie on the fly is one of the main bugbears of current VR workflows. Particularly for live streaming, the software process can be slow and, at best, erratic in aligning parallax. A demonstration by Argon360 shows stitching happening entirely on a hardware chip in the camera rig itself. Video from four individual sensors is passed to image signal processing blocks, then stitched into a single output by the patent-pending technology and further processed on the chip for viewing, streaming or storage.
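Argon360 hasn't disclosed its algorithm, but the final stage of any stitcher looks roughly the same: feeds that have already been remapped into a shared equirectangular projection are feather-blended across their overlaps. A simplified NumPy sketch of that blend step (geometry and names are illustrative assumptions; real pipelines handle lens correction and parallax compensation before this point):

```python
import numpy as np

def feather_stitch(slices, overlap_px):
    """Feather-blend pre-warped equirectangular slices into a 360 panorama.

    Each slice is an H x W x 3 array already remapped into the panorama's
    projection, overlapping its neighbour by overlap_px columns (the last
    slice wraps around to the first). Assumes overlap_px < W / 2.
    """
    h, w, _ = slices[0].shape
    step = w - overlap_px
    pano_w = step * len(slices)
    pano = np.zeros((h, pano_w, 3), np.float32)
    acc = np.zeros((h, pano_w, 1), np.float32)
    ramp = np.ones(w, np.float32)
    ramp[:overlap_px] = np.linspace(0.0, 1.0, overlap_px)   # fade in on the left
    ramp[-overlap_px:] = np.linspace(1.0, 0.0, overlap_px)  # fade out on the right
    for i, s in enumerate(slices):
        cols = (np.arange(w) + i * step) % pano_w            # wrap at 360 degrees
        np.add.at(pano, (slice(None), cols),
                  s.astype(np.float32) * ramp[None, :, None])
        np.add.at(acc, (slice(None), cols, 0), ramp)
    return pano / np.maximum(acc, 1e-6)                      # normalise weights
```

Doing this in fixed-function hardware, per frame, is what lets a rig like Argon360's sidestep the latency of a software stitcher.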
Real-time stitched streams are also viewable in the Future Zone on a spherical interactive display from Pufferfish, a start-up formed by two undergraduates at Edinburgh University. Sphericam's V2 is a tennis-ball-sized rig comprising six lenses which records in 4K (and streams it over WiFi too) and claims completely synchronised capture of every pixel in every direction.
Nokia has gone for a software approach, which it describes as a "broadcast-quality" live VR streaming solution. The OZO Live software, which launches soon, works with standard 4K kit over SDI: it processes the raw signal from the camera in real time, stitches the panoramic images and outputs monoscopic or stereoscopic video.
Nokia has the OZO itself on display. The space-age style black sphere contains eight microphones and eight sensors, each lens with a 195-degree field of view. The audio is rendered in software to follow headset orientation. Each interchangeable digital cartridge provides 45 minutes of record time and saves all media to a single file, rather than to a handful of SD cards. The video sensors are progressive scan with a global shutter.
Some argue that no matter how good the algorithms, no software can ever stitch a set of images from disparate lenses together perfectly. That's why the lightfield approach to VR is so interesting. Optical developer Lytro is not in the Future Zone but it does have a presence at IBC, and has just teased footage shot and post-processed with its massive orb-shaped Immerge VR system.
Lightfield photography differs from traditional photography in that the camera measures the geometry of the light striking the image sensor rather than just capturing it straight on. With enough computing power (Lytro's system uses an entire portable server farm, will have backup in the cloud, and still relies on compression), the software can reconstruct the scene and give headset users what is referred to as "six degrees of freedom": the ability to move side to side, up and down, and forward and back. What that effectively means is that you'll be able to move your head and change your perspective within the 360-degree video.
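Lytro's pipeline is proprietary, but the pay-off is easy to illustrate: once per-ray geometry has been recovered, the scene can be re-rendered for a moved viewpoint. A toy RGB-D reprojection sketch in Python shows the principle (the pinhole intrinsics, depth input and function names are assumptions for illustration; real lightfield rendering resamples captured rays rather than splatting pixels):

```python
import numpy as np

def reproject(rgb, depth, K, head_offset):
    """Re-render an RGB-D view for a translated head position.

    rgb: H x W x 3 image; depth: H x W metric depth; K: 3x3 pinhole
    intrinsics; head_offset: (x, y, z) head movement in camera axes.
    This forward-splat sketch ignores occlusion filling.
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs, ys, np.ones_like(xs)], -1).reshape(-1, 3).T
    pts = np.linalg.inv(K) @ pix * depth.reshape(-1)   # back-project to 3-D
    pts = pts - np.asarray(head_offset)[:, None]       # move the eye
    proj = K @ pts                                     # project to new view
    u = (proj[0] / proj[2]).round().astype(int)
    v = (proj[1] / proj[2]).round().astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (pts[2] > 0)
    out = np.zeros_like(rgb)
    out[v[ok], u[ok]] = rgb.reshape(-1, 3)[ok]         # forward splat
    return out
```

Calling this with a small head_offset per frame is, in caricature, what "six degrees of freedom" playback does: the viewer's parallax is synthesised from recovered geometry rather than baked into a fixed 360-degree sphere.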