IBC 2017: The future of animation, video, AR and VR


Digital Arts

We look back over the IBC 2017 show in Amsterdam – where groundbreaking new tech for producing motion media will bring us more realistic VR and AR, purposefully less realistic animation and a (thankfully virtual) exploding chicken.

IBC turns 50 this year, and in that time the International Broadcasting Convention – and giant trade show – has obviously seen a lot of changes. In 1967, TV and cinema were the only ways to get visual motion media in front of an audience – but the internet has increased not only the number of platforms the content you create can live on, but also the mediums in which you can produce it. And this has accelerated over time – even a visitor who last explored the multiple enormous halls of the Amsterdam RAI just five years ago would see a huge shift in the technology and type of content on display.

Now you’ll see Facebook and Amazon next to the old guard of the BBC and companies that own satellites. New high-end cameras from Canon and Sony sit next to an upgraded version of Blackmagic’s DaVinci Resolve – version 14, since you ask. Resolve is a high-end colour grading tool, but it’s free to download unless you want to work collaboratively. This puts it in the hands of the YouTube generation, doing it themselves until they’ve formed a studio and have the budget to pay for software.

But what’s coming next – in both the immediate future and longer term? The industry clearly thinks it’s VR and AR. Everywhere at IBC 2017 – scattered among the blinking boxes, studio lights, audio mixers and HDR displays – were headsets.

No mystery about VR

In the self-styled Future Zone, a crowd had gathered around two visitors wearing headsets and wielding a contraption that evoked Edward Scissorhands.

This was Touch Engine, a VR simulation platform from Toia, which uses haptic feedback on customised tools (including dental drills and robotic fingers) to enhance the learning experience for dentists and surgeons. There’s been a lot of exploration around the potential of VR in education, but this platform, available as a developer edition (a consumer version is on its way), offers something different.

“There are a lot of VR training simulations out there, and a lot of [single use] haptic devices, but they are just offering one simulation and one tool,” said Richard Piskorz, blessed with the title of Unreal developer at Toia’s parent company, Generic Robotics.

“With this you get a major piece of custom hardware, running multiple simulations, and with multiple toolkits. When you finish one procedure, you just replace the tools. We currently use Unreal Engine 4, but will be moving to Unity next year.”

Already sold to a number of UK universities, the platform isn’t restricted to medical and surgical training: applications in engineering, entertainment and the creative arts are also being explored.

VR at scale

Also in the Future Zone was Igloo Vision, projecting 8K video in a 360° format inside a dome for editing, monitoring or experiential uses, while the Time Tunnel – a 6m x 2.5m enclosed walkway built entirely from LED panels by SmartAV – displayed a host of beautiful and sometimes eye-searing content on all sides as visitors passed through it.


Image: Igloo Vision

The Time Tunnel

Working within an area defined by Microsoft Kinect devices was DoubleMe, a Silicon Valley startup with a research lab at Ravensbourne University in Greenwich, London. DoubleMe’s 3D capture system, HoloPortal, captures moving subjects – or still ones, such as furniture – and generates a fully animatable 3D model. The models contain a volumetric 3D mesh, movement data and textures (skin, facial expressions and clothing) for virtual, augmented and mixed reality experiences.

James E Marks, SVP and head of the UK operation, said the concept of HoloPortal was influenced by sci-fi – namely Princess Leia’s hologram in the original Star Wars, and the interactive interfaces in Minority Report or Iron Man – and indeed when we tried on the HoloLens we could see a holographic ballet dancer directly in view, ready for interaction.

“I can create a whole creative installation for you in HoloPortal and we can share it in real time,” said James. “Or imagine if you wanted to learn ballet - we had some people trying this out here the other day - this system captures you in real-time, so you can see yourself in the HoloLens as a very lo-fi version [dancing with the CG model].

“Everyone has been approaching us asking what are we selling, but we just want this to be a free social, open platform that might evolve like YouTube,” he said. “We’ll provide the understanding of the technology, and we’ll have our proprietary algorithm, but we’ll give users a free plugin that could work with anything - a game, film, whatever. But we perceive it won’t be the traditional things.”

Marks said DoubleMe expects its offering to be used as a new social artform, placed somewhere between Snapchat and YouTube, experienced through an immersive mixed reality device.

“We’re probably the most unlikely people to be here,” added Marks. “People are still trying to flog us 8K TVs, which maybe the younger generation will never even be interested in. With [our] technology, you could have the biggest screen in the world, and still be engaged with your mate, and still be on Twitter.”

Tools for the trade

Elsewhere at IBC, London-based content studio Happy Finish was demonstrating some of its 360° video and VR work on the Adobe stand. I managed to catch interactive producer Amy Tinker and capture lead/director Jamie Mossahebi before they went on to show work such as the 360° film for the Tate’s Georgia O’Keeffe exhibition, VR marketing pieces for Ted Baker and Ford, and The View from The Shard VR experience, where you can hurtle around the outside of London’s tallest building – on a simulated rollercoaster, of course.

When asked what had made the biggest difference to their workflow in the past year, Amy replied: “There’s been a change to the cameras on the market – the quality’s always improving. As well as different stitching software – it’s become a lot faster.”

“We’ve been using tools from Adobe and The Foundry, as well as using Mistika a little bit – it’s very fast, very effective,” added Jamie.

“We’re developing a lot more AR-based stuff, and there’s definitely been a bit of a push around event-based activations,” he added. “Clients might want 300 headsets in a room, all ready to play at once, or to have people walk into a dome, or take part in an interactive platform that goes with a VR experience like our Shard project. People are wanting appointment-based VR rather than distribution online. That is still [valid], but there’s also more of a move towards the concept of VR arcades. Some are already popping up in Asia.”

So if you want to get in on the ground floor of the next project like The View from The Shard, Jamie said Happy Finish are looking for you – if you’re a multi-skilled creative, that is. “If you are a compositor, you should start thinking about things like CG game engines, and if you’re in CG you should be getting into compositing,” he explained.

“Like most small businesses, you need to be able to take projects from start to finish,” added Tinker. “When you have people working for you who have an understanding of pre-production and how something needs to be shot to get the result you want in post-production, it definitely pays to have all of those skills – you just get better results.”

Networking while not working

Networking at IBC doesn’t just take place on the show floor. A prime example was Vinnie Hobbs, encountered at a party. This editor and owner of VH Post chatted about his approach to videos such as Kendrick Lamar’s Alright and Nicki Minaj’s Anaconda, as well as commercials and feature films such as Licks, directed by Jonathan Singer-Vine.

“I use Premiere Pro for all my biggest videos,” he explained. “It makes everything very quick – you don’t need to transcode and you can just drop everything raw in there. I think that really helps create tunnel vision; you don’t get distracted by any technical roadblocks, and just focus on the cut.”

However, the conversation was not just about the tools. “You don’t want to be an editing software operator, you want to actually have some creativity behind each cut you make,” said Hobbs. “That’s very important for each artist that you work with. I look at them and say hey, maybe they need a certain kind of approach for each edit. Maybe a more mature cut, or maybe a faster cut for the younger generation. It all depends on where they’re at, at the moment. And I think that can really translate and have a long lasting impact on their career.”

Software on show

Adobe itself was attracting a lot of attention with previews of its forthcoming Creative Cloud video tools, and a webcam setup with Character Animator, which was driving cartoon characters in real time, based on captured motion from stand visitors.

The upcoming release of the animation tool, finally coming out of beta, was being demonstrated by Adobe’s product manager for motion graphics and visual effects, Victoria Nece. “It doesn’t just feel like an experiment,” said Nece. “It’s a real production tool now.

“The new visual control panel shows everything that’s built into your character that’s able to be triggered. You can hook it up to a MIDI keyboard, or dials and knobs. You can use an iPad app that allows MIDI out, where you can build your own control interface, so you can create a whole control system.”
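
The MIDI mapping Nece describes boils down to a simple pattern: listen for note messages, look each one up in a table, and fire the matching trigger. Here’s a minimal sketch of that pattern in Python using the mido library – the note numbers and trigger names are hypothetical, and this illustrates the general idea rather than Adobe’s own plumbing.

```python
# Minimal sketch: map incoming MIDI note-on messages to named
# animation triggers. Note numbers and trigger names are hypothetical;
# Character Animator wires this up in its own UI – this only shows
# the general pattern. Requires mido (pip install mido python-rtmidi).
import mido

TRIGGERS = {
    60: "wave",             # middle C
    62: "blink",
    64: "explode_chicken",
}

def fire(trigger):
    # Stand-in for whatever the host application does with a trigger.
    print(f"trigger fired: {trigger}")

with mido.open_input() as port:  # opens the default MIDI input device
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            name = TRIGGERS.get(msg.note)
            if name:
                fire(name)
```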

The triggers panel can also run an image sequence, demonstrated by Nece with an amusing exploding chicken character (don’t worry, it’ll probably ship with the new release). New collision physics has been added, so characters can be thrown around in real time, subject to other dynamic behaviours. For more complex models, you might not want such ragdoll physics, according to Nece: “You can do some really subtle effects – there’s hair that can be affected by gravity, or there’s wind, so you can make leaves tumble or things flow around. Someone made a really nice cape for a superhero character. They pinned the cape to the shoulders, and had the wind blowing it. It’s all live, and all the triggers can nest within each other, so you can build incredibly complex things.”
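
That pinned cape is a classic bit of real-time animation technique: points integrated with Verlet steps under gravity and wind, with the anchored points pinned and simple distance constraints holding the links together. A toy sketch of the general idea (not Character Animator’s actual implementation):

```python
# Toy sketch of pinned-point physics with gravity and wind, in the
# spirit of the cape example: Verlet integration plus one distance
# constraint per link. Illustrative only – not Adobe's implementation.
GRAVITY, WIND, REST = 0.5, 0.2, 10.0

class Point:
    def __init__(self, x, y, pinned=False):
        self.x, self.y, self.px, self.py = x, y, x, y  # px/py: previous position
        self.pinned = pinned

def verlet(points):
    for p in points:
        if p.pinned:
            continue
        vx, vy = p.x - p.px, p.y - p.py  # velocity implied by the last step
        p.px, p.py = p.x, p.y
        p.x, p.y = p.x + vx + WIND, p.y + vy + GRAVITY

def constrain(points):
    # Pull each pair of neighbours back towards the link's rest length.
    for a, b in zip(points, points[1:]):
        dx, dy = b.x - a.x, b.y - a.y
        d = (dx * dx + dy * dy) ** 0.5 or 1e-9
        k = (d - REST) / d / 2
        if not a.pinned:
            a.x, a.y = a.x + dx * k, a.y + dy * k
        if not b.pinned:
            b.x, b.y = b.x - dx * k, b.y - dy * k

# A five-point "cape" hanging from a pinned shoulder point.
cape = [Point(0, i * REST, pinned=(i == 0)) for i in range(5)]
for _ in range(200):
    verlet(cape)
    constrain(cape)
print([(round(p.x, 1), round(p.y, 1)) for p in cape])
```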

New tools for VR VFX

As usual, the Fraunhofer Institutes of Germany were showing a number of audio and visual innovations at IBC, including a new lightfield plug-in from Fraunhofer IIS that lets you adjust the depth and lighting of captured footage.

Available for licensing to professional users, the plug-in is one Fraunhofer IIS hopes will be incorporated into post tools like Nuke for the creation of photorealistic VR content. It enables work with lightfield or multi-camera data for effects like refocusing, virtual camera movements, and relighting of scenes. Also new from Fraunhofer is easyDCP Publisher, an all-in-one software solution for the generation and playback of digital cinema packages (used, for example, for submitting films to festivals). This is a ‘lean’ version of the full easyDCP suite, offering a project-based licence model, so small studios and solo filmmakers should take note.
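
Refocusing from lightfield or multi-camera data classically works by shift-and-add: displace each camera’s view in proportion to its baseline offset and a chosen focus parameter, then average, so that objects at the target depth line up while everything else blurs. A minimal numpy sketch of that principle – a generic illustration, not Fraunhofer’s plug-in:

```python
import numpy as np

def refocus(views, offsets, alpha):
    """Shift-and-add refocus over a camera array.

    views:   list of HxW (or HxWx3) image arrays, one per camera
    offsets: per-camera (dy, dx) baseline offsets in pixels
    alpha:   focus parameter - the disparity brought into focus
    """
    acc = np.zeros_like(views[0], dtype=float)
    for img, (dy, dx) in zip(views, offsets):
        # Shift each view so points at disparity alpha align, then average.
        acc += np.roll(img, (round(alpha * dy), round(alpha * dx)), axis=(0, 1))
    return acc / len(views)

# e.g. a 3x3 rig: offsets = [(r, c) for r in (-1, 0, 1) for c in (-1, 0, 1)]
```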

Fast stitching for VR

Back on VR, SGO was showing off the realtime stitching and optical flow capabilities of its subscription-based software Mistika VR, with new functionality to stitch stereo 3D launched at the show. Also on show was free education software – Mistika Insight – and a preview of VFX/compositing tool Mistika FX, previously known as Mamba, working as part of an integrated workflow with Mistika VR.

“We’ve solved one of the biggest problems in VR, which is fast stitching,” said MD Geoff Mills. “Our workflow means that we can work on stabilisation and do some VFX work on the stitched VR environment, but without rendering. The stitching is still live, as we’re flowing it from Mistika VR into Mistika FX, and we can flow that into [turnkey solution] Mistika Ultimate, with the stitch still live. That has got people hugely excited, because as soon as you bake in the stitch, you have nowhere to go if you need to tweak it for the finishing.”


“With Mistika Insight we’ve launched a free training version of all the software you get with Mistika Ultimate, and we’re giving away all the training collateral that goes with it,” he continued. “For Mistika VR, we’re getting all sorts of customers from kids in their bedrooms, to major post production studios. Some of the people who are using our software are not the traditional TV and film customers. We’ve been talking to a completely new group of people, who have been doing some really cool stuff. We didn’t know they existed - but of course we do now.”
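
In software terms, the ‘live’ stitch Mills describes is non-destructive, deferred evaluation: the stitch remains a node with editable parameters, and downstream effects re-evaluate through it rather than against a baked render. A toy sketch of that design idea (not Mistika’s actual architecture):

```python
# Toy sketch of a non-destructive ("live") stitch: downstream nodes
# render through the stitch node, so its parameters stay editable
# after effects are applied. Illustrative only – not SGO's code.
class StitchNode:
    def __init__(self, cameras, blend_width=32):
        self.cameras = cameras
        self.blend_width = blend_width  # still editable at any time

    def render(self, frame):
        # Placeholder for the real warp-and-blend of camera inputs.
        return f"stitch({self.cameras} cams, blend={self.blend_width}, frame={frame})"

class EffectNode:
    def __init__(self, source, name):
        self.source, self.name = source, name

    def render(self, frame):
        # Re-evaluates its source every time – nothing is baked in.
        return f"{self.name}({self.source.render(frame)})"

stitch = StitchNode(cameras=6)
graded = EffectNode(stitch, "grade")
print(graded.render(frame=1))
stitch.blend_width = 64        # tweak the stitch after grading exists
print(graded.render(frame=1))  # the graded output reflects the change
```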
