By Russell Johnson, director, Hitomi
Published by SVG Europe Wednesday, January 6, 2021
Sports broadcasting has driven technology for most of the history of television. The needs of the sports audience have driven developments, from large numbers of cameras to super slo-mo replays and from real-time graphics to immersive audio.
If you were to poll sports fans, the one thing they would insist upon above all else is that the coverage must be live. You watch not knowing what is going to happen in the next instant; psychologically, any delay breaks the spell for the viewer.
A part of this sensation of live transmission is that everything has to be synchronised. There cannot be visible jumps, say between cabled and wireless cameras. The sound has to be perfectly synchronised to the video: if the sound of the ball being kicked does not come at the exact moment the boot hits the leather, it is immediately obvious to the viewer.
To make all this happen, you need a relatively large production team – director and vision mixer; graphics and replay operators; links engineers and audio supervisors. You sit them all in a large truck on site, communicating over very busy talkback to the camera operators and microphone trackers around the ground. They can see each other; they can hear each other; (for a good crew, at least) they know what each other is thinking.
That all fell apart in 2020. COVID-19 and its social distancing meant that, even when sport was up and running again, the idea of production teams sitting up-close and personal with each other was not going to happen. Audiences, though, are used to very high-quality sports broadcasting and will not be forgiving of a drop in standards, even in these very difficult times.
Remote production for sports was already well in development. There are plenty of excellent examples – from ice hockey in Scandinavia to college football in California – which use remote production, in the sense of having the cameras and microphones on location and the production team, the switchers and the graphics back at base.
What COVID-19 has demanded is that the next stage of remote production, with the production team even more widely spread and some working from home, has to make the jump from science project to everyday reality. It has to do so without compromising what makes television sport great. That means no loss of production values, no loss of quality and no loss of immediacy.
Providing remote functionality depends upon at least some encode and decode cycles, each of which adds latency. If you are using the public internet for some of the links, this will add more and quite possibly introduce some non-deterministic delays. If audio travels over different paths, then as well as synchronisation issues, there is the possibility of phase differences which can ruin the sound.
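To see how quickly these cycles add up, here is a minimal latency-budget sketch. All of the figures and the path itself are illustrative assumptions, not measurements from any real system:

```python
# Rough latency-budget sketch for a remote production chain.
# The per-stage figures below are illustrative assumptions only.

ENCODE_MS = 35    # assumed encoder latency per cycle
DECODE_MS = 35    # assumed decoder latency per cycle
NETWORK_MS = 20   # assumed one-way transit per network hop

def chain_latency(codec_cycles, network_hops):
    """Total fixed latency for a chain with the given number of
    encode/decode cycles and network hops (milliseconds)."""
    return codec_cycles * (ENCODE_MS + DECODE_MS) + network_hops * NETWORK_MS

# A hypothetical venue -> hub -> home operator -> hub path:
# three codec cycles and three network hops.
print(chain_latency(3, 3))  # 270 ms under these assumptions
```

Even before any non-deterministic internet jitter is added, a few codec cycles alone push the chain well past what a viewer would perceive as "live sync".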
The systems engineer has to design an architecture that minimises latency as far as possible. That might mean routing signals to operators in parallel rather than in series.
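The gain from parallel routing is easy to illustrate: in series each operator's stage adds its delay in turn, while in parallel the operators work on copies of the same feed, so the path latency is only that of the slowest branch. The stage delays below are assumed figures:

```python
# Series vs parallel routing of operator stages (figures are assumptions).
stage_delays_ms = {"vision_mixer": 40, "graphics": 60, "replay": 80}

series_latency = sum(stage_delays_ms.values())    # stages chained end to end
parallel_latency = max(stage_delays_ms.values())  # slowest branch dominates

print(series_latency, parallel_latency)  # 180 ms vs 80 ms
```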
Any remote production system, however well optimised, will inevitably include latency. So, you need to be able to measure the latency at each critical point in the chain and to ensure that the final delivery is in synchronisation, as close as possible to real time.
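The basic principle behind such measurement is to inject a known reference pattern and find where it reappears in the returned signal. The sketch below is a deliberately simplified, brute-force version of that idea (real alignment tools work on dedicated test tones, not arbitrary samples, and achieve sub-sample precision):

```python
# Minimal sketch: find the delay of a returned signal relative to a
# known reference pattern by brute-force cross-correlation.
# Values and the pattern are purely illustrative.

def measure_delay(reference, received):
    """Return the shift (in samples) at which the received signal
    best matches the reference."""
    best_shift, best_score = 0, float("-inf")
    for shift in range(len(received) - len(reference) + 1):
        score = sum(r * x for r, x in zip(reference, received[shift:]))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

ref = [0, 1, 0, -1, 1]
rx = [0, 0, 0] + ref + [0, 0]   # reference delayed by 3 samples
print(measure_delay(ref, rx))    # 3
```

Measuring this at each critical point in the chain gives the per-hop figures needed to re-time the final output.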
There are some good, well-established standards to help us with synchronisation. BLITs – Black & Lane's Identification Tones, devised by two Sky engineers – identify channels in surround sound and allow for precise alignment. The latest technology achieves audio/video synchronisation of better than 1ms and a phase correlation between audio channels of 0.01 of an audio sample.
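To put those precision figures into time units, here is the arithmetic, assuming a 48kHz audio sample rate (the common professional rate; the article does not state one):

```python
# Convert the quoted precision figures into time, assuming 48 kHz audio.
SAMPLE_RATE_HZ = 48_000

samples_per_ms = SAMPLE_RATE_HZ / 1000             # 48 samples in 1 ms
phase_precision_us = 0.01 / SAMPLE_RATE_HZ * 1e6   # 0.01 sample in microseconds

print(samples_per_ms, round(phase_precision_us, 3))  # 48.0 0.208
```

So 1ms of A/V sync corresponds to 48 audio samples, and 0.01 of a sample is roughly a fifth of a microsecond of inter-channel phase alignment.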
The next generation of tools will be able to automatically measure end-to-end latencies and precisely align the output of a live broadcast, however convoluted the remote production path. Necessity being the mother of invention, that next generation will be with us before we are allowed back into outside broadcast trucks without masks.