4K Vision: Inside The BBC's Journey To Making TV In Ultra-HD

By James O’Malley

Over the summer, you might have spotted that the BBC has been quietly dipping its toe into Ultra-HD (UHD) waters. Following the successful iPlayer debut of Blue Planet II in UHD last Christmas, the corporation has more recently been offering the World Cup and Wimbledon’s Centre Court in UHD too - and according to recently released figures, these events together totalled 1.6m requests for UHD live streams on iPlayer.

The pictures, of course, look absolutely stunning: the images are sharp and detailed, thanks to the massive resolution, and the colours look incredible in High Dynamic Range (HDR). Once you’ve watched a game popping off the screen in HDR at 50 frames per second, regular broadcast coverage looks washed out and almost as old-fashioned as watching an elephant poo all over the Blue Peter studio in black and white.

What might be surprising, then, is how immature the technology still is. Even though many of us - particularly early-adopting Gizmodo UK readers - are likely to have UHD/HDR-capable TVs in our living rooms, broadcasters are still learning how to make stuff we can watch on our expensive new toys.

Just how difficult is it? Brilliantly, the BBC invited me to see their UHD setup at Wimbledon to find out.

The Workflow

To understand why live UHD is hard to make, you need to understand a little bit about how TV is made.
Think about how the pictures from Wimbledon get to your TV. There’s a bunch of cameras around the court, each of which can be tuned to shoot the type of pictures the director wants - and there’s something of an art to it. Cameras from different manufacturers will have subtle differences in colour rendition and the like - it’s like how you can shoot the same thing on both an iPhone and a Samsung and the two images will still look somehow different. When the court is surrounded by dozens of cameras, all of them need to be carefully calibrated - or ‘racked’ - to look the same, so the picture doesn’t look weird when the shot is constantly switching between views. If it works, we viewers don’t even notice this complexity.

Next, these images are sent to a room - or in Wimbledon’s case, a truck parked behind the staff canteen - containing a bank of screens and hundreds of buttons and knobs that the director and vision mixers can use to pick which shot goes out live on TV. There’s obviously a lot more complexity – in terms of adding the graphics, getting the pictures to the broadcaster and so on – but these are the basics.

Simply getting this right is significantly more difficult for live programming than something that is prerecorded: With, say, an episode of Doctor Who, the director can spend hours in the editing suite tweaking each shot to make sure it looks perfect, adjusting the levels and trying different things out. But live sports don’t have that luxury - they need to look great live.

In Wimbledon’s case, this means there’s an entire team of people tweaking the pictures in real time - after all, conditions are not static, as during any given match the sun and clouds are moving, and the weather is constantly changing.

A big team makes tiny adjustments in real time.

Sure, the cameras could instead be given fixed offsets - but that would be like applying a one-size-fits-all Instagram filter when what you really need is the full version of Photoshop. The filter might look good some of the time - but if you’re a professional, you need more control.

In other words: There are hundreds of things that go into making TV that you’ve never even thought about before.

Increasing Complexity

UHD and HDR make things even more complicated. Here’s the problem: when you’re shooting in UHD with HDR, to get the best quality and most detailed pictures, cameras need to be racked to optimise for the best contrast, colours and exposure. But what works best for UHD with HDR is vastly different from what normal HD needs.

To further complicate this, automatically down-converting the UHD/HDR images to HD is a non-starter. If you imagine the range of colours in an HDR image, the software converting the images to normal HD would have to choose how to map the HDR range to a much more limited HD palette. And with current technology, there isn’t a smart way of doing this without making big compromises on the normal HD images. The software would have to be very conservative in its choices to make sure it works for every shot fed in - but this would result in some bland-looking images, which won’t go down well with the overwhelming majority of viewers watching the normal broadcast.
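To see why a one-curve-fits-all conversion tends to flatten the picture, here’s a minimal sketch of a naive global tone-mapping operator. This is purely illustrative - the function and its peak-brightness parameters are assumptions for the example, not the BBC’s actual conversion:

```python
def reinhard_tonemap(luminance_nits, peak_sdr=100.0, peak_hdr=1000.0):
    # Illustrative sketch, not broadcast code: squeeze HDR luminance
    # into SDR range with the classic Reinhard curve L/(1+L).
    # Highlights get squashed hardest, which is exactly why a single
    # fixed curve applied to every shot tends to look flat.
    l = luminance_nits / peak_hdr
    return (l / (1.0 + l)) * peak_sdr
```

A 1,000-nit highlight ends up at just half of the SDR peak, and everything brighter gets compressed harder still - whereas a human operator can choose a different trade-off for each shot.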

So how can the BBC deliver the best pictures for both UHD viewers, and normal HD viewers at the same time? This is the big question at the heart of the UHD trial.

This isn't like Wii Sports at all.

One possible solution could be to double everything up: Use a completely separate set of cameras, vision mixers, and broadcasting equipment - essentially producing two parallel TV shows instead. But obviously this is wildly impractical: It would be super expensive and require dozens of extra people.

Instead, the focus of the trial is to learn how to make TV the way it has always been made - with one workflow producing content that will work and look good in both UHD/HDR and normal HD. And the way the BBC has figured out how to do this is interesting: the solution is to have the team of people adjust the cameras in real time for HD, and then send the resulting feed to a small box known as a “Master Setup Unit”. Here sits a master of all things UHD, who adjusts the SDR-optimised images in real time for the UHD stream, applying their tweaks on top while watching the action on a specialist £30,000 Sony monitor (and you thought that your TV was expensive).

The BBC staffer who controlled this was able to show me a live comparison of what he was doing, and it was amazing to see just how much more detail can be extracted from the images by adjusting a few settings. In one SDR shot, the crowd in the stands was basically a mass of dark green, as they were hidden under the canopy. The sky was just a big white slab. But with a tiny HDR tweak, suddenly individual features of people in the crowd became visible, and clouds on top of a blue sky appeared.

In the longer term, the intention is to essentially flip this process around, and have the UHD/HDR pictures take priority in the workflow, and then have those down-converted to normal HD. But hey, this is very definitely still early days.


The other big part of the various UHD trials involves us. The Beeb has been using the World Cup and Wimbledon to figure out how to broadcast UHD/HDR to us at home - and has been learning how to make the iPlayer handle it properly. The biggest tool in its arsenal for doing this is a standard it has co-created with the Japanese broadcaster NHK called HLG - or Hybrid Log Gamma.

Basically, HLG is a standard by which High Dynamic Range instructions for your TV can be transmitted live. It enables the BBC to send out a single stream of pictures with the instructions for both HDR and Standard Dynamic Range (SDR) embedded into it. This means the broadcaster doesn’t have to pump out multiple streams; instead, it can rely on the receiving device to do the hard work of deciding whether to simply show SDR, or use the full HDR gamut of colours.
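The trick that makes one signal serve both kinds of display is the shape of HLG’s transfer curve: the bottom half behaves like the gamma curve SDR TVs already expect, while the top half switches to a logarithm to carry the extra HDR headroom. As a rough illustration - a sketch based on the published ITU-R BT.2100 specification, not BBC code - the HLG opto-electrical transfer function looks like this:

```python
import math

# Constants as published in ITU-R BT.2100.
A = 0.17883277
B = 1.0 - 4.0 * A                 # ≈ 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # ≈ 0.55991073

def hlg_oetf(e):
    """Map scene linear light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1.0 / 12.0:
        # Lower part: square-root (gamma-like), so SDR sets see a familiar curve.
        return math.sqrt(3.0 * e)
    # Upper part: logarithmic, packing HDR highlights into the same signal.
    return A * math.log(12.0 * e - B) + C
```

The two branches meet smoothly at e = 1/12 (signal value 0.5), which is why an SDR set can display the stream directly while an HDR set unpacks the log portion into bright highlights.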

Another big challenge for live broadcasting in UHD is encoding. You might already know that to watch UHD you need a fast internet connection, because of the sheer amount of data you have to download in real time. But what you might not have realised is that live UHD requires even more bandwidth than pre-recorded content.

Why? It’s basically a question of how efficiently the BBC can squash down the data. With a pre-recorded UHD show - say, Blue Planet II, which was available on iPlayer last Christmas - BBC engineers were able to spend hours processing the files to optimise them down to a bitrate of 22Mbps. For live content, however, engineers don’t have the luxury of time - they know that viewers want to see the on-court action as close to real time as humanly possible, so the computers crunching the data are only able to get the pictures down to a bitrate of 36Mbps. Could the BBC have got Wimbledon down to a lower bitrate, so UHD would work on slower connections? Maybe, but you would have had to wait 10 hours to watch it, one engineer explained to me.
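To put those bitrates in perspective, a quick back-of-the-envelope conversion (illustrative Python, assuming decimal units: 8 bits per byte, 1,000MB per GB) shows how much data each stream moves:

```python
def gigabytes_per_hour(mbps):
    # megabits per second -> gigabytes per hour
    # (x 3600 seconds, / 8 bits per byte, / 1000 MB per GB)
    return mbps * 3600 / 8 / 1000

live_wimbledon = gigabytes_per_hour(36)   # 16.2 GB per hour
blue_planet = gigabytes_per_hour(22)      # 9.9 GB per hour
```

So the live stream pulls down roughly 60 per cent more data for the same hour of tennis - the price of not having hours to spend optimising the encode.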

And finally, there’s the (not entirely unrelated) issue of latency: even though it should be considered amazing that we have the technology to see every bead of sweat on someone’s forehead from the other side of the world in near real time, there have still been some grumbles. During the World Cup, one common complaint from people watching the UHD stream was that because it was a minute or so behind the normal TV broadcast, when a goal was scored you would hear your neighbours cheering before seeing the goal for yourself. But thanks to the trials at both the World Cup and Wimbledon, this is already something the BBC has managed to improve: over the course of the two weeks of the Wimbledon tournament, engineers were able to reduce the latency by around 20 seconds, by looking at what elements went into the broadcasting chain and optimising them. That meant that by the time of the finals, the UHD action was only around 50 seconds behind broadcast. No doubt this will improve further as trials continue.

And that is just about where the BBC is right now. Unfortunately, the people I spoke to remained tight-lipped about what the next UHD/HDR trial might be, but certainly after the World Cup and Wimbledon, it appears that the Beeb are already learning a lot.

James O’Malley is Interim Editor of Gizmodo UK and tweets as @Psythor.