It started with a simple observation: Fox plans to upscale a 1080p HDR feed to 4K for the Super Bowl LIX broadcast. The comment came from Dan Rayburn, who shared details about Fox’s planned encoding ladder, including a maximum bitrate of 15 Mbps for 4K delivery. This revelation prompted a lively LinkedIn discussion among industry experts, exploring the decision’s technical, practical, and perceptual implications.
Here’s a deeper dive into the workflow, the questions it raises, and the broader implications for live broadcasting.
Is Native 4K Necessary?
“SDI router cross points are the issue here. 3G is the best they can do for large-scale live productions. That means 1080p60 HDR.”
— Derek Prestegard, Video Engineer
Fox’s decision to upscale 1080p HDR rather than deliver native 4K content reflects current infrastructure constraints. As Derek Prestegard highlighted, large-scale productions often rely on 3G-SDI routers, which cap workflows at 1080p60. While this limitation is rooted in legacy infrastructure, Mark Kogan pointed out that IP-based workflows like SMPTE 2110 are gaining traction and could support native 4K in the future.
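For a concrete sense of the ceiling Prestegard describes, the quick back-of-the-envelope sketch below (not Fox's actual plant math, just standard SMPTE raster arithmetic) shows why a 3G-SDI cross point tops out at 1080p60: a 10-bit 4:2:2 signal at that raster consumes essentially the full 2.97 Gbps serial rate, while the same signal at 2160p60 needs roughly four times that, which pushes a facility toward 12G-SDI, quad-link, or ST 2110 over IP.

```python
# Back-of-the-envelope check: why 3G-SDI router cross points top out at
# 1080p60, and why native UHD needs 12G-SDI, quad-link, or SMPTE ST 2110.

def sdi_payload_gbps(total_width, total_height, fps, bits=10, samples_per_pixel=2):
    """Serial bit rate for a 4:2:2 signal, including blanking (full SMPTE raster)."""
    return total_width * total_height * fps * bits * samples_per_pixel / 1e9

# 1080p60: 2200 x 1125 total raster -> ~2.97 Gbps, exactly what 3G-SDI carries
hd = sdi_payload_gbps(2200, 1125, 60)

# 2160p60: four times the samples -> ~11.88 Gbps, beyond a single 3G cross point
uhd = sdi_payload_gbps(4400, 2250, 60)

print(f"1080p60 10-bit 4:2:2: {hd:.2f} Gbps (fits 3G-SDI)")
print(f"2160p60 10-bit 4:2:2: {uhd:.2f} Gbps (needs 12G-SDI or quad-link)")
```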
One commentator, who preferred to remain anonymous, stated:
“Last time I spoke to my colleague at NBC about it, he said they’re trading off aperture (and depth of field) against noise performance (averaging 4 pixels in camera lowers the noise).”
This observation adds another layer to the discussion. For live sports, where large depth of field and reduced noise are critical, compromises in camera settings can limit the benefits of native 4K production.
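A quick simulation makes that trade-off tangible: averaging a 2×2 block of photosites, as a UHD sensor effectively does when feeding an HD signal, cuts random noise by a factor of about √4 = 2, roughly a 6 dB SNR gain. The sketch below is illustrative only, with made-up noise levels rather than a model of any specific camera.

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat gray patch from a hypothetical UHD sensor with additive noise
signal = 0.5
uhd = signal + rng.normal(0.0, 0.02, size=(2160, 3840))

# 2x2 binning: average four photosites down to an HD raster
hd = uhd.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(f"UHD patch noise sigma:       {uhd.std():.4f}")
print(f"HD (2x2 binned) noise sigma: {hd.std():.4f}  # ~half, i.e. ~6 dB better SNR")
```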
This raises a key question: if native 4K isn’t yet feasible, is upscaling the right alternative?
Does Upscaling to 4K Improve Quality?
One of the most insightful comments came from Alex Zambelli, who noted that delivering 4K adds significant cost with minimal perceptual benefit unless the scaling process adds meaningful detail to the picture.
As Zambelli explained, “All HD video distributed to UHD TVs gets upscaled to UHD by the playback device anyway, so upscaling it at encoding time doesn’t actually provide any real value to customers. In fact, it can be counterproductive: since higher resolutions require higher bitrates to maintain the same visual quality, upscaling video prior to encoding ends up unnecessarily increasing the content distributor’s CDN costs and requiring more bandwidth for smooth playback – all while providing no improvement in quality over the original HD signal.”
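Zambelli's cost argument is easy to quantify, at least roughly. The sketch below uses the 15 Mbps figure reported for the 4K rung; the 8 Mbps HD comparison rung, the audience size, the watch time, and the CDN rate are all assumptions chosen for illustration, not Fox's or Tubi's numbers.

```python
# Illustrative only: the 8 Mbps HD figure, viewer count, watch time, and CDN
# rate are assumptions; only the 15 Mbps 4K cap comes from the reported ladder.

UHD_MBPS = 15          # top rung reported by Dan Rayburn
HD_MBPS = 8            # assumed 1080p60 HDR rung for comparison
VIEWERS = 1_000_000    # assumed concurrent streaming audience
HOURS = 4              # assumed average watch time
CDN_PER_GB = 0.01      # assumed blended CDN rate, USD per GB

def delivered_gb(mbps, viewers, hours):
    """Total egress in GB: Mbps -> MB/s -> MB per viewer -> GB across the audience."""
    return mbps / 8 * 3600 * hours * viewers / 1000

uhd_gb = delivered_gb(UHD_MBPS, VIEWERS, HOURS)
hd_gb = delivered_gb(HD_MBPS, VIEWERS, HOURS)
print(f"Extra egress from upscaled UHD: {uhd_gb - hd_gb:,.0f} GB")
print(f"Extra CDN spend at assumed rate: ${(uhd_gb - hd_gb) * CDN_PER_GB:,.0f}")
```

With these illustrative numbers, the delta already runs into six figures for a single event, all of it spent carrying bits that add no detail beyond the HD master.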
In a subsequent fact-checking conversation, several comments were added that aren’t in the original LinkedIn thread. Yuriy Reznik, VP of Research at Brightcove, observed, “the only argument that can still be made is that end-user device-level upscaling may be inconsistent. Some may be using sophisticated SR techniques, while others – bicubic classics. So using encoder side high-end SR might improve quality for some. But longer term – it is indeed nonsense.”
Zambelli agreed, commenting, “Yes, that’d be the one caveat: if the quality of the scaler used before encoding is significantly superior to the quality of scalers found in TVs, there could be some advantage to upscaling the video there. But it’d need to be *significantly* better, like ML/AI-powered upscaling that adds high-frequency detail that could never be created by conventional scaling algorithms. But since in this case we’re talking about live encoding, I highly doubt that Fox is using anything like that.”
Subhrendu Sarkar added, “Apart from the minimal benefits of removing inconsistencies of device-specific upscaling, upscaling before delivery (unless it is ML or CNN based) has almost no picture quality improvements. That is precisely why I opined that the biggest benefit of doing this is the marketing claims they can make (see below).”
Concluding the fact-checking comments, Reznik reported that “At Brightcove…we detect if input is upscaled and then reduce outputs to true resolution. It is called ‘true resolution’ mode in our CAE (Content Aware Encoding) product.”
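Brightcove hasn’t published the internals of that detection, but the general idea can be sketched with a simple spectral heuristic: content upscaled from 1080p carries almost no genuine energy above the source’s Nyquist frequency, so a 2160p frame whose spectrum is empty in that band was probably not native 4K. The function below is a toy illustration of that idea, not Brightcove’s algorithm.

```python
import numpy as np

def looks_upscaled(frame, src_fraction=0.5, energy_threshold=0.02):
    """Crude spectral heuristic: if almost no energy lives above the Nyquist
    band of a candidate source resolution (e.g. 1080p inside a 2160p frame),
    the content was probably upscaled. Illustrative sketch only."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame))) ** 2
    h, w = frame.shape
    cy, cx = h // 2, w // 2
    # Low-frequency region a true 1080p-derived signal would fully occupy
    inner = spectrum[int(cy - cy * src_fraction):int(cy + cy * src_fraction),
                     int(cx - cx * src_fraction):int(cx + cx * src_fraction)]
    high_freq_energy = spectrum.sum() - inner.sum()
    return high_freq_energy / spectrum.sum() < energy_threshold
```

A CAE-style pipeline could use a check like this to cap the top of the ladder at the detected true resolution rather than paying for a synthetic 2160p rung.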
Back in the LinkedIn commentary, Yoeri Geutskens observed that the HDR workflow itself introduces challenges:
- While production workflows often rely on BT.2020 with HLG, North American distribution primarily uses PQ (Perceptual Quantizer).
- In Geutskens’s words, HLG is “almost NEVER used as a distribution format in North America.”
Mark Kogan agreed but pointed out that “fine-tuning” is often required to ensure transparency throughout the distribution chain.
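The “fine-tuning” Kogan mentions exists because HLG and PQ are genuinely different transfer functions: PQ (SMPTE ST 2084) maps code values to absolute luminance up to 10,000 nits, while HLG (ITU-R BT.2100) is scene-referred and display-relative. The reference curves below, shown only to illustrate why an HLG production feed cannot simply be relabeled as PQ for distribution, follow the published standards; the sample values are arbitrary.

```python
import math

def pq_eotf(code, peak_nits=10000.0):
    """SMPTE ST 2084 PQ EOTF: normalized code value -> absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code ** (1 / m2)
    return peak_nits * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

def hlg_oetf(scene_linear):
    """ITU-R BT.2100 HLG OETF: normalized scene light -> code value (display-relative)."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if scene_linear <= 1 / 12:
        return math.sqrt(3 * scene_linear)
    return a * math.log(12 * scene_linear - b) + c

print(f"PQ code 0.58  -> {pq_eotf(0.58):.0f} nits (absolute luminance)")
print(f"HLG scene 0.5 -> code {hlg_oetf(0.5):.3f} (relative, no fixed nit level)")
```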
The PR Value of 4K
Subhrendu Sarkar, Engineering Leader at Warner Bros. Discovery, added that “streaming in 4K” carries significant PR value, regardless of the technical trade-offs. For Fox, the decision to label the broadcast as “4K HDR” could be as much about audience perception as technical excellence.
Derek Smith from Disney added a lighter note:
“I guarantee this is so they can advertise, ‘In 4K!'”
This quip underscores the role marketing plays alongside technical considerations in decision-making.
Does the Bitrate Need to Be So High?
“HF is absent or fake. With HEVC, there are more effective ways to synthesize shaped HF noise than by encoding it as is.”
— Yuriy Reznik, VP Research at Brightcove
Yuriy Reznik argued that allocating a maximum bitrate of 15 Mbps for upscaled 4K may not be efficient. In his view, much of the high-frequency (HF) detail in upscaled content is synthetic and doesn’t warrant the same bitrate as native 4K. He suggested that modern codecs like HEVC offer tools to optimize such workflows, including adaptive quantization and perceptual tuning.
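As a concrete and purely illustrative example of the kind of tooling Reznik is pointing at, an x265-based live encode can enable adaptive quantization and psycho-visual rate-distortion so that bits are not wasted faithfully reproducing high-frequency detail the upscaler invented. Nothing below reflects Fox’s actual pipeline; the filenames, preset, and bitrates are placeholders apart from the reported 15 Mbps cap.

```python
import subprocess

# Illustrative command only (placeholder filenames and settings, not Fox's):
# upscale a 1080p60 HDR mezzanine to 2160p and encode with HEVC, leaning on
# x265's adaptive quantization (aq-mode) and psycho-visual tuning (psy-rd,
# psy-rdoq) so synthetic high-frequency detail doesn't soak up bitrate.
cmd = [
    "ffmpeg", "-i", "input_1080p60_hdr.mxf",
    "-vf", "scale=3840:2160:flags=lanczos",               # the contested upscale step
    "-c:v", "libx265", "-preset", "fast",
    "-b:v", "12M", "-maxrate", "15M", "-bufsize", "30M",  # capped at the reported 15 Mbps
    "-x265-params", "aq-mode=3:psy-rd=2.0:psy-rdoq=1.0",
    # (Full HDR10 signaling -- color primaries, transfer, mastering metadata --
    # is omitted here for brevity.)
    "output_2160p60_hevc.mp4",
]
subprocess.run(cmd, check=True)
```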
Mark Kogan proposed two alternatives: letting the TV scale upon delivery, as recommended above, and encoding with AV1. “It’s better to use 1080p60 all the way as a native source resolution, upscaled to HDR with BT.2020 wide gamut, rather than create artificial 4K… Much better experience, in my view. 4K signals have a better effect on larger screens, 75″ and above. But I would also try the AV1 codec rather than HEVC. :)”
That said, while AV1 delivers better compression efficiency, its higher computational demands make it challenging for real-time applications like live broadcasting. Also, at least with hardware transcoders, the improvement that AV1 delivers over HEVC is minimal in many cases (see the Moscow State University study here).
The Bigger Picture
“As usual, it’s a balancing act between technical excellence and justified business objectives.”
— Andrew Krupiczka, Senior Software Engineer
Fox’s decision reflects a delicate balance between infrastructure limitations, cost considerations, and viewer expectations. While upscaling 1080p HDR to 4K may not deliver the same quality as native 4K, it allows Fox to meet PR demands and offer an enhanced viewing experience without overhauling its production chain.
As IP workflows like SMPTE 2110 and more efficient codecs like AV1 or VVC become mainstream, the future of live sports production may lean toward native 4K. For now, Fox’s approach underscores the compromises broadcasters make to navigate the intersection of technology, cost, and audience perception.
Break a Leg, Fox
Fox has some exceptionally bright broadcast and streaming professionals onboard, and I’m sure they made the best technical decisions for their broadcast goals and infrastructure. Here’s hoping that Fox does a fabulous job with the broadcast and will share the reasoning behind their technical decisions at some later point. I know that all the commentators above share this sentiment.
Comments
All in all, I enjoyed your article. I have several comments:
First, Fox’s Super Bowl, as broadcast on the Fox channel, was in their standard 720p60 resolution, the same as all of their regular programming, not even 1080p60, which continues to be subpar, at the very best, for a Super Bowl broadcast.
Secondly, for the broadcast of the Super Bowl on their Tubi streaming channel, they heavily advertised this so-called 4K. But, as you implied, it was upscaled from a 1080p source, and I am not even inclined to agree that the source was 1080p, since Fox continues to broadcast all of its programming at a 720p60 resolution and frame rate. Upscaling from a 1080p60 source to 4K does not improve picture quality; it actually worsens it when viewing the upscaled stream on a native 4K display, because you cannot add picture detail that isn’t there to begin with. And since Fox continues to use an even lower resolution (since 2008, Fox has broadcast at an atrocious 720p60, as does ABC), a 4K image upscaled from a 720p60 source would look even worse.
I switched over to Tubi multiple times during the game (and during the halftime show) to see what the so-called “4K” image looked like.
It was bad, so bad that I could not watch it.
I switched back to the normal 720p60 broadcast on Fox and watched that instead. It was bad, but not as bad as the Tubi broadcast. My LG OLED TV, like those from all major 4K TV manufacturers, upscales to the panel’s native 4K resolution, and in my humble opinion my TV’s upscaling looked better than the Tubi broadcast.
Going on 17 years now, Fox has refused to upgrade to a higher and better broadcast resolution (at least get to 1080p60, for crying out loud), under the guise that the underlying infrastructure of affiliate Fox channels around the country does not support any higher resolution, and that their “typical” viewers are mainly over-the-air viewers who do not own HDTVs that support the ATSC format necessary for a higher resolution (what a bunch of bollocks). Fox (Murdoch) is simply too cheap to spend the money on it. I guess until Fox viewers make their voices heard in numbers, nothing will change, and Fox is probably right. Most likely, 50 years from now, Fox will still be broadcasting in 720p60.
Fox touted the following in an article released in Forbes: “… To cap its 30th season broadcasting the NFL, Fox will televise its 11th Super Bowl.
And for Super Bowl LIX, the network is rolling out some exciting, new gadgetry and the most cameras it ever has used for a Super Bowl.
For a typical game, Fox only has 35 to 40 cameras for the action. But for Super Bowl LIX, it will have 147 total cameras, including 85 game cameras — 27 super slow motion, 23 high resolution, 16 robotic, 10 wireless and two SkyCams.
Though that number is impressive, it can be a bit misleading because you only need so many cameras for live game action.
But a major benefit of the additional cameras is instant replay.
Fox is particularly enthused about its high SkyCam providing the payload for a Sony HDC-P50A 4K box camera. That will provide a great analytical view for Tom Brady because it has higher resolution, shoots at a higher frame rate and can show a wider view of the field. …”
This touting of technology was a bunch of crap, since the broadcast could not use the higher resolution from those cameras, and more importantly because, for some bizarre, unknown reason, Fox chose to show very, very few replays during the broadcast, which was maddening and frustrating. That was another reason not to watch the Tubi stream: it was just a live stream and could not be paused and rewound for another look at a play. I found myself rewinding and rewatching many plays that Fox chose not to show again.
You stated towards the end of your article, “While upscaling 1080p HDR to 4K may not deliver the same quality as native 4K, it allows Fox to meet PR demands and offer an enhanced viewing experience without overhauling its production chain.”
I hear what you are saying and one could agree with it at face value.
But I believe Fox looks at this “public perception” as a way to pull one over on the viewing public yet again, at little or no cost. Over the past several years we have seen this from Fox time and time again, as they manipulate and brainwash their viewers to fit their agenda and bottom line. Fox really does not care about its viewers as long as it can keep them tuning in.
All in all, a very, very frustrating and unenjoyable Super Bowl broadcast (also because my beloved Chiefs were totally outplayed by the Eagles; my hat’s off to the Eagles). The NFL, with all of its clout and all the money it makes for itself and the broadcasting companies, should stipulate a higher minimum broadcast resolution (at least 1080p60) for all of its games, but especially for the Super Bowl, one of the most-watched sporting events in the entire world!
“Fox’s Super Bowl, as broadcast on the Fox channel, was in their standard 720p60 resolution, the same as all of their regular programming, not even 1080p60, which continues to be subpar, at the very best, for a Super Bowl broadcast.”
Fox’s ATSC 1.0 stream is 720p, but their ATSC 3.0 stream is 1080p60!