Constrained VBR Levels of the Rich and Famous

I’ve been wondering if publishers still care about constraining their maximum bitrates (see here for more background), so I grabbed my copy of youtube-dl and performed a little study. The TL;DR version is that newer media sites like YouTube, Facebook, and Vimeo don’t strictly adhere to the typical 200% constrained VBR limit, while traditional publishers seem to, though data is definitely limited in the latter case.

Also noteworthy is that I saw no VP9 usage outside of YouTube and Facebook, and no AV1 usage outside of YouTube and Vimeo. However, my search wasn’t exhaustive and didn’t include Netflix, a known user of VP9 and AV1. Of course, I didn’t see any HEVC either, but I didn’t expect to, for reasons explained at the bottom.

How I Tested

Here’s what I did. I visited a site, tested if youtube-dl worked, and started downloading files if it did. I tried to find files with as much motion as possible to stress the codec; a simple talking head obviously won’t cause any data rate spikes. I tested a minimum of three clips per site, which is admittedly a small sample. Once I downloaded a file, I measured the average and max bitrate in Elecard StreamEye and grabbed the resolution and codec used as well (Figure 1).

Figure 1. I used Elecard StreamEye to grab codec and bitrate data.
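
If you don’t have StreamEye, you can approximate the same measurements with youtube-dl and ffprobe. Here’s a rough sketch with a placeholder URL; the last command sums packet sizes into one-second buckets, which only loosely tracks what StreamEye or Bitrate Viewer reports:

youtube-dl -f "best[ext=mp4]" -o clip.mp4 "https://example.com/video-page"

ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,width,height,bit_rate -of default=noprint_wrappers=1 clip.mp4

ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,size -of csv=p=0 clip.mp4 | awk -F, '{b[int($1)]+=$2*8} END {for (s in b) if (b[s]>m) m=b[s]; print m, "bps peak over a 1-second window"}'

The second command prints the codec, resolution, and average bitrate (bit_rate can read N/A in some containers, in which case divide stream size by duration); the third prints the highest one-second bitrate.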

In the tables below, you’ll see the resolution, codec, and average and max bitrate of the files, plus the multiple of average bitrate to max bitrate. Under the assumption that anything under 200% constrained VBR is “safe,” I formatted the multiples as follows:

  • White (no highlight) – under 2x
  • Yellow – 2-3x
  • Light orange – 3-4x
  • Light mauve – over 4x

Let’s jump in.

Why Care About Constrained VBR?

By way of background, this whole max-bitrate issue arose because of test conditions for an upcoming report on cloud-based per-title encoding capabilities. Specifically, I attempted to implement 200% constrained VBR for all services. One test clip, a short snippet of Tears of Steel (TOS), showed the following.

  • Service B exceeded the 200% constrained VBR limit, as shown in Figure 2, and maintained good quality.
  • Service E maintained the limit, but quality suffered around 30 seconds in.

Figure 2. Service B exceeded 200% constrained VBR and maintained quality; Service E maintained 200% constrained VBR, but quality suffered.

Here are data rate views from Bitrate Viewer, which shows slightly different average and maximum bitrate findings than StreamEye. First is Service B, which spikes the data rate around 30 seconds in, right where the quality drops for Service E.

Figure 3. Here’s the per-second data rate in Bitrate Viewer; note the spike right at the spot of the quality drop.

Here’s Service E, which is clearly much more regulated and shows no similar spike.

Figure 4. Here’s the per-second data rate in Bitrate Viewer; the average bitrate is clearly more tightly constrained.

This raised several questions:

  • Should you reward Service B for maintaining quality, or Service E for honoring the constraint?
  • Is 200% constrained VBR the right number for files encoded at these bitrates (around 2.5 Mbps)? Or is 400% better? (A 200% constrained encode is sketched below.)
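
For reference, here’s what a 200% constrained VBR encode looks like as an FFmpeg command at the roughly 2.5 Mbps rate in question. This is just a sketch of the constraint under test, not any service’s actual command string, and the file names are placeholders:

ffmpeg -i TOS.mp4 -c:v libx264 -b:v 2500k -maxrate 5000k -bufsize 5000k -preset slow TOS_constrained.mp4

Setting -maxrate to twice the -b:v target is what makes this 200% constrained; the 400% alternative would raise -maxrate to 10000k.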

YouTube: Maybe Constrained VBR

I thought the answer could be found by asking WWYTD (What Would YouTube Do?), so I uploaded the file to YouTube and found a very curious result. Specifically:

  • YouTube performed almost exactly the same as Service E, maintaining bitrate control on this clip (230% constrained VBR) and showing the same drop in quality.
  • This occurred despite the fact that YouTube greatly exceeded the 200% constraint mark on many other clips.

Figure 5. YouTube appears to be maintaining a bitrate constraint with its AVC encode, just like Service E.

Here’s a Bitrate Viewer view of the YouTube file. Though the bitrate spikes a bit in the problem area, it’s nowhere near as high as Service B.

Figure 6. Bitrate Viewer showing the YouTube encode.

Looking at Bitrate Viewer, it appears that YouTube constrains the maximum bitrate, though it’s impossible to tell for sure. I ran the same analysis with the VP9 file and found the same result, which you can see at full resolution by clicking the image.

Figure 7. YouTube appears to be maintaining a bitrate constraint with this VP9 encode just like Service E.
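
For anyone replicating the codec-by-codec comparison, youtube-dl’s format filters can pull a specific codec’s rendition. Here’s a sketch with a placeholder URL; the exact vcodec strings vary by site, and these grab video-only streams, which is fine for bitrate analysis:

youtube-dl -f "bestvideo[vcodec^=avc1]" -o clip_avc.mp4 "https://www.youtube.com/watch?v=XXXX"

youtube-dl -f "bestvideo[vcodec^=vp9]" -o clip_vp9.webm "https://www.youtube.com/watch?v=XXXX"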

After noting that YouTube seemed to respect the constraint limitation, I sorted the YouTube clips by codec and found the following:

  • Average maximum for AV1 – 3.5x, with a maximum of 4.9x
  • Average maximum for AVC – 2.5x, with a maximum of 3.5x
  • Average maximum for VP9 – 2.3x, with a maximum of 3.7x

Table 1. The average constraint multiple for YouTube was 2.8x, but this was increased by the AV1 variations.

I’m not sure why AV1 would have a larger multiple; it could be that the AV1 bitrate simply varies more than AVC or VP9, it could be a different encoding command string, or it could be something completely different. But it appears that at least for AVC and VP9, YouTube is respecting 200%-ish constraints, though there are outliers for both codecs that may indicate otherwise.

It’s also significant that the 8K soccer clip that YouTube encoded has a maximum bitrate of 3.6x the average bitrate. One logical strategy might be to avoid strict maximum bitrate limits on lower-resolution (and lower-bitrate) files, where bitrate swings probably won’t cause a problem, but apply a tighter cap on higher-resolution (and higher-bitrate) files, where bitrate spikes could interrupt playback. YouTube doesn’t appear to be doing that. I’m not saying that it should, just observing that a 52.5 Mbps delta between average and maximum is a scary number (working backward from the 3.6x multiple, that implies an average around 20 Mbps and a maximum over 70 Mbps).

Vimeo: Definitely Constrained VBR (via Capped CRF)

Table 2 shows the data for Vimeo.

Table 2. Vimeo data.

Because Vimeo is a UGC site, I was able to upload the same problem file and gauge the output. Note that the problem doesn’t look as severe as in Figure 3, but that’s because Vimeo’s data rate was 42% higher, and its maximum rate 10% higher, than YouTube’s.

Figure 8. Vimeo shows a similar effect indicating that a bitrate cap is at work.

With Vimeo, we don’t have to guess what the constraint is; we can check in MediaInfo (HTML view) and see the following (Figure 9). As I’ve observed before, Vimeo is using capped CRF, in essence using the following FFmpeg command string:

ffmpeg -i TOS.mp4 -crf 20 -maxrate 5500k -preset slow -bufsize 15000k -an TOS_Vimeo.mp4

This tells FFmpeg to encode at a CRF value of 20 but cap the bitrate at 5,500 kbps with an approximately 3-second buffer (15,000 kb divided by the 5,500 kbps cap). A large buffer like this lets the data rate swing up to 9,766 kbps. To get a sense of the impact of the buffer on the bitrate maximum, I duplicated Vimeo’s encoding parameters with a 5,000 kb buffer size, and the maximum bitrate dropped to 7,242 kbps.
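
For reference, here’s what that buffer test looks like as a command; it’s simply the reconstructed Vimeo string from above with the smaller buffer (the output file name is mine):

ffmpeg -i TOS.mp4 -crf 20 -maxrate 5500k -preset slow -bufsize 5000k -an TOS_Vimeo_5000.mp4

As a rule of thumb, the larger -bufsize is relative to -maxrate, the further the short-term bitrate can swing above the nominal cap.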

Figure 9. Vimeo uses capped CRF, as shown in MediaInfo.

Here’s what the Tears of Steel bitrate looks like in Bitrate Viewer; it appears that a consistent constraint is being applied.

Figure 10. It appears that Vimeo is applying maximum bitrate constraints here.

Looking back at Table 2, the Friend of a Friend video used the same formula as TOS and topped out at 2.3x. Rise did as well, matching TOS at 1.8x. I’m not sure what happened with the Sense of Flying video, as its encoding settings weren’t available in MediaInfo.

Despite the one outlier, it appears that Vimeo is pretty serious about maintaining a reasonable bitrate cap on most videos.

Facebook: No Apparent Constraints

Table 3 shows the Facebook results, with Tears of Steel (TOS) at the very top. Facebook uses an AI-based algorithm to determine the encoding parameters used for each clip, and mine didn’t rate a VP9 or even 1080p encode. Instead, Facebook produced a 720p AVC version with a maximum bitrate almost 4 times larger than the average.

Table 3. Facebook results show minimal concern for bitrate variations.

Here’s the Bitrate Viewer view, with the spike right at the problem area.

Figure 11. Bitrate Viewer seems to indicate that Facebook isn’t strictly controlling the maximum bitrate.

I’ve included the Facebook quality graph in Figure 12; click it to have a look. You’ll see that while there is a quality hit at the 30-second mark, it’s pretty much in line with the rest of the clip; Facebook’s 720p version is lower in quality throughout than Service B’s, which was encoded at a higher resolution and a 20% higher bitrate.

Figure 12. Facebook avoided the huge quality drop seen with other encodes that maintained the maximum bitrate.

Otherwise, Facebook serves a variety of form factors and codecs. Despite 65% of video views coming from mobile viewers, Facebook’s average maximum-to-average multiple is 3.8x, with a maximum of 9.5x. Facebook may be applying some constraints, but they are clearly well beyond 200% constrained VBR.

Three-Letter Network: Definitely Constrained VBR

I was able to download videos from one of the three-letter networks; I won’t disclose which. I downloaded three files: two action shows from prime time and one soap opera. As you can see from Table 4, while the data rates are surprisingly aggressive, the bitrate caps are very conservative and well within the 200% max.

Table 4. Very conservative results from a three-letter network.

Apple Movie Trailers: No Apparent Constraints

All the categories until now have involved video delivered via some form of adaptive bitrate streaming; Apple’s videos, which were downloaded in MOV format, appear to be single files played back via progressive streaming. The data in Table 5 appears to show that Apple’s encoding strings don’t attempt to keep the maximum data rate within 2x or a similar multiple.

Table 5. Apple doesn’t seem overly concerned about bitrate maximum in these files.

CNN: Definitely Constrained VBR

CNN has really boosted the quality of its videos; the last time I looked, the service was broadcasting in the 1,200 kbps range. Now it’s maxing out at 8 Mbps and 1080p. Very impressive for an AVOD service.

Like ESPN, much of CNN’s video is recorded live, and I don’t know if it is re-encoded for VOD. Still, three out of four files are within the 2x limit, and the outlier Texas Teacher video is all slides; it looks like CNN exceeded 2x because the overall bitrate was low, not because the max bitrate was high.

Table 6. CNN appears pretty conservative with its maximum bitrates.

ESPN: N/A

Speaking of ESPN, all-white (no highlighted cells in Table 7) means that the service is under 2x for all tested videos. From the consistent data rates and low maximums, I’m guessing that these are live videos, which must be encoded with much less variation, so they don’t have much bearing on this conversation. I’m including them for the sake of completeness, but they are not included in the average figures presented below.

Table 7. ESPN’s videos are probably live videos, so I’ve excluded them from the overall averages.

News Organizations: Mixed Bag, but No Tight Constraints

I downloaded videos from three prominent newspapers. I confirmed that the New York Times and Washington Post both delivered their files via HLS, but wasn’t able to verify this for the Wall Street Journal.

The New York Times delivers very high bitrate video with excellent quality. Though three of the four clips were under the 2x constraint, the maximum bitrate of the first clip seems to indicate that controlling the maximum bitrate is not a priority.

The Wall Street Journal is much more aggressive with its encoding and doesn’t appear concerned about maintaining a strict 2x constraint.

The Washington Post is the most aggressive bitrate-wise, with all clips over 2x and two just under 4x.

Table 8. Video from three prominent newspapers.

Here are the average figures, though I’m not sure how much they add to the conversation.

Table 9. Average figures (excluding ESPN).

Table 10 summarizes the results from each service. It seems abundantly clear that most of these providers, when distributing to web browsers, don’t prioritize maintaining bitrate control over their video files; most come nowhere near the 200% level.

Table 10. Service provider summary.

To be clear, however, I’m not saying that these providers use the same strategy when distributing to smart TVs, dongles, or other devices with less CPU/memory than the average smartphone. I downloaded most files with youtube-dl, which, according to this post, acts like a web browser, so I didn’t download any videos encoded for TV distribution. That’s why it was no surprise that I saw no HEVC downloads.
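
You can check which codecs a site exposes to a browser-like client before downloading anything. Here’s a quick sketch using youtube-dl’s format listing (the URL is a placeholder):

youtube-dl -F "https://www.youtube.com/watch?v=XXXX"

Each row of the output shows a format ID, container, resolution, and codec string (avc1, vp9, av01, and so on), which makes the absence of HEVC (hvc1/hev1) renditions easy to confirm.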

TVs and similar devices present different issues than computers or even mobile devices. Here’s a comment from video engineer Derek Prestegard on this LinkedIn post.

Some devices like cost-optimized smart TVs have very small amounts of memory available for buffering video segments. When pushing relatively high bitrates like 15+ Mbps for UHD, you’re often limited to only a few seconds of buffer.

You really need to be careful here because overflowing this buffer can cause a really nasty crash that requires unplugging the TV to recover from. I have first hand experience dealing with an extremely popular brand of TVs that are guilty of such spectacular behavior.

As a counterpoint, it seems unlikely that YouTube encodes separately for smart TVs, and YouTube certainly has its share of TV viewers (more than 120 million people in the US watch YouTube on smart TVs). If YouTube’s encoding practices were causing problems, it would have changed them. On the other hand, YouTube doesn’t encode to HEVC, so perhaps it’s problems with HEVC playback that Prestegard is reporting.

The Bottom Line

Here’s the bottom line.

  1. Constraining the maximum bitrate can introduce transient quality problems in your videos.
  2. An unconstrained stream can introduce playback issues, particularly on lower-power devices.
  3. It appears that many high-profile sites don’t implement strict bitrate control when encoding video to be played in the browser.
  4. If you’re distributing primarily to the living room, none of the data in this article is particularly relevant, except perhaps what you see for the three-letter network.

I’ll conclude by saying that it’s always risky to draw conclusions from insufficient data, or from good data not particularly well analyzed. If you see any obvious errors or faulty conclusions, please let me know at [email protected].

About Jan Ozer

I help companies train new technical hires in streaming media-related positions; I also help companies optimize their codec selections and encoding stacks and evaluate new encoders and codecs. I am a contributing editor to Streaming Media Magazine, writing about codecs and encoding tools. I have written multiple authoritative books on video encoding, including Video Encoding by the Numbers: Eliminate the Guesswork from your Streaming Video (https://amzn.to/3kV6R1j) and Learn to Produce Video with FFmpeg: In Thirty Minutes or Less (https://amzn.to/3ZJih7e). I have multiple courses relating to streaming media production, all available at https://bit.ly/slc_courses. I currently work at www.netint.com as a Senior Director in Marketing.
