Does Anyone Care About Constrained VBR Anymore?

I’m comparing cloud-based per-title VOD encoding technologies for a report I hope to publish in the near term. To create a level playing field, I set the following specifications.

  • 2-second GOP
  • 200% constrained VBR
  • 2-second VBV buffer

Kind of like you were encoding with FFmpeg and using these parameters:

-b:v 2600k -maxrate 5200k -bufsize 5200k
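The same arithmetic generalizes to any rung: maxrate is 200% of the target, and bufsize holds two seconds' worth of the target bitrate. Here's a minimal sketch of that mapping (the `vbr_flags` helper is mine, not part of FFmpeg or any service's API):

```python
def vbr_flags(target_kbps: int, constraint_pct: int = 200, buffer_secs: int = 2) -> str:
    """Build FFmpeg rate-control flags for a constrained VBR encode.

    maxrate caps the peak at constraint_pct of the target (200% = 2x),
    and bufsize sizes the VBV buffer to buffer_secs worth of the target.
    """
    maxrate_kbps = target_kbps * constraint_pct // 100
    bufsize_kbps = target_kbps * buffer_secs
    return f"-b:v {target_kbps}k -maxrate {maxrate_kbps}k -bufsize {bufsize_kbps}k"

print(vbr_flags(2600))  # -b:v 2600k -maxrate 5200k -bufsize 5200k
```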

All cloud services let you specify these parameters when encoding with a pre-defined ladder, but several don’t let you specify them with their per-title features; instead, they configure them automatically. Even those that do let you set these parameters may exceed the target maximum bitrate, just as FFmpeg occasionally does.

One particular file stressed the encoders more than others, producing the results shown in Table 1. Services B and E let you specify the maximum bitrate with per-title encodes; E hit the target and B didn’t. FFmpeg also lets you specify it, and its output was well over the target.

Services A, C, and D don’t let you specify it; Service A came in very high, Service C below the desired level, and Service D a bit over. The bitrates all differ because these are per-title encodes and each service produces a unique encoding ladder.

In comparing quality, I’m looking at average VMAF, quality variability, and low-frame VMAF. This issue arose because Service E suffered a low-quality region in this particular file and Service B didn’t; the low-quality region fell right where Service B’s data rate spiked. This is what you see in the Results Plot below, which tracks the per-frame VMAF scores, with Service E in red and Service B in green.
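For concreteness, the three metrics can be computed directly from a list of per-frame VMAF scores. A small sketch follows; note that taking "low-frame VMAF" as the single worst frame and "variability" as the standard deviation are my assumptions, not formal definitions from any tool, and the `vmaf_summary` helper is hypothetical:

```python
import statistics

def vmaf_summary(frame_scores: list[float]) -> dict[str, float]:
    """Summarize per-frame VMAF scores for one encode.

    Assumes 'low-frame VMAF' means the single worst frame score and
    'variability' means the standard deviation across frames.
    """
    return {
        "average": statistics.mean(frame_scores),
        "variability": statistics.stdev(frame_scores),
        "low_frame": min(frame_scores),
    }

# A drop in one challenging region drags low_frame down
# far more than it drags the average down.
print(vmaf_summary([95.0, 93.0, 60.0, 94.0]))
```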

Service E met the 200% constraint requirements, but the quality dropped in the challenging region. Service B exceeded the target and the output quality was higher.

Back in the day, we deployed constrained VBR because it helped ensure smooth delivery to lower-performing connections. That may still be true for 4K/8K files encoded at 40 Mbps, but probably not for files in the 3-6 Mbps range delivered to typical targets.

If we don’t care about constraining the data rate anymore, I should re-encode with Service E without any maximum bitrate limit. If we do, I should re-encode with Service B at increasingly constricted settings until it meets the limit or, if that’s not possible, caveat the quality scores of all services that didn’t meet the 200% target.

Any thoughts? I’d appreciate any answers to the following questions:

  • Do you care about constrained data rates anymore?
  • If so, what percentage constraint do you use? 150%? 200%? 400%?
  • How valuable a feature in a per-title service is the ability to set a constraint level and have the service meet it?

You can answer as a comment or email me at [email protected]. All emailed answers will be kept confidential unless you agree otherwise. Thanks!

About Jan Ozer

I help companies train new technical hires in streaming media-related positions; I also help companies optimize their codec selections and encoding stacks and evaluate new encoders and codecs. I am a contributing editor to Streaming Media Magazine, writing about codecs and encoding tools, and have written multiple authoritative books on video encoding, including Video Encoding by the Numbers: Eliminate the Guesswork from your Streaming Video and Learn to Produce Video with FFmpeg: In Thirty Minutes or Less. I have multiple courses relating to streaming media production. I currently work as a Senior Director in Marketing.
