Configuring Your Streaming Video (for Newbies)

November 29th, 2011

Configuring your video streams properly requires an understanding of three concepts: data rate, resolution, and frame rate. In this article, I’ll define these terms and discuss the influences that impact your choices for each parameter. Then, at the end, I’ll walk you through a decision matrix designed to help you choose the optimal parameters for your streaming video.

While this is designed for newbies, most of the concepts discussed will be valuable to all streaming producers, particularly the information regarding the average bitrates used by broadcast, business-to-consumer and business-to-business sites. So, have a glance at the table of contents shown below, and let’s get started.

 What is Data Rate?

Data rate (or bit rate) is the amount of data per second of video, usually expressed in kilobits per second (kbps) or megabits per second (Mbps). When I say that ESPN distributes their video at 800 kbps, this means that each one-second chunk of audio and video comprises about 800 kilobits of data.

Typically, when configuring the data rate in an encoding tool, you enter the video and audio data rates separately. That’s what you see in Figure 1, which shows configuration screens from Sorenson Squeeze. That’s video on the left, audio on the right.


Figure 1. Choosing data rates for video and audio in Sorenson Squeeze.

Data rate is the most important factor in streaming video quality. That’s because all streaming codecs use what’s called “lossy” compression, which means that the more you compress, the more quality you lose. For this reason, all other file characteristics (like resolution, frame rate or codec) being equal, the lower the data rate, the lower the quality of the compressed file.

Understanding Data Rate

While data rate is an absolute number that you configure into your streaming encoding software, intuitively, we understand that it’s a relative concept. That is, a data rate of 500 kbps would look great if your video is configured at a resolution of 320×240, but would look awful if configured at 1920×1080. Adjusting the frame rate from 30 frames per second to 15 frames per second can also impact the actual quality of the video. For this reason, rather than evaluating data rate itself, it’s more useful to look at a value called bits per pixel, which incorporates data rate, frame rate and video resolution to arrive at a single value.

You calculate bits per pixel by dividing the per-second video data rate by the number of pixels per second in the video file. You calculate the number of pixels per second by multiplying video height x width x frame rate. For example, if a video file had a resolution of 640×360, a frame rate of 30 and a data rate of 670 kbps, the calculation would look like this:


670,000 / (640 × 360 × 30) = 670,000 / 6,912,000 ≈ 0.097

Simply stated, there are 6,912,000 pixels per second in the video. Divide the per-second video data rate (670,000) by that number and you get 0.097 bits per pixel. In essence, this tells you that each pixel of video data has 0.097 bits assigned to it, which is an absolute value that describes how much compression is actually applied to the video file.
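If you prefer to script this calculation rather than do it by hand, here’s a minimal Python sketch of the same math (the function name is my own, not from any particular tool):

```python
def bits_per_pixel(data_rate_bps, width, height, fps):
    """Divide the per-second video data rate by the pixels per second."""
    pixels_per_second = width * height * fps
    return data_rate_bps / pixels_per_second

# The example from the article: 640x360 at 30 fps, 670 kbps
bpp = bits_per_pixel(670_000, 640, 360, 30)
print(round(bpp, 3))  # 0.097
```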

If you’re not mathematically inclined, there’s a very convenient, free tool called MediaInfo that runs on Windows, Mac and Linux platforms and is installed on every one of the computers in my office. You can watch a tutorial about MediaInfo and Bitrate Viewer, another tool I use a lot, here. As you can see in the screen shot below, MediaInfo provides a ton of file-specific data, including the bits-per-pixel value, which it mislabels as Bits(Pixel*Frame). No matter, same math, same result.


Figure 2. MediaInfo provides tons of file-specific data, including the bits-per-pixel value of the file. 

In general, for low-motion, talking-head video, bits-per-pixel values in the range of 0.1–0.15 should produce very good video quality. For example, CNN produces most of their videos at around 0.1. In fact, that’s a file from CNN that I’m analyzing in Figure 2. For higher-motion videos, you need a bits-per-pixel value of around 0.15–0.20. For example, ESPN produces most of their videos in this range.
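These target ranges are easy to encode as a lookup. A quick Python sketch, using the ranges discussed above (the content-type labels are mine, for illustration only):

```python
# Target bits-per-pixel ranges discussed above; the labels are my own shorthand.
TARGET_BPP = {
    "talking_head": (0.10, 0.15),  # low-motion content, e.g. CNN-style news clips
    "high_motion": (0.15, 0.20),   # e.g. ESPN-style sports footage
}

def in_target_range(bpp, content_type):
    """Check whether a file's bits-per-pixel value falls in the suggested range."""
    low, high = TARGET_BPP[content_type]
    return low <= bpp <= high

# The CNN file from Figure 2 sits just under the 0.10 floor
print(in_target_range(0.097, "talking_head"))  # False
print(in_target_range(0.12, "talking_head"))   # True
```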

Why the difference? Because videos with high motion, or lots of detail, are harder to compress than low-motion videos. That’s why talking-head videos encode at higher quality than soccer matches at the same bits-per-pixel value. You can see this in the video below, where the first section contains low-motion clips, the second high-motion clips, and the third high-motion clips with high detail.

Obviously, the clip was encoded using the same data rate throughout, but if you pause the video periodically, you’ll notice that the low-motion frames look a lot better. In particular, the horse-riding sequence, which includes lots of action and camera panning, becomes very blocky, while the high-motion/high-detail clip at the very end becomes very pixelated. Incidentally, I produced this clip at a resolution of 640×360, with a data rate of 500 kbps, for a bits-per-pixel value of 0.072, which is aggressive, but not unheard of.
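You can also run the math in the other direction: pick a target bits-per-pixel value for your content, and work backward to the data rate you should configure. A short Python sketch (again, the function name is my own invention):

```python
def data_rate_kbps(width, height, fps, target_bpp):
    """Work backward from a target bits-per-pixel value to a video data rate in kbps."""
    return width * height * fps * target_bpp / 1000

# The demo clip from the article: 640x360 at 30 fps, targeting 0.072 bits per pixel
print(round(data_rate_kbps(640, 360, 30, 0.072)))  # 498 -- close to the 500 kbps I used
```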