Configuring Windows Workstation for Premiere Pro CS5.5

This article is the first in a series on configuring your Windows workstation for producing with Adobe CS5.5. The focus of this particular article is whether to buy a single-CPU or dual-CPU system for producing with Premiere Pro and Adobe Media Encoder. I’ll also cover whether it makes sense to enable or disable HTT (Hyper-Threading Technology) when it’s available on your workstation.


Figure 1. The HP Z400, our single-CPU, 4/8-core contestant.

As an overview, I tested with two similarly configured systems from HP, one with a 2.67 GHz 4-core CPU (an HP Z400), the other with two 2.67 GHz 4-core CPUs (an HP Z600). Both systems had 24 GB of RAM, so memory should not have been a limitation on either system, and both used the same graphics card, an NVIDIA Quadro FX 4800. 

I ran multiple tests, encoding sequences created from different camera formats, from DV to Red, into multiple outputs, from MPEG-2 for DVD to H.264 for YouTube and Blu-ray. I ran the first series of tests in dedicated mode, with nothing else happening on the computer. Then I ran a second series of tests with Adobe Encore rendering a very long file to H.264 Blu-ray format.


Figure 2. The HP Z600, our dual-CPU, 8/16-core contestant.

Table 1 shows the results, which I’ll explain further below. To see the detailed test results, click here. There’s a lot of synthesized information in Table 1, so let’s take a moment to walk through it.

The first column shows, on a percentage basis, how much longer it would have taken to render the sequences on the single-CPU system (the Z400) rather than the dual-CPU system (the Z600). Positive numbers mean it would have taken longer on the Z400; negative numbers mean the dual-CPU Z600 was actually slower.


Table 1. Format-by-format analysis of dual- vs. single-CPU encoding.

I color-coded the results so that all negative numbers were highlighted in red, meaning that they are bad results. Obviously, you don’t want to spend the extra dough for a second processor if it will actually slow production. At the other end of the spectrum, if the dual-CPU system would improve encoding time by more than 24%, the figure was highlighted in green, which obviously means good. Granted, 24% is an arbitrary cutoff, but it represents a meaningful time savings in a production environment.

The second column shows the same analysis with Encore rendering the Blu-ray project in the background. Beyond the performance numbers shown in Table 1, the other critical fact is that the dual-CPU Z600 finished the Blu-ray project in 1:42 (hours:minutes) while the single-CPU Z400 took 2:39, almost an hour (or 56%) longer. If you frequently multi-task and render one project while editing another, this is the column you should focus on, and the dominance of green means a second CPU could be very worthwhile.
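For the curious, here’s where that 56% figure comes from. The render times are the ones reported above; the snippet below is just a back-of-the-envelope check, not part of the test procedure.

    # Back-of-the-envelope check on the background Blu-ray render times.
    z600_minutes = 1 * 60 + 42   # 1:42 on the dual-CPU Z600
    z400_minutes = 2 * 60 + 39   # 2:39 on the single-CPU Z400

    extra = z400_minutes - z600_minutes        # 57 minutes
    pct_longer = 100 * extra / z600_minutes    # about 56%
    print(f"The Z400 took {extra} minutes, or {pct_longer:.0f}%, longer than the Z600.")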

Just so you know, I tested both systems with HTT enabled and disabled. As I’ll discuss later, I learned that it makes the most sense to run both systems with HTT enabled, so all encodes reported in Table 1 were produced with HTT enabled.

Overall, if you’re working with older formats, like DV and HDV, and you don’t multi-task, a dual-CPU workstation probably doesn’t make much sense. Surprisingly, AVCHD doesn’t benefit that much from the extra cores, and neither does DVCPRO HD. On the other hand, even in dedicated mode, DSLR footage from the Canon 7D and 5D was accelerated by 41%, XDCAM-EX footage by 40%, and Red footage sucked up the extra cores most efficiently, benefiting by 74%.

Obviously, if you multi-task, a dual-CPU system delivers significant benefits in all formats, and saves significant time on whatever background task you’re rendering at the time. Again, while performing the same series of encodes in Premiere Pro/Adobe Media Encoder, the dual-CPU Z600 finished producing the Blu-ray project almost an hour faster than the single-CPU Z400, or 56% faster.

Analysis

Why so much variation in the results? Well, let’s start with an overview of the test projects, which were a mix of real-world and synthetic projects. Real-world, obviously, means jobs that I or others have produced, while synthetic means projects created just to test performance with the specified format.

Real-world projects are good because they reflect how the formats and editors are used in the field, though they are necessarily idiosyncratic. Synthetic projects are cleaner, but may not reflect how the formats or tools operate in the real world, and can be constructed to prove any point that you want to make. It’s the classic benchmarking Catch-22.

So, admittedly, at least to some degree, the difference in performance between the formats likely relates to the differences in the various projects that I tested. To mitigate this, I used a mix of real-world and synthetic tests, with the synthetic tests being very simple projects that simply rendered one or two tracks of footage in the particular format. Note that the spreadsheet that contains the results also includes brief project descriptions.

The other performance difference between the codecs relates to how efficiently each codec works in a multiple-core environment. To utilize multiple processing units, such as the cores in a modern CPU, computer code has to be multi-threaded, or written in a way that allows it to split the work up among multiple cores. Code that’s not multi-threaded runs the same on a single-core or 16-core system, because the software routines simply can’t divide up the work.
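To make the concept concrete, here’s a purely illustrative Python sketch; it bears no resemblance to Premiere Pro’s or any codec vendor’s actual code, and the encode_frame function is a made-up stand-in for per-frame encoding work. The same job either runs serially on one core or gets farmed out to a pool of workers.

    # Conceptual sketch only: work must be divisible before extra cores can help.
    from concurrent.futures import ProcessPoolExecutor

    def encode_frame(frame_number):
        # Stand-in for real per-frame encoding work (the frame number is ignored).
        return sum(i * i for i in range(100_000))

    if __name__ == "__main__":
        frames = range(240)  # pretend we have 240 frames to encode

        # Single-threaded: one core does everything, no matter how many exist.
        serial_results = [encode_frame(f) for f in frames]

        # Multi-threaded (multi-process here): the same work split across workers,
        # so a dual-CPU, 16-core system can actually flex its extra muscle.
        with ProcessPoolExecutor(max_workers=8) as pool:
            parallel_results = list(pool.map(encode_frame, frames))

In the first version, a 16-core workstation sits mostly idle; in the second, throughput scales with the number of workers the code can keep busy, which is exactly the difference you see between poorly and well threaded codecs.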

Within Premiere Pro, software routines, typically licensed from other vendors like MainConcept, are in charge of encoding and decoding the various formats supported by the program. These routines vary in how efficiently they work with multiple CPUs. For example, the DV format has been around since before multiple-core computers existed, and is an SD format that most systems can handle with ease. It’s not a squeaky wheel that customers are screaming about. So if you were a codec vendor, you probably wouldn’t spend much time improving the multi-threaded efficiency of your DV codec.

What would that look like in the context of our experiments? Well, have a look at Figure 3, which comes from Performance Monitor, a Windows tool that charts CPU utilization over time. I ran Performance Monitor during all encodes to see how efficiently the CPU cores were being utilized, and in Figure 3, I combined the CPU utilization graphs from the two computers (Z400 on the left, Z600 on the right) so you could compare the results.

The project encoding at the time was a synthetic file consisting of ten minutes of DV footage with a simple logo overlay, encoded to DVD-compatible MPEG-2 format using the NTSC High Quality Adobe Media Encoder preset. On the left, you can see that the 4/8-core Z400 used up to 70% of the available CPU processing power, while the 8/16-core Z600 topped out at just below 40%. Despite having twice the processing power, it’s easy to see why the Z600 was only slightly faster: its CPU utilization was nearly 50% lower.


Figure 3. CPU efficiency while rendering DV footage to MPEG-2 format.
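If you want to watch something similar on your own system without opening Performance Monitor, a rough equivalent is to poll per-core utilization while an encode runs. The sketch below uses the third-party psutil package; it’s an illustrative stand-in, not the tool I actually used for these tests.

    # Illustrative only: logs per-core CPU utilization once per second for a
    # minute, roughly what Performance Monitor's CPU graph shows over time.
    # Requires: pip install psutil
    import psutil

    for _ in range(60):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for 1 second
        overall = sum(per_core) / len(per_core)
        print(f"overall {overall:5.1f}%  per core {per_core}")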

Contrast this with Figure 4, which shows the CPU efficiency while encoding a 1:20 (min:sec) Red test clip to H.264 for Blu-ray. If you study the figure closely, you’ll see that both systems were red-lined at the top of the chart, at 100% CPU utilization, which is why the Z600 produced the file over 70% faster. Whoever developed the Red codec produced a marvelously multi-threaded piece of code.


Figure 4. CPU efficiency while rendering Red footage to H.264 format.

If you looked at the comparative Performance Monitor charts for the XDCAM-EX format, you’d see that the Z600 also peaked at close to 100% efficiency, with the single-CPU Z400 proving about 60% slower in both of these tests. If you’re producing in either the Red or XDCAM-EX formats, or the H.264-based codecs used in the Canon 5D and 7D, more cores definitely seem to be better.

Output Codec

I also found that Premiere Pro was more efficient when producing H.264 rather than MPEG-2, which is shown in Figure 5. On the left, with the XDCAM HD project, you see that CPU utilization peaked at about 65% for H.264, while never crossing the 40% mark for MPEG-2. Same story to a lesser degree with AVC-Intra on the right.


Figure 5. The output codec also impacts CPU efficiency.

Again, no shock here, as MPEG-2 is another Stone Age codec that’s probably not seeing a lot of enhancement, while H.264 is still relatively new and is continually being improved. The bottom line is that you can expect more benefit from a multiple-CPU system when outputting H.264 rather than MPEG-2.

What about HTT?

HTT, or Hyper-Threading Technology, is a feature on some, but not all, Intel CPUs that adds components of a second processing core to each physical core. With HTT enabled, Windows Task Manager shows 16 active cores on the Z600 (Figure 6) and eight on the Z400. When CPU utilization is low with HTT enabled, as we saw with the DV codec, you have to wonder whether performance with HTT disabled would be faster.


Figure 6. The Z600 with HTT enabled, happily crunching the Red projects at 100% utilization.
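If you’re not sure whether HTT is enabled on your own machine, comparing physical to logical core counts is a quick check. This sketch also relies on the third-party psutil package and is just an illustration, not part of my test procedure.

    # Quick HTT check: more logical cores than physical cores means
    # hyper-threading (or another form of SMT) is enabled.
    # Requires: pip install psutil
    import psutil

    physical = psutil.cpu_count(logical=False)
    logical = psutil.cpu_count(logical=True)
    print(f"{physical} physical cores, {logical} logical cores")
    if physical and logical and logical > physical:
        print("Hyper-threading appears to be enabled.")
    else:
        print("Hyper-threading appears to be disabled or unavailable.")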

Table 2 shows my analysis of where and when HTT is beneficial. To be clear, these results compare performance in the specified formats with HTT enabled and disabled, with no other processes running. Positive numbers in red indicate that the system was faster with HTT disabled (HTT is bad), while negative numbers in green show where performance is improved with HTT (HTT is good).


Table 2. Performance with HTT enabled and disabled.

Though there are lots of red numbers for the Z600, none of them are that significant, so you probably wouldn’t disable HTT to harvest these small gains, particularly if you were running other processes in the background, which your system can handle more efficiently with HTT enabled. The predominance of green in the Z400 column indicates that most codecs work more efficiently with eight logical cores than with four.

Summary and Conclusions

Understand that while my test results are instructive, they’re certainly not universal. To use two old saws, do try this at home, and your mileage may vary. Test with your own clips, and expect some surprises.

I think that you can safely assume that most older formats won’t benefit that significantly from a dual-CPU system if you’re performing a single task only, while those editing in Red, XDCAM EX and DSLR formats will likely see some significant benefits from a dual-CPU system, even when not multi-tasking. If you render while you work, you should strongly consider a dual-CPU system irrespective of the formats that you work with.

How I Tested

Project details and performance numbers are all available here. Here’s some additional information.

Note that when calculating scores for a particular format, I only included tests that exclusively used that format. For this reason, a project like the JC Weaver concert performance, which mixed HDV with ProRes converted from AVCHD, was not counted in either the HDV or AVCHD results. Similarly, the Behind the Scenes clip, which was primarily XDCAM-EX with lots of B-roll in DSLR, AVC-Intra, and Red formats, was not included in the XDCAM-EX results.

There are lots and lots and lots of numbers in this review, and I’m working alone, so there are likely to be some mistakes. If you notice anything that looks out of whack, drop me a note at [email protected]. This is also the first time I’ve published a spreadsheet via Google Docs, so if anything gets funky with the referenced spreadsheet, drop me a note as well.

Additional Resources

Here are some other workstation-related articles that I’ve recently written that you might also find helpful if you’re shopping for an editing workstation.

HP Six-core Z400 Test Drive – Those users considering an HP workstation now have four units to choose from: the top-of-the-line dual-processor Z800 and Z600, which both sport the fancy new case that HP introduced last year, and the less-expensive Z400 and Z200, both single-CPU computers that feature updated innards but the legacy case from the older xw workstation line. The Z400 starts at $999 for a dual-core model, while the very small-form-factor Z200 starts at $699. In contrast, the Z600 starts at $1,829 and the Z800 at $2,109 (both for single-processor systems), and both scale much, much higher. (October 2010, Digital Content Producer Magazine)

HP Z800 Workstation With Intel Westmere Dual Six-Core Processor Review – In 2009, HP launched its Nehalem-based workstation line, which started with three models: the low-end Z400, the mid-range Z600, and the high-end Z800; it was later supplemented by the entry-level Z200. I had a look at the Z400, a single-CPU quad-core, and the Z800, a dual-processor, quad-core system. Now HP is updating its workstation line to incorporate Intel’s new Westmere processor, which uses 32nm manufacturing technology to enable six cores on each CPU. HP sent me one of the first dual-processor, six-core Z800 systems off the line, and I had about two days to run it through its paces for various digital video-processing tasks. (June 13, 2010)

Liquid-cooled HP Z800 Workstation Test Drive – I produce a lot of screencams and other narration-type recordings, and workstation noise is a constant concern. I also have multiple computers around my office, most of them testing some software program or rendering some project. While “cacophony” is definitely too strong a word to apply, less noise is always good. For this reason, I was excited when HP called to offer a quick spin with its new liquid-cooled Z800 workstation. (July 28, 2009)

What Makes a Workstation a Workstation – My visit to HP – HP invited a bunch of journalists, myself included, out to visit their facility in Fort Collins, CO, the headquarters for workstation design, support and marketing. Beyond the desire to meet and greet friends old and new, I had one goal – to learn what makes a computer a workstation. (November 6, 2010)

Hewlett Packard’s Nehalem-based Z400 and Z800 speed encoding performance – On March 30, 2009, Hewlett Packard announced three new workstations that leverage Intel’s new Nehalem line of CPUs. To assess the significance of these new computers to the streaming market, I tested two Nehalem-based systems against older-generation dual-core, quad-core, and eight-core systems, using a range of encoding programs, including Adobe Media Encoder, On2 Flix Pro, Rhozet Carbon Coder, Sorenson Squeeze, and Telestream Episode. (November 30, 2009)

Test Drive: Intel Nehalem, Part 1 – A few months ago, I ran some Adobe Creative Suite 4 (CS4) benchmarks on different computers that isolated how CS4 performed with formats ranging from DV to Red. Now that Intel’s Nehalem processor is upon us, those numbers are obsolete, so I’m updating them with results from two Nehalem-based workstations that I’ve been testing. In this installment, I’ll explain the tests and share DV and HDV results; next time, I’ll present the results for AVCHD, DVCPRO HD, and Red. Click to the main article to read the rest of the story. (June 8, 2009) 

Test Drive: Intel Nehalem, Part 2 – Welcome back to our presentation of how HP’s new Intel Nehalem-based workstations compare to older workstations when rendering from Adobe Creative Suite 4 (CS4). Briefly, in the last installment, I detailed the tests that I performed, and discussed the results for DV and HDV source materials. This time out, I present the results for DVCPRO HD, AVCHD, and Red and share how the Z400 and Z800 performed with Hyper-threaded Technology (HTT) enabled and disabled. Click to the main article to read the rest of the story. (April 27, 2009)

HP Z800 Review – Since the launch of the Core 2 Duo line of processors in mid-2006, new workstations have been more about evolution than revolution, with solid but uninspiring incremental performance gains. That’s no longer the case. Sporting a completely redesigned case and Intel’s new Nehalem processor, the new Z800 knocks the socks off HP’s existing workstation line, especially for video editors and streaming producers. With hyper-threading technology enabled on the HP Z800, you get 16 cores on a dual-processor, quad-core Intel Nehalem system. Rhozet Carbon Coder got them all working, too. (April 1, 2009)
