The graph below shows two off-the-shelf players retrieving the same HLS manifest under similar test conditions, with available bandwidth varying from 512 kbps at the low end to 5120 kbps at the top. On the left, Player A stopped retrieving video multiple times during the low-bandwidth periods, stalling playback; those stalls appear as the red lines at the bottom of the graph.
Player B, on the right, downshifted smoothly to the appropriate ABR stream, and playback never stopped. These tests were repeated eight times over the course of a day with similar or identical results.
So, which player would you rather deploy, A or B?
By way of background, I ran these tests for a client in the midst of a system upgrade. Rather than assuming that each player performed as advertised, the client paid me to test and see.
And that’s the point. We all know that QoE is paramount and that multiple factors contribute: encoding, delivery infrastructure, the player, and perhaps others. But unless you test and monitor each one, QoE can suffer.
DIY Player Testing
In truth, player testing doesn’t have to be exotic. The protocol I follow is to introduce a series of bandwidth adjustments via Charles Proxy or Chrome Developer Tools (one way to automate this is sketched after the list below) to determine how smoothly the player adapts to the changes. Using Chrome Developer Tools, I can see the actual segments and M3U8 files retrieved by the player. I record the tests with Camtasia and review them in non-real time, both to catch playback breaks like those shown above and to determine:
- How quickly the player adapts to the proper stream. Some players drop to the lowest-quality stream upon detecting a bandwidth change, even if the change is minor, subjecting the viewer to a very low-quality experience for a segment or two for no reason. Other players cycle through multiple layers before finding the proper one, which is plainly visible to viewers and also degrades QoE.
- How many manifest files the player downloads. Some players download a manifest file or two with every fragment, which is very inefficient. Better players download a manifest file only when changing stream quality.
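To make the protocol concrete, here’s a minimal sketch of how the throttling and request-counting steps might be automated with Puppeteer and the Chrome DevTools Protocol rather than driven manually through the DevTools UI. The player URL, the bandwidth ladder, the dwell time, and the segment extensions are all assumptions to adapt to your own setup; Charles Proxy works just as well if you prefer a GUI.

```typescript
// Minimal sketch: step a player page through a bandwidth ladder with the
// Chrome DevTools Protocol while counting manifest and segment requests.
// The URL, ladder, dwell time, and file extensions are all placeholders.
import puppeteer from 'puppeteer';

const PLAYER_URL = 'https://example.com/player.html'; // hypothetical test page
const LADDER_KBPS = [5120, 2048, 1024, 512, 1024, 5120]; // down, then back up
const DWELL_MS = 30_000; // time to hold each bandwidth tier

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const cdp = await page.createCDPSession();
  await cdp.send('Network.enable');

  let manifests = 0;
  let segments = 0;
  page.on('request', (req) => {
    const url = req.url();
    if (url.includes('.m3u8')) manifests++; // playlist fetches
    else if (url.includes('.ts') || url.includes('.m4s')) segments++; // media fetches
  });

  await page.goto(PLAYER_URL);

  for (const kbps of LADDER_KBPS) {
    // CDP expects throughput in bytes per second.
    await cdp.send('Network.emulateNetworkConditions', {
      offline: false,
      latency: 40,
      downloadThroughput: (kbps * 1000) / 8,
      uploadThroughput: (kbps * 1000) / 8,
    });
    await new Promise((resolve) => setTimeout(resolve, DWELL_MS));
    console.log(`${kbps} kbps: ${manifests} manifests, ${segments} segments so far`);
  }

  await browser.close();
})();
```

If the manifest count climbs in lockstep with the segment count even while quality holds steady, you’ve likely found one of the inefficient players described in the second bullet.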
Typically, I’ll test competitive services at the same time to gauge relative performance, which can be an eye-opener. I may also run player load-time and time-to-first-frame tests to measure startup latency (a rough probe for this is sketched below).
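For startup latency, a rough probe can simply time the gap between navigation and the video element’s first ‘playing’ event. This again assumes Puppeteer, plus an autoplaying page with a single video tag; both the URL and that behavior are assumptions to adjust.

```typescript
// Rough startup-latency probe: time from navigation to the <video>
// element's first 'playing' event. Assumes an autoplaying page with a
// single <video> tag; both the URL and that behavior are assumptions.
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  const start = Date.now();
  await page.goto('https://example.com/player.html'); // hypothetical test page

  // Resolve once the player reports it is actually rendering frames.
  await page.evaluate(
    () =>
      new Promise<void>((resolve) => {
        const video = document.querySelector('video');
        if (!video) return resolve(); // no player found; bail out of the sketch
        video.addEventListener('playing', () => resolve(), { once: true });
      })
  );

  console.log(`Startup latency: ${Date.now() - start} ms`);
  await browser.close();
})();
```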
Some clients engage when they’re considering changing their player; others want to benchmark their own performance or learn how they compare to competitive services. Sometimes it’s in response to customer complaints; sometimes it’s an attempt to avoid them.
None of this is rocket science, and you certainly can perform this type of testing yourself. The point is, you should definitely test player performance, whether internally or externally. Don’t assume your player is working well; unless and until you test, you never know for sure.