I recently interviewed David Ronca, who you probably know from his work at Meta and Netflix, but who’s now running his own company, RoncaTech. We discussed the upcoming release of VCAT (Video Codec Acid Test), a tool that benchmarks video playback performance on Android devices.
By way of background, I reviewed an earlier version of VCAT for Streaming Media Magazine. Since then, the workflows and architecture have been rebuilt to accommodate Android’s evolving security and power management constraints. VCAT comes in two parts: the Android app, which runs on the target device and performs the tests, and VCAT Web, a desktop companion that connects to phones over USB or Wi-Fi.
VCAT Web lets you run tests on multiple devices simultaneously, provides richer data and better visualization (since it’s not subject to Android app sandboxing), and offers a convenient way to push or pull files to and from Android phones. In this interview, Ronca focused on the app itself, how it works, who it’s for, and where it might go next. You can read and see more about VCAT Web in my review.
What VCAT Is, What It Does, and Why It Exists
I first interviewed Ronca about VCAT at Mile High Video, and you can read the origin story in that Streaming Media article. He revisited those points at the start of our conversation, with a focus on how the tool evolved and why it’s needed.
“VCAT stands for Video Codec Acid Test. And to be specific, it’s a video decoder,” he explained. “The acid test is pushing [something] to its absolute limits to make sure that it can actually stand up and withstand that test condition.” VCAT doesn’t literally stress test devices to the edge of failure. Still, it does collect telemetry under playback conditions that matter, especially for streaming to mid- and low-tier Android devices, where issues like battery life, thermal throttling, and dropped frames can significantly impact user experience.
The need became clear during Meta’s AV1 rollout. “We first rolled out AV1 on iOS, which was really low-hanging fruit,” he said, referring to the fact that Apple’s comparatively expensive phones all have sufficient power and battery life to efficiently play AV1 in software. “Then we started expanding into Android. And if we targeted the most powerful [devices], it was really an easy problem. But Meta serves 3.5 billion users, and most of those devices are not high-end.” Those interested in the testing performed before distributing AV1 to Android phones can check out How Meta brought AV1 to Reels, which details the process.
At the time, the team was relying on overnight manual testing, which was inefficient and time-consuming. “It just came to a point where I said, there has to be a better way.” VCAT was born from that moment, backed by Meta and built in partnership with Ittiam.
Since my original review in Streaming Media, VCAT has been rebuilt to address modern Android security and power constraints. The first version relied on VLC Benchmark for video playback, while the new version has moved entirely to ExoPlayer. “Most companies that have video apps are using ExoPlayer. They’re not using VLC,” he said. So, he felt VCAT would more accurately represent real playback conditions if it used ExoPlayer as well.
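For readers who haven't worked with it, ExoPlayer is the AndroidX Media3 player library that most Android video apps build on. Here's a minimal playback sketch, purely for illustration and not drawn from VCAT's code; the URI is a placeholder test vector.

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

// Minimal ExoPlayer (Media3) playback sketch. Illustrative only, not VCAT's
// actual implementation; the URI is a placeholder test vector.
fun startPlayback(context: Context): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    player.setMediaItem(MediaItem.fromUri("https://example.com/av1_720p.mp4"))
    player.prepare()              // initialize the selected decoder and start buffering
    player.playWhenReady = true   // begin playback as soon as the player is ready
    return player
}
```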
As Ronca retired from Meta, ownership of VCAT transitioned as well. “We had a conversation with Meta, and they just felt like, because of the nature of the app, it’s not really something they wanted to support. So VCAT will be released by RoncaTech. It’ll be on GitHub. It’ll be on the Play Store. It’ll be GPL-2, free for commercial or noncommercial use, unless someone wants to distribute private forks, in which case we’d talk.”
Who is VCAT designed for? Ronca sees interest from platform developers, handset vendors, SoC vendors, and even carriers making bundling decisions. “Perhaps they’ll start getting a scoring system for video,” he said. “Today, you wouldn’t know. You look at two $90 phones, one with modern SoC architecture, one with 10-year-old tech, and they look the same. VCAT helps you tell the difference.”
That brings us to the hands-on demo.

Installing and Launching VCAT
Ronca kicked off the demo with a fresh install of VCAT on a $90 Motorola E14 Android phone. As expected with any performance monitoring app, the first step was enabling permissions for system settings and file access.
Once launched, VCAT opened to a clean home screen (Figure 1): no test vectors loaded yet, device name and technical details displayed at the top, and navigation tabs along the bottom.
Loading Video Files and Playlists
The first step of any test cycle is loading the video files and associated playlists, called Test Vectors, which you do in the Test Vectors tab (Figure 2). Some files and playlists will be provided with the program, or you can use your own.
During the demo, Ronca pulled in a few AV1 and VP9 videos hosted on an S3 bucket, along with the matching playlists that control how those files are played during a test. “You don’t have to do it this way,” he explained. “You can sideload files using the Android Debug Bridge (ADB), or just download them from Google Drive, whatever’s easiest. Once they’re on the phone, you can build your own playlists in seconds.”
The version he demoed included the same AV1 and VP9 clips used for Meta’s internal battery drain testing, encoded at resolutions up to FHD. Once downloaded, both the media files and the playlists are available inside the app, ready to run.
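For reference, the ADB route Ronca mentions is simply a matter of pushing files to device storage. Below is a hedged sketch of a desktop-side Kotlin helper that shells out to adb push; the file names and device path are placeholders, not VCAT conventions, and running the command directly in a terminal works just as well.

```kotlin
import java.io.File

// Illustrative desktop-side helper that pushes local test files to a connected
// Android device with `adb push`. Paths and file names are placeholders,
// not VCAT conventions.
fun sideloadTestVectors(localFiles: List<File>, devicePath: String = "/sdcard/Download/") {
    for (file in localFiles) {
        val process = ProcessBuilder("adb", "push", file.absolutePath, devicePath)
            .inheritIO()   // show adb's progress output in this console
            .start()
        val exitCode = process.waitFor()
        check(exitCode == 0) { "adb push failed for ${file.name} (exit $exitCode)" }
    }
}

fun main() {
    sideloadTestVectors(listOf(File("av1_720p.mp4"), File("vp9_720p.webm")))
}
```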

Managing Playlists
Downloaded playlists appear in the main interface (Figure 1). From there, users can play, edit, or delete an existing playlist or create a new one using files already stored on the device. “If you’ve got media files on your phone,” Ronca said, “you can create a new playlist in just a matter of seconds.”

Configuring Test Conditions
Before running your tests, you’ll want to set your test parameters in the Run Conditions tab. There, you can set screen brightness, decoder thread count, and the Run Options. These include running the playlist once, running until the battery drops to a defined threshold (15% in Ronca’s demo), or running for a fixed duration.

“This is actually my favorite test,” he said, selecting the battery threshold option. “Loop playback until the battery hits 15%. That tells you a lot about efficiency.”
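As a purely hypothetical sketch (VCAT's actual configuration format isn't shown in the demo), the three run options Ronca describes map naturally onto a small Kotlin model like this:

```kotlin
// Hypothetical model of the run options Ronca describes -- not VCAT's real code.
sealed class RunOption {
    object SinglePass : RunOption()                                // play the playlist once
    data class UntilBattery(val stopAtPercent: Int) : RunOption()  // e.g., 15% in the demo
    data class FixedDuration(val minutes: Int) : RunOption()       // run for a set time
}

data class RunConditions(
    val screenBrightnessPercent: Int,   // screen brightness during the test
    val decoderThreads: Int,            // software decoder thread count
    val option: RunOption               // which of the three run modes to use
)

// Example matching the demo: loop playback until the battery hits 15%.
val demoRun = RunConditions(
    screenBrightnessPercent = 50,       // assumed value; not specified in the demo
    decoderThreads = 4,                 // assumed value; not specified in the demo
    option = RunOption.UntilBattery(stopAtPercent = 15)
)
```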
Selecting a Decoder
Ronca then demonstrated the decoder selection interface. For each codec (AVC, HEVC, VP9, AV1, and, soon, VVC), users can choose from a list of available decoders on the device. These might include VCAT’s bundled software decoders (like DAV1D), chipset-specific software decoders (such as Unisoc), or hardware decoders exposed through Android’s codec APIs (Figure 5).
“If you’re benchmarking AV1 software, you might compare the bundled decoder with the chipset decoder and Android’s own DAV1D build,” Ronca explained. “You can run all of them and see which one gives you the best battery and thermal profile.”
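The hardware and software decoders in that list are the ones Android exposes through its MediaCodec APIs. As an illustration of what “available decoders on the device” means in practice, and not a claim about how VCAT queries them, here’s a Kotlin sketch that enumerates a device’s AV1 decoders:

```kotlin
import android.media.MediaCodecList
import android.os.Build

// Enumerate the decoders a device exposes for a given codec via Android's
// MediaCodec APIs. Illustrative only; VCAT may query decoders differently.
fun listDecoders(mimeType: String = "video/av01" /* AV1 MIME type */): List<String> {
    return MediaCodecList(MediaCodecList.ALL_CODECS).codecInfos
        .filter { info ->
            !info.isEncoder && info.supportedTypes.any { it.equals(mimeType, ignoreCase = true) }
        }
        .map { info ->
            // isHardwareAccelerated requires API 29+; older devices just report the name.
            val kind = if (Build.VERSION.SDK_INT >= 29 && info.isHardwareAccelerated) "hardware" else "software"
            "${info.name} ($kind)"
        }
}
```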

Running a Test on VCAT
Once all these options are set, you click Play to start the playlist. During the demo, Ronca ran a simple test using a 360p AV1 file, set to loop until the battery reached 15%. “You can leave it running overnight, unplugged, and it’ll stop when the battery hits your threshold,” he said. “And if you’re benchmarking something like DAV1D or the Unisoc decoder, this is where you see the difference in power draw.”
Though not mentioned during the demo, VCAT also has controls that let you pause and resume testing in mid-stream, which can be particularly valuable during a 12+ hour test. Any pauses and resumes are noted in the results file for tracking purposes.
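VCAT’s internals aren’t public yet, but the battery-threshold behavior Ronca describes maps naturally onto Android’s BatteryManager API. A minimal sketch, assuming the app simply polls the battery level between playback loops:

```kotlin
import android.content.Context
import android.os.BatteryManager

// Sketch of battery-threshold monitoring using Android's BatteryManager.
// Illustrative only; VCAT's actual stopping logic is not public.
fun shouldStopTest(context: Context, stopAtPercent: Int = 15): Boolean {
    val batteryManager = context.getSystemService(Context.BATTERY_SERVICE) as BatteryManager
    // Current battery level as a percentage (0-100).
    val level = batteryManager.getIntProperty(BatteryManager.BATTERY_PROPERTY_CAPACITY)
    return level <= stopAtPercent
}
```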

Figure 6. Click Play to start the test run.
Once playback started, Ronca tapped the screen to display an information overlay showing key telemetry: resolution, decoder, codec, and frame rate (Figure 7).

Viewing Results
After playback completed, Ronca switched to the Reports tab to show the test summary. One of the saved tests had run for nearly 11 hours, with the battery dropping from 100% to 35% while decoding 720p AV1 using the Unisoc decoder. This is a new screen that wasn’t available during my beta review, and it’s a great addition.

Exporting and Analyzing Telemetry
The summary view highlights system temperature, battery temperature, and frame drops, but that’s just the surface: extensive data for each test is recorded in a CSV file stored on the device. “If you want the detailed telemetry,” Ronca said, “you pull the CSV and analyze it offline. There are about 40 columns, which include core frequency, battery draw, memory usage, decoder type, resolution, and frame rate. If you’re benchmarking AV1 across multiple decoders or testing thermal performance on low-end phones, this is how you make sense of the results.”
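The exact column names in that CSV aren’t spelled out in the interview, so the Kotlin sketch below uses assumed headers (battery_temp_c, dropped_frames) purely to illustrate the kind of offline summary Ronca describes; check the actual header row of the file VCAT produces.

```kotlin
import java.io.File

// Offline analysis sketch for a VCAT telemetry CSV. The column names
// ("battery_temp_c", "dropped_frames") are assumptions for illustration;
// inspect the real header row of the exported file.
fun summarizeTelemetry(csv: File) {
    val lines = csv.readLines()
    val header = lines.first().split(",")
    val tempIdx = header.indexOf("battery_temp_c")
    val dropsIdx = header.indexOf("dropped_frames")
    require(tempIdx >= 0 && dropsIdx >= 0) { "Expected columns not found; header was: $header" }

    val rows = lines.drop(1).map { it.split(",") }
    val avgBatteryTemp = rows.mapNotNull { it.getOrNull(tempIdx)?.toDoubleOrNull() }.average()
    val totalDrops = rows.mapNotNull { it.getOrNull(dropsIdx)?.toIntOrNull() }.sum()

    println("Average battery temperature: %.1f C".format(avgBatteryTemp))
    println("Total dropped frames: $totalDrops")
}
```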
Beta Release and Availability
How can you get your hands on VCAT? Ronca explained that the app will be released as a private beta available by email invitation only via the Google Play Store’s testing system. Interested users can sign up through a Google Form that Ronca will post on LinkedIn. Once added to the test group, participants will receive an official Play Store email with a direct install link for their Android device. VCAT will be licensed under GPL-2, free to use in both commercial and non-commercial settings, with redistribution or proprietary builds requiring a separate conversation.
Current Perspective …
Then Ronca circled back to why the tool exists in the first place. VCAT isn’t just about debugging playback on one $90 phone. It’s about building a foundation for understanding video performance across the fragmented and unpredictable landscape of Android devices. “Today, you wouldn’t know,” he said. “Two phones can look identical on paper, but one performs dramatically worse.”
Ronca drew an analogy to VMAF (Video Multimethod Assessment Fusion), a widely adopted video quality metric developed at Netflix and championed by Ronca. He reflected that VMAF brought a new level of precision to video quality assessment, a benchmark that the industry could widely agree upon and rely upon. Ronca envisions VCAT playing a similar role for decoding performance on diverse Android devices, becoming an industry standard that helps align expectations and benchmarks across the board.
Regarding VCAT’s revision, Ronca was careful to acknowledge the original work. “I’m not in any way dissing the work that VideoLabs did,” he said. “The benchmark tool; everything was beautiful. VLC is phenomenal.” He explained that while the previous model had been maintained for nearly six years, changes in the mobile ecosystem made it impossible to carry forward. The new version is a clean break, but built with appreciation for what came before.
… and Future Directions
Ronca outlined several possible future tools under the VCAT banner: VCAT Encoder, which would benchmark encoders on power consumption versus compression efficiency; VCAT GPU, which would evaluate GPU load; and NPU VCAT, which would test Neural Processing Unit (NPU) performance under typical AI workloads. Each of these tools targets a different part of the stack, but they share a similar goal: giving the industry tools to understand what low-end devices can and can’t do before poor performance reaches the user.
In particular, Ronca believes that VCAT Encoder could be valuable to any company that captures and transcodes video on-device, including platforms like Meta, YouTube, TikTok, or even handset vendors themselves. “If you’re generating video and making encoding decisions on a device, this would be very useful information,” he said.
He noted that hardware encoders (for codecs like HEVC) can sometimes underperform software encoders like x264 veryfast or SVT-AV1, depending on the device. VCAT Encoder could make those tradeoffs visible, especially when comparing compression efficiency against power draw. As for cost, Ronca floated the idea that such a tool could be built with external sponsorship, similar to how Meta initially funded VCAT. “We’d love to talk about a possible sponsorship,” he said. “This kind of thing would serve the entire ecosystem.”
Not Easing into Retirement
He also reflected on the unexpected shape of his retirement, which, so far, has involved a lot more coding than relaxing. Rewriting the tool from scratch, parting ways with the legacy VLC-based benchmark framework, and getting to this point took more effort than expected. Still, he seemed energized by it. He thanked his collaborators and expressed hope that developers will not only use the tool but also contribute to it. “That would be the real proof,” he said. “Not just people using it, but making it better.”
Speaking personally from my experiences with VCAT, I can say that anyone benchmarking Android devices for any reason will find it invaluable. Given the importance of video playback to smartphone customer satisfaction, it’s not a huge leap to predict that VCAT will quickly become the performance lingua franca for all those in the video value chain, from chip developers to content publishers, and every company in between.
