Analytics shouldn’t cost an arm and a leg.

Okay, we all know it’s annoying when a video suddenly stalls to buffer or its quality drops because of a weak or unstable connection, even if that only lasts a few seconds.
For streaming platforms, these moments of dissatisfaction matter. They need to know how often such playback ‘hiccups’ occur across their audience and find effective ways to reduce them, improving what’s known as Quality of Experience (QoE).
Knowing is half the battle, let’s do the other half
The goal is to find the proper clues and tools to truly understand viewer experience, without getting lost in the details. How?
A. Use a platform that gives you built-in analytics
Tempting, but potentially lacking. The big boys won’t share QoE data with you (although they collect it behind the curtain), leaving you with viewership, engagement and the like. The excuse – only partly valid – is that the likes of YouTube do their utmost to ensure unparalleled QoE, as it’s in their best interest.
B. Use a 3rd party tool for analytics
Mux Data is (afaik) the most used by far. It’s comprehensive and has been around for a long while. It’s free to start with but gets ridiculously expensive as you grow.
I was sad when Streamlyzer stopped offering its similar service, as it was (at the time) the only healthy competition.
Outliers include Bitmovin, api.video, NPAW – to name the ones (I heard of) that (still) offer it as a separate service – i.e. let you bring your own platform and player.
It needs saying here, and it’s been a problem for many of my customers: none of these solutions is cheap, and all of them bundle a ridiculous amount of extra features. If you just need the basics (which is the case most of the time), you end up paying for breadth, and the pricing feels inflated.
C. Do it yourself
How hard can it be? 🙂
- Most players already dispatch events (or otherwise provide hooks) when they start to buffer, play, switch qualities, etc.
- Write an add-on that grabs all that info and sends it to an aggregator.
- [Secret sauce] Devise, and refine, a QoE score formula (similar to how Mux does it) out of the most relevant playback behaviors – how fast it starts to play, how often and how long it buffers, the quality rendition it’s been playing at, etc.
- Apply that formula to hordes of players; group by customer, stream, geographic region, time of day, etc.
- Present the data in a nice format – graphs, pies, burgers.
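The first two steps above can be sketched in a few lines of JavaScript. The sketch below assumes a plain HTML5 `<video>` element and its standard media events (`loadstart`, `playing`, `waiting`); a player SDK will have its own equivalents. The `/collect` endpoint is a made-up placeholder.

```javascript
// Minimal playback-metrics collector (a sketch, not a full add-on).
// The clock is injectable so the logic stays testable outside a browser.
class PlaybackMetrics {
  constructor(now = () => Date.now()) {
    this.now = now;
    this.loadStartAt = null;
    this.startupTimeMs = null; // time from load start to first playback
    this.rebufferCount = 0;    // stalls after playback has started
    this.rebufferMs = 0;       // total time spent stalled
    this.bufferingSince = null;
  }
  onLoadStart() { this.loadStartAt = this.now(); }
  onPlaying() {
    const t = this.now();
    if (this.startupTimeMs === null && this.loadStartAt !== null) {
      this.startupTimeMs = t - this.loadStartAt; // first frame rendered
    }
    if (this.bufferingSince !== null) {          // a stall just ended
      this.rebufferMs += t - this.bufferingSince;
      this.bufferingSince = null;
    }
  }
  onWaiting() {
    // Only count stalls after startup; the initial load isn't a rebuffer.
    if (this.startupTimeMs !== null && this.bufferingSince === null) {
      this.rebufferCount += 1;
      this.bufferingSince = this.now();
    }
  }
  snapshot(watchedMs) {
    return {
      startupTimeMs: this.startupTimeMs,
      rebufferCount: this.rebufferCount,
      rebufferShare: watchedMs > 0 ? this.rebufferMs / watchedMs : 0,
    };
  }
}

// Wiring it to a <video> element (or your player's equivalent hooks):
// const m = new PlaybackMetrics();
// video.addEventListener('loadstart', () => m.onLoadStart());
// video.addEventListener('playing',   () => m.onPlaying());
// video.addEventListener('waiting',   () => m.onWaiting());
// setInterval(() =>
//   navigator.sendBeacon('/collect', JSON.stringify(m.snapshot(watchedMs))),
//   10000);
```

`navigator.sendBeacon` is a good fit for the reporting leg: it survives page unloads, which is exactly when the last (and often most interesting) metrics would otherwise get lost.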
Because we’re poor, we’ll be leaning toward option C.
Long solution short
The demo below shows you the metrics in use and the score that comes out of them, both for the video you’re just playing and the aggregate of others.
It’s also free to grab and build upon here; and it includes the lately no‑longer‑secret QoE score formula. While the formula is simple, its coefficients have been fine-tuned to near perfection over many real video sessions watched by real humans 🙂
Not simple?
Ok, I’ll explain. Think of it like starting from a perfect 100 and subtracting penalties:
- Bad buffering share hurts the most.
- Slow start hurts next.
- Frequent buffering hurts less.
There are more factors that contribute to the quality of experience – notably the quality rendition you’re watching at and frame loss – but we’re finding that the three above account for more than 90% of the whole deal. The remaining 10% is harder work and not worth it unless you’re huge.
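As code, the penalty idea looks something like this. The weights below are illustrative placeholders, not the tuned coefficients from the repo – they only mirror the ordering described above (buffering share heaviest, slow start next, buffering frequency lightest):

```javascript
// Illustrative QoE score: start at a perfect 100, subtract weighted penalties.
// Coefficients here are made-up placeholders; see the repo for tuned ones.
function qoeScore({ rebufferShare, startupTimeMs, rebuffersPerMinute }) {
  const sharePenalty = 300 * rebufferShare;                          // hurts the most
  const startupPenalty = 2 * Math.max(0, startupTimeMs / 1000 - 1); // per second over 1s
  const frequencyPenalty = 1.5 * rebuffersPerMinute;                // hurts the least
  const score = 100 - sharePenalty - startupPenalty - frequencyPenalty;
  return Math.max(0, Math.min(100, score)); // clamp to the 0..100 range
}
```

A flawless session (instant start, zero stalls) scores 100; a session that spent 5% of its watch time buffering, took 3 seconds to start, and stalled twice a minute lands somewhere around the high 70s with these placeholder weights.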
Does it scale?
It does, easily up to some 1M simultaneous viewers; some basic sharding and autoscaling may be needed for larger numbers and serious fluctuations.
Is it really free?
🙂 I know, right? I first threw this together for a buddy bleeding cash on Mux, even with their “sweet” private plan. Meanwhile, variations have been running in production on 3 medium-size platforms, almost without a glitch.
Answer: Not completely free, just no more than hosting a simple web app.
Why bother?
Quality of experience is important – I’ll assume nobody argues with that. But beyond that, a platform may need to ask itself:
- How far are we willing to go to improve the experience of viewers in (very) challenging network conditions?
- In the case of user-contributed content, how able and willing are we to pinpoint contributors’ shortcomings (bad network, unfortunate encoding parameters, encoding overload, etc.) and communicate with them efficiently to address these?
If the answer leans toward elevated QoE, a solution based on this is trivial to implement, and the one-dimensional score makes it easy to pinpoint outliers. Btw, a score of 90+ is great and 80+ is good.
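For illustration, here’s how a one-dimensional score turns outlier hunting into a few lines of grouping. Each group’s average is compared against the overall average; the shape of `sessions` and the 10-point margin are assumptions for the sketch:

```javascript
// Flag customers whose average QoE score trails the overall average
// by more than `margin` points. `sessions` is assumed to look like
// { customer: string, score: number } per watched session.
function findOutliers(sessions, margin = 10) {
  const overall =
    sessions.reduce((sum, s) => sum + s.score, 0) / sessions.length;
  const byCustomer = new Map();
  for (const { customer, score } of sessions) {
    const g = byCustomer.get(customer) ?? { sum: 0, n: 0 };
    g.sum += score;
    g.n += 1;
    byCustomer.set(customer, g);
  }
  const outliers = [];
  for (const [customer, g] of byCustomer) {
    const avg = g.sum / g.n;
    if (overall - avg > margin) outliers.push({ customer, avg });
  }
  return outliers;
}
```

The same grouping works for streams, regions, or time-of-day buckets – swap the key, keep the comparison.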