Here's why headphone measurements are harder than they look: what happens when we mess up
August 1, 2025

We measure a lot of audio products at SoundGuys, and sometimes things don’t go as planned. While we make every effort to get headphone measurements right before publication, the truth of the matter is that we’re still human, and we don’t always get the results we expect. When that happens, here’s what we do to make sure everything is on the up and up.
Step 0: Prevent mistakes before they happen
Before we test a product, we have to make sure the test is fair and that the results actually reflect the product rather than a botched setup. However, it’s a bit of a challenge. When you use a stand-in for a human head to measure audio, there are all sorts of things that can go wrong with a measurement, especially when you use legendarily finicky test fixtures like the Bruel & Kjaer 5128. After all, a hunk of rubberlike material and metal can’t tell you how it feels, if it’s uncomfortable, or if its ears are caught on something. Instead, we have to rely on measurements, protocols, and test signals to tell us what we want to know.
After all, the whole point of the exercise is to measure a device in a way that fairly represents how it might perform on a human. Products aren’t always designed to address every little possibility, so sometimes measurements reveal more about how something can fail than how it performs under perfect conditions.

By running pre-prepared test sequences to help us get a good measurement, we can learn whether there’s a busted seal between the ear pads and the head, whether earphones are inserted correctly, or if we’re wasting our time with a faulty unit (there’s a simplified sketch of one such check after the list below). But if you’re thinking hard about what could go wrong, it’s probably occurred to you that there are ways issues can squeak by instrument-only checks. For example:
- Maybe there’s something wrong with the particular unit, but it doesn’t measure poorly.
- Maybe the fit is good, but the positioning is off.
- Maybe the pads of certain headphones struggle to make a seal because the test head has no body heat, so it takes a lot longer for the viscoelastic or memory foam to relax.
- Maybe wear-detection sensors aren’t fooled by the test head, disabling certain features without any outward sign that it’s happened.
- Maybe a firmware update changes the behavior of the product.
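To make that concrete, here’s a minimal sketch of what one of these automated fit checks could look like. It isn’t our actual test code: the thresholds, function names, and reference data are all placeholders, and real limits depend on the fixture and the product. The idea is simply to compare a quick capture against a known-good reference and flag a likely leak or mismatched insertion before running the full suite.

```python
# Hypothetical pre-measurement sanity check (not SoundGuys' actual test code).
# It compares a quick capture against a known-good reference to flag a bad
# seal or mismatched insertion depth before committing to a full test run.

import numpy as np

BASS_BAND_HZ = (20.0, 200.0)   # where a broken seal usually shows up as rolloff
SEAL_TOLERANCE_DB = 6.0        # placeholder threshold; real limits depend on the fixture
CHANNEL_TOLERANCE_DB = 3.0     # placeholder threshold for left/right matching


def band_level(freqs_hz, magnitudes_db, band_hz):
    """Average level (in dB) within a frequency band."""
    low, high = band_hz
    mask = (freqs_hz >= low) & (freqs_hz <= high)
    return float(np.mean(magnitudes_db[mask]))


def pre_test_warnings(freqs_hz, left_db, right_db, reference_db):
    """Return human-readable warnings; an empty list means the fit looks OK."""
    warnings = []

    # 1. Seal check: a big bass deficit relative to the reference suggests a leak.
    for side, measured_db in (("left", left_db), ("right", right_db)):
        deficit = band_level(freqs_hz, reference_db, BASS_BAND_HZ) - band_level(
            freqs_hz, measured_db, BASS_BAND_HZ
        )
        if deficit > SEAL_TOLERANCE_DB:
            warnings.append(f"Possible broken seal on the {side} side ({deficit:.1f} dB bass deficit)")

    # 2. Channel match: mismatched insertion depth skews left against right.
    mismatch = abs(
        band_level(freqs_hz, left_db, BASS_BAND_HZ) - band_level(freqs_hz, right_db, BASS_BAND_HZ)
    )
    if mismatch > CHANNEL_TOLERANCE_DB:
        warnings.append(f"Left/right mismatch of {mismatch:.1f} dB; reseat and re-run")

    return warnings
```

A check like this only catches what the instruments can see, which is exactly the limitation the list above is getting at.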
While wired headphones are very easy to test because they don’t tend to have very many advanced features, wireless products are often stuffed to the gills with them. Many of these features rely on sensors or microphones to tell them when they’re being worn, and that added wrinkle can derail any test session. Because headphones aren’t sentient, they rely on programmed logic to determine when they’re being worn, both to avoid excess power draw and to prevent unintended performance issues. But this logic often depends on assumptions that aren’t always true (a simplified sketch of what we mean follows the lists below). Common examples include:
- Humans are capacitive, so if the inside of an earbud can complete a circuit, it must be worn by a person.
- When headphones are worn, no light hits the inside of the ear cup, so a light sensor reading darkness definitely means they’re on someone’s head.
- Humans always have skin and blood vessels on their ears, so a sensor that detects these things means a person is currently using the product.
- Humans will only turn a product on once they’re wearing it, so it’s okay to run a calibration tone immediately at startup to calculate filters for the user.
There are two main problems with these approaches:
- They don’t account for when the sensor fails somehow.
- They don’t leave any room for the underlying assumption ever being wrong.
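To illustrate, here’s a deliberately simplified sketch of this kind of assumption-driven logic. No shipping product works exactly like this, and the sensor names and thresholds are made up; the point is that every branch encodes an assumption that a sensor fault, or a rubber-and-metal test head, can quietly violate.

```python
# Deliberately simplified illustration of assumption-driven wear detection.
# The sensors, thresholds, and structure are hypothetical; the point is that
# each branch bakes in an assumption with no fallback when it turns out wrong.

from dataclasses import dataclass


@dataclass
class SensorReadings:
    capacitance_detected: bool   # assumes anything completing the circuit is a human ear
    ambient_light_lux: float     # assumes darkness inside the ear cup means "worn"
    skin_detected: bool          # assumes the optical skin sensor never misreads


def is_being_worn(readings: SensorReadings) -> bool:
    if not readings.capacitance_detected:
        # A rubber-and-metal test head isn't capacitive, so features can be
        # silently disabled with no outward sign that it happened.
        return False
    if readings.ambient_light_lux > 5.0:
        # Bright lab lighting leaking into the ear cup also fails this check.
        return False
    # No handling for a failed or fooled sensor: the assumption is simply trusted.
    return readings.skin_detected
```

Neither of the two problems above is handled here: there’s no path for “the sensor is broken,” and no path for “the assumption doesn’t apply to whatever is wearing me.”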
Despite how many times this old man has yelled at clouds over this, manufacturers still make headphones and earbuds that don’t plan for potential failure. It’s tempting to think that because this funky robot head was built by averaging MRI scans of 40 people to recreate human anatomy, it’s a perfect recreation of what a human might be like. But we have to reckon with the fact that the test head isn’t made of living flesh and bone; it’s merely an analog for a human one, and there are some key differences we have to account for. While nobody is making their products for a robot to listen to, we still have to make these things work with one. Especially with more complicated do-it-all mass-market products, it’s not always easy to tell when something’s gone awry. And that leads us to step 1.
Step 1: Diagnose mistakes when they happen
Usually, measurement errors become evident as soon as we look at the results, because we’re all used to seeing measurements and we’re familiar with what they mean for our music. For example, a rolloff in the bass might mean a busted seal, or wild channel balance issues might mean mismatched insertion depths for earbuds. We can see quite a few things on a chart that scream, “Test it again, dummy!” But when our test results broadly match what we’re listening to in real life, sometimes smaller issues slip through the cracks.
It happens to anyone who measures headphones. Yes, even your favorite competing outlets or YouTubers. All the time. It’s just a fact of life when you generate rainbow headphone squiggles. Every outlet has a strategy for dealing with this, but there are sometimes tradeoffs.

By going through a number of diagnostic steps, we can learn more about how a product might be having difficulty with the test head and address those issues; one such diagnostic is sketched below. Most of the time, this stage is where we catch the few errors that made it past Step 0. But sometimes we need to call in help to fix things. If something looks really weird to us, or we just can’t figure out how to solve something, we move on to step 2.
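As a rough example of the kind of diagnostic we mean, here’s a hypothetical repeatability check: re-seat the same headphones several times and look at how much the curves move. This isn’t our actual tooling, and the tolerance is a placeholder, but the logic mirrors the reasoning above: large seat-to-seat swings point at fit or positioning, while consistent-but-odd results point at the unit itself.

```python
# Hypothetical re-seat repeatability check (placeholder tolerance, not real tooling).
# Large variation between re-seats suggests a fit or positioning problem rather
# than a genuinely misbehaving unit.

import numpy as np

RESEAT_TOLERANCE_DB = 3.0  # placeholder limit on acceptable seat-to-seat spread


def reseat_spread_db(runs_db: np.ndarray) -> np.ndarray:
    """runs_db has shape (n_reseats, n_frequencies), magnitudes in dB.
    Returns the per-frequency spread (max minus min) across re-seats."""
    return runs_db.max(axis=0) - runs_db.min(axis=0)


def positioning_suspect(runs_db: np.ndarray) -> bool:
    """True if any frequency varies more than the tolerance between re-seats."""
    return bool(np.any(reseat_spread_db(runs_db) > RESEAT_TOLERANCE_DB))
```

If the re-seats agree with each other but still look wrong, suspicion shifts from positioning to the unit itself, which is when the next step comes in.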
Step 2: Talk to the manufacturer
If we think something’s wrong with the headphones, or the measurements are more difficult than they should be, we email the manufacturer to see if our results align with what they’re expecting. Most of the time, nothing’s out of the ordinary. But nobody is more familiar with the expected performance of a product than the engineers who made it, so we periodically ask for their help in figuring out if something’s up.
For example, we talked with Sony about the WH-1000XM6 before publication because our measurements showed us something a bit different from what we heard. Once we had diagnosed everything that had gone wrong (and this one was a doozy), the final measurements were much more realistic. Reading other reviewers’ experiences at competing outlets after the fact confirmed that we weren’t the only ones to run into issues here, so it’s worth pointing out that you should be looking at several sources for measurements if that’s your jam.

Should manufacturers think our measurements are off, they might send us a new unit while requesting the original back. This has happened several times before, but on the whole, it’s fairly rare; maybe a few times a year, we run into a product that requires a second unit. Once we’ve purchased or been sent a new copy, we retest and compare. Sometimes this even results in a firmware update to fix issues we’ve revealed; it’s happened a few times now!
Step 3: Check others’ work
Of course, the manufacturer doesn’t always respond, and sometimes our results show a significant departure from what others are posting. It’s important to remember that most of the time we’re testing one copy of the device, not tens of samples per product. Sometimes there are characteristics specific to that particular copy, and no amount of finagling or messing about will bring our measurements completely in line with others’ results. If we see that we’re an outlier in coverage, we retest, see if we can secure more units, and attempt to diagnose what (if anything) went wrong.

Because we cover headphones as news rather than enthusiast-oriented content, and because audio as a category moves very slowly compared to other tech sectors, we often have measurements available before anyone else does. So it can take a little while before we can check our work against others’. This is especially true in cases where a model isn’t released in the US but is available in Canada, like the Bose QuietComfort Ultra Earbuds (2nd Gen). But now that more and more reviewers are using fixtures similar to ours, it’s much easier to see where there might be an issue and diagnose what might be happening if we come across confusing results.
Step 4: Issue updates
Say the worst happens and, even after everything we’ve done, a junk measurement makes it to publication. What then?
This happened just recently with the SteelSeries Arctis Nova 3X Wireless. The first unit exhibited some distortion issues in Multi-Dimensional Audio Quality Scores, but there weren’t obvious signs that something was wrong with the product we had at the lab. It performed about how we expected an inexpensive headset to, signs of a poor coupling were completely absent, there weren’t any sensors to defeat, and we could get repeatable results with the headphones. After a back-and-forth with the manufacturer, they sent another unit. We tested the new unit with the exact same protocol and published the results from the second copy, which were far closer to the expected performance. Sometimes units are just duds. After going through the above process, SteelSeries had this to say:
“Based on the results that SOUNDGUYS achieved in their initial review of a sample unit, we purchased a Nova 3 Wireless from a local retailer and conducted testing at the SteelSeries Sound Lab to confirm that it adhered to our production and sound quality standards, then provided that unit to SOUNDGUYS to run their own independent, unbiased tests… We want to thank SOUNDGUYS for the open dialogue and collaborating to achieve the highest quality results.”
Manufacturers are generally pretty motivated to see how things can go wrong because it’s possible that a manufacturing process is out of spec or that quality control processes missed something. They also want us to have accurate measurements because if we don’t, then the people who buy these products might get something completely different from what they expect. Consequently, representatives from these companies often pay very close attention to what we publish, and we’ll definitely hear about it if something isn’t quite right.
Given that we built SoundGuys in the tradition of journalism, we make every effort to prevent errors from reaching publication, but we also believe it’s essential to have a way to fix them should the worst happen. It doesn’t help anyone if we double down when confronted with evidence that we screwed up and refuse to correct it. Instead, we approach our past coverage as a living document: if new information comes to light, it must be updated!
We just want you to be able to get headphones you like, and the first step in that process is getting you reliable information to use.