"Why 75dB? I think it's more important to start somewhere where compression becomes a legitimate concern. If the driver is moderately efficient (mid-80s sensitivity), compression isn't a concern until you're at least at that level. So, to me, 75dB testing is moot."

That's great, but we have to show consistency, and that is why you should start measuring at 75dB. That is what everyone else holds to at the REW forum, unless there is a house curve or you are measuring for compression.
I have a very fundamental issue with the talk around compression testing here. What's catching me off guard is people saying to test compression by increasing the output to hit 10dB steps. That is flawed. Compression testing shouldn't focus solely on the frequency response; compression is a measurement of input versus output, not a measurement of the response as you increase the output itself. For example, if you increase the input voltage at a speaker's terminals, you should get the corresponding change in dB output from the DUT. Anything less is the effect of compression.

"While 75dB is the default for REW, it is hardly a standard per se. There has been a lot of emphasis over the years on 75dB being a standard of some sort, but it is really just a reference guideline for level-setting speakers."
Right, so stick with it! If you started at 75dB, then you could go up in 10dB increments and have this repeatable for all of your measurements. Would this not make sense as a scientific/reference standard?
Looking at your graphs of the 5 woofers that you measured, there are no target lines. If you are not measuring at 75dB, then you need to state the target level for the measurement taken. I.e., on the graph for the Emotiva, where would you place the target level? And in the graph for the SVS, is the target level 116dB? If so, you missed it by 20dB!
Yes, you are looking to see how the response changes, but what's the point if a driver is so inefficient at higher output that you're having to feed it twice the input you should need for the same output?
Consider this: what happens if you're increasing SPL by 10dB, but the voltage you actually need, instead of the 12.6 volts you'd expect from your previous 4v input, is 14.6v? All you've done is increase the output; you've failed to acknowledge the extra 2v of drive in your test, which works out to roughly 14 watts into a 4-ohm load, or about 1.2dB of compression! Yikes! So, yes, the curve at the reference frequency increased by 10dB, but it isn't illustrating the fact that you had to make up that shortfall by turning the amp gain up higher and higher.
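If it helps, here's that arithmetic spelled out in a few lines of Python. It's just a sanity-check sketch; the 4-ohm load is my assumption for the wattage figure, so swap in your driver's actual impedance:

import math

v_ref = 4.0            # starting drive level (volts)
target_gain_db = 10.0  # the SPL increase you're shooting for
v_actual = 14.6        # what you actually had to feed the driver (volts)
z_load = 4.0           # assumed nominal impedance in ohms (adjust for your driver)

# Voltage needed for +10dB if the driver were perfectly linear
v_expected = v_ref * 10 ** (target_gain_db / 20)          # ~12.65 V

# How far short the driver fell, i.e. the compression at that drive level
compression_db = 20 * math.log10(v_actual / v_expected)   # ~1.2 dB

# Extra power the amp had to supply to make up the difference (~13-14 W,
# depending on how you round the expected voltage)
extra_watts = (v_actual**2 - v_expected**2) / z_load

print(f"expected drive: {v_expected:.1f} V")
print(f"compression:    {compression_db:.2f} dB")
print(f"extra power:    {extra_watts:.1f} W")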
This is what compression testing should tell you. Again, voltage in vs SPL out vs what should be there. Then you get the FR curve, but most importantly, you get to see how efficiently the driver is able to use the power provided to it.
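To put 'voltage in vs SPL out vs what should be there' in concrete terms, the bookkeeping at a single reference frequency is only a few lines. A minimal sketch, with made-up voltage/SPL pairs purely for illustration:

import math

# (drive voltage, measured SPL at the reference frequency) - hypothetical numbers
steps = [(1.0, 85.0), (2.0, 90.9), (4.0, 96.6), (8.0, 102.0)]

v0, spl0 = steps[0]   # the first, low-level step is the reference
for v, spl in steps:
    expected = spl0 + 20 * math.log10(v / v0)  # what a perfectly linear driver would give
    loss = expected - spl                      # compression at this drive level
    print(f"{v:4.1f} V in -> {spl:5.1f} dB out, expected {expected:5.1f} dB, compression {loss:4.2f} dB")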
Hope that makes sense. Maybe my assumption about how this kind of testing is being performed is wrong, so please correct me if so.
- Erin
Below I've attached a picture of compression testing I did on a SEAS W18NX driver. This is the 20-110Hz band. As long as the lines are stacked on top of each other, you're seeing what is expected: no loss in output relative to the voltage input. Where the lines deviate, that deviation shows how much output (in dB) is lost relative to the initial voltage, at that frequency. As you go higher in frequency on the chart, you can see a loss of about 0.4dB from the 1v input to the 8v input.

This is the same comparison, but from 400Hz to 6kHz. You can see a loss of about 0.8dB at 3800Hz, likely due to inductance effects (yet to be verified).
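For anyone curious how those overlay charts are built: it's just the frequency-resolved version of the same bookkeeping. Take the sweep at each drive level, subtract the gain you applied (20*log10 of the voltage ratio), and whatever gap remains to the low-level sweep is the compression at that frequency. A rough sketch with placeholder arrays (not my actual measurement data; in practice the sweeps come from your exported measurement files):

import numpy as np

# Shared frequency axis and SPL sweeps keyed by drive voltage (placeholder values)
freqs = np.array([20.0, 40.0, 60.0, 80.0, 110.0])
sweeps = {
    1.0: np.array([80.0, 86.0, 89.0, 90.0, 91.0]),
    8.0: np.array([97.7, 103.9, 106.9, 107.9, 108.7]),
}

v_ref = 1.0
ref = sweeps[v_ref]   # low-level sweep is the reference

for v, spl in sweeps.items():
    # Remove the expected level increase so a perfectly linear driver overlays the reference
    normalized = spl - 20 * np.log10(v / v_ref)
    loss = ref - normalized   # positive = dB lost to compression at that frequency
    print(f"{v:.0f} V sweep, loss vs {v_ref:.0f} V reference (dB):", np.round(loss, 2))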
