"This one has always perplexed me a little... different displays presumably have a choice of what LEDs they use; as such, how does a single profile matrix make a colorimeter accurate enough across the range of possibilities?"

It gets somewhat complicated, but the issue comes down to tolerances. A manufacturer can calibrate the meter to a particular combination of LEDs, but the accuracy will diminish as the measured display moves away from that spec. The upside is that LEDs are actually, in many respects, a more homogeneous lighting technology than what has been used in the past. They still have unit-to-unit variation and variation between LED manufacturers (and their specs), but you don't get as much batch-to-batch variation as you did with phosphor-based technologies (CRT, plasma, and to an extent CCFL). So you essentially get a calibration that is more accurate for the technology than using a profile not dedicated to it, but you suffer from wider variance in what manufacturers might put into the display, if that makes sense. And since there are very few manufacturers out there making the necessary blue and "white" LEDs, that tends to constrain the space of what an OEM might actually put into the TV.
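To make that concrete: the stored profile is essentially a small linear correction applied to the meter's raw tristimulus readings. A minimal sketch in Python, assuming a hypothetical 3x3 matrix for a white-LED backlight (the numbers are illustrative, not from any real meter):

```python
import numpy as np

# Hypothetical factory correction matrix for one backlight family
# (e.g., white LED). Values are illustrative only.
CORRECTION_WLED = np.array([
    [ 1.021, -0.013,  0.004],
    [-0.008,  1.015, -0.002],
    [ 0.003, -0.011,  1.034],
])

def corrected_xyz(raw_xyz, matrix):
    """Map the colorimeter's raw XYZ reading to corrected XYZ.

    The matrix was derived against a reference instrument on a display
    whose LED spectrum resembles the one being measured; error grows as
    the actual spectrum drifts away from that reference.
    """
    return matrix @ raw_xyz

reading = np.array([95.2, 100.0, 108.9])  # raw XYZ from the meter
print(corrected_xyz(reading, CORRECTION_WLED))
```

A single matrix can work because the handful of LED spectra actually in circulation are close enough to each other that one correction stays within tolerance; the further a display's spectrum drifts from the reference, the less valid the matrix becomes.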
Net-net: figure out what the reference display is, and then know that as you drift beyond that manufacturing "group" (e.g., Samsung/Sony/JVC), you might see increased error, but not necessarily enough to invalidate the calibration (LEDs are tough, but doable). To be really sure about things, that's where the $$$ meters come into play, or contracting for a calibration of the meter against your specific display.
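For the curious, that kind of per-display calibration typically amounts to measuring the same patches (commonly red, green, blue, and white) with both the colorimeter and a reference spectroradiometer, then solving for the correction matrix in a least-squares sense. A rough sketch, with made-up readings:

```python
import numpy as np

# Paired measurements of the same R, G, B, W patches:
# rows = patches, columns = X, Y, Z. All numbers are made up.
colorimeter = np.array([
    [41.2,  21.3,   1.9],   # red patch
    [35.7,  71.5,  11.8],   # green patch
    [18.1,   7.2,  95.0],   # blue patch
    [95.0,  99.9, 108.6],   # white patch
])
reference = np.array([      # spectroradiometer readings, same patches
    [41.9,  21.1,   1.8],
    [35.2,  72.0,  11.5],
    [17.8,   7.0,  96.2],
    [94.9, 100.0, 108.9],
])

# Solve colorimeter @ X ≈ reference in the least-squares sense;
# the matrix applied to a raw column vector is then X.T.
X, *_ = np.linalg.lstsq(colorimeter, reference, rcond=None)
correction = X.T
print(correction)
```

Corrected readings are then `correction @ raw_xyz`, exactly as in the earlier sketch; the factory profiles just bake in a matrix derived this way against their chosen reference display.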