Originally Posted by Tjj226 Angel
The simple answer is that it does. Every 3 dB of SNR makes a whole lot of difference, so something with 124 dB SNR is going to be a whole lot better than something with 116 dB SNR.
Maybe in comparison to each other, but in a real-life scenario, where a 16-bit recording has a quantization noise floor of roughly -93 dBFS (with dither), both of these systems would be audibly transparent to a CD-quality source. If you run your DAC without digital volume control and deliver a full-scale signal, you should get the reported dynamic range/noise floor. Those enormous SNR figures only matter if you run a digitally attenuated signal and then amplify it with a pre-amp, which is a silly practice anyway.
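To put rough numbers on this, here's a minimal Python sketch using the textbook formula for an ideal N-bit quantizer (SNR ≈ 6.02·N + 1.76 dB for a full-scale sine); the function names are my own, and the 4.77 dB term is the standard noise penalty of TPDF dither:

def ideal_snr_db(bits):
    # Textbook SNR of an ideal N-bit quantizer driven by a full-scale sine
    return 6.02 * bits + 1.76

def effective_dnr_db(dac_dnr_db, digital_atten_db):
    # Digital attenuation lowers the signal but not the DAC's analog noise
    # floor, so every dB of attenuation costs a dB of dynamic range
    return dac_dnr_db - digital_atten_db

print(ideal_snr_db(16))           # ~98.1 dB undithered
print(ideal_snr_db(16) - 4.77)    # ~93.3 dB with TPDF dither, the -93 dBFS floor above
print(effective_dnr_db(116, 0))   # 116 dB at full scale: well past a CD source's floor
print(effective_dnr_db(124, 40))  # 84 dB after 40 dB of digital attenuation

At full scale, both the 116 dB and the 124 dB unit sit comfortably below the source's own noise floor; only heavy digital attenuation followed by analog re-amplification erodes that margin, which is exactly the silly practice described above.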
So no, the SNR of modern gear doesn't really matter in conventional listening scenarios. Furthermore, manufacturers lie. First, an A-weighted (dBA) noise figure means the measurement was run through a frequency-weighting filter that de-emphasizes the bottom and top of the spectrum, which conveniently flatters the number. What is there to hide? Second, as stated earlier, in lab conditions (EMI/RFI-free environments with lab-quality power supplies) gear can measure a lot better than it does on your desk, and if a circuit only performs well on a laboratory-grade supply, that implies poor PSRR. Third, manufacturers sometimes only state the performance of the chip (the op-amp or DA chip, not the whole circuit), which can be completely misleading with respect to the finished design and its choices.
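To make the dBA point concrete, here's a small sketch of the standard IEC 61672 A-weighting curve (the constants come from the standard; the script itself is just illustrative). Note how hard it attenuates mains-hum frequencies, which is exactly the sort of thing an A-weighted noise spec can sweep under the rug:

import math

def a_weight_db(f):
    # IEC 61672 A-weighting gain at frequency f in Hz, normalized to 0 dB at 1 kHz
    f2 = f * f
    ra = (12194.0**2 * f2 * f2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.00

for f in (50, 60, 100, 1000, 10000):
    print(f"{f:>5} Hz: {a_weight_db(f):+6.1f} dB")
# 50 Hz hum gets knocked down by ~30 dB and 60 Hz by ~26 dB, so a unit
# with audible power-supply hum can still post a pretty A-weighted number.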
Don't go by manufacturer specifications, and don't think these humongous signal-to-noise ratios matter at all. Spend your money elsewhere.