WAV vs. FLAC vs. AIFF


Hi, has anyone experienced any sound quality difference between the three formats? Unfortunately I have been using only the WAV lossless format, and have no experience with the other two. If you have experience with all three, which one do you prefer, and why? Thanks and happy listening
highend64

Showing 4 responses by almarg

Although in principle any half-way decent modern computer should have no trouble decompressing and processing any audio format, my suspicion is that a major reason for reported sonic differences between lossless formats is that in modern computers the clock rate, voltage, and power draw of the cpu chip are usually varied dynamically as a function of the processing requirements at any given instant. That is done to minimize power consumption and heat generation, and in the case of laptops to prolong battery run-time.

That switching involves changes in current that are both large and abrupt, which can be expected to cause significant noise transients to propagate through a lot of the circuitry on the computer's motherboard. That in turn can be expected to contribute to jitter, or even outright mis-clocking and breakups, on the signals that are used to output the audio data.

See my post here for a description of how to disable that switching in Windows 7. That change did in fact resolve the audio breakup problem the OP in that thread was having. Some computer BIOSes also allow "EIST" (aka "SpeedStep") to be disabled, which may accomplish the same thing. I'm not particularly familiar with Macs, but I believe that third-party software might be needed to do this on them.
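For anyone who would rather script that change than click through the control panel, here is a minimal sketch, assuming a Windows 7 or later machine, Python, and the stock powercfg utility with its documented setting aliases. It must be run from an elevated (administrator) prompt:

    # Sketch: pin Windows' minimum processor state to 100%, which in effect
    # disables SpeedStep-style down-clocking. Equivalent to setting
    # "Minimum processor state" to 100% under Power Options.
    import subprocess

    def set_min_processor_state(percent=100):
        # PROCTHROTTLEMIN is powercfg's alias for "minimum processor state";
        # SCHEME_CURRENT targets the active power plan. Set the value for
        # both AC and battery operation.
        for flag in ("/setacvalueindex", "/setdcvalueindex"):
            subprocess.run(
                ["powercfg", flag, "SCHEME_CURRENT",
                 "SUB_PROCESSOR", "PROCTHROTTLEMIN", str(percent)],
                check=True)
        # Re-apply the scheme so the new value takes effect immediately.
        subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

    set_min_processor_state(100)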

The sonic significance of all of this will obviously be dependent on the particular computer that is being used, on what kind of output is being used (USB, S/PDIF, Ethernet, etc.), and on the jitter sensitivity of the component to which the signal is being sent. Ethernet and wireless presumably have little if any sensitivity to these issues, because of the packetized and buffered nature of the data transmission.
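As a toy illustration of why buffering makes a packetized transport insensitive to arrival timing, consider the sketch below; every number in it is invented purely for illustration. Packets arrive with large, irregular timing errors, but the output side drains the buffer at a steady, locally clocked rate, so as long as the buffer never empties, none of the arrival jitter reaches the output:

    # Toy model of a buffered, packetized transport. All numbers are
    # illustrative assumptions, not measurements.
    import random
    from collections import deque

    random.seed(0)
    buf = deque([10] * 100)        # prime with 1 s of audio (100 x 10 ms packets)
    next_arrival = 0.0
    underruns = 0
    for t in range(0, 10000, 10):  # simulate 10 s of playback in 10 ms steps
        while next_arrival <= t:   # network side: irregular arrivals
            buf.append(10)
            next_arrival += 10 + random.uniform(-8, 8)
        if buf:
            buf.popleft()          # output side: steady, locally clocked drain
        else:
            underruns += 1         # only an empty buffer would be audible
    print("buffer underruns in 10 s:", underruns)  # 0 with a reasonably deep buffer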

None of this necessarily correlates with the resolution or quality of the audio system.

It would be interesting to know if those who report sonic differences between these formats perceive the same differences when these power conservation features are disabled, and the cpu is running at the same speed and voltage all the time.

Regards,
-- Al
Excellent comments by all, IMO, on an issue that by its nature is highly speculative.
11-23-11: Dtc
I agree that networked solutions can provide better isolation than direct connections. Remember, I am not talking about audio streams in general, but about the difference between FLAC and WAV files. I am not willing to say that computers routinely make computational errors when compressing and decompressing FLAC files and that therefore WAV files are better. If people think they hear a difference, that is up to them. But I have yet to hear a detailed explanation of why that happens that makes sense.
What about my hypothesis: that differences in the processing performed when playing the different formats result in differences in when and how often "SpeedStep" and related power conservation features are invoked (unless the user goes through the steps necessary to disable those features); that this in turn results in significant differences in computer-generated noise transients; and that those transients in turn result in differences in jitter and/or noise coupling?
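Incidentally, the premise that the two formats load the cpu differently is easy to check. Here is a rough sketch, assuming Python with the third-party soundfile package (which reads both WAV and FLAC via libsndfile); "track.wav" and "track.flac" are hypothetical names standing in for the same recording in both formats:

    # Rough comparison of cpu time spent reading the same recording
    # stored as WAV vs. FLAC.
    import time
    import soundfile as sf

    def cpu_seconds_to_read(path):
        start = time.process_time()  # cpu time, not wall-clock time
        data, rate = sf.read(path)   # WAV is a straight read; FLAC must be decoded
        return time.process_time() - start

    for path in ("track.wav", "track.flac"):
        print(path, round(cpu_seconds_to_read(path), 3), "cpu-seconds")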

Even if an asynchronous USB DAC is being used, high frequency noise transients riding on the USB signal pair and/or the associated power and ground lines could conceivably couple past the DAC's input circuits to internal circuit points, where they could affect the timing of the DAC chip itself and/or subsequent analog circuit points. Galvanic isolation would help in that regard, as you noted, but it is not always employed, and who knows how effective it is in any given situation?

And then there is the possibility, perhaps more remote but still conceivable, of differences in RFI resulting from those format-sensitive noise transients, with the RFI perhaps bypassing all of the digital circuits involved and coupling onto sensitive analog points elsewhere in the system.
11-22-11: Mapman
But the format itself does not correlate to sound quality in general though. Lots of other crap can go wrong and chances are it does so differently because of different hardware and software processing scenarios for different formats. The devil is all in the details. But not in the source format itself. If processed properly, the results are the same. That can be a big if though.
11-23-11: Mapman
Most general purpose computers have no business being connected directly to your high end audio gear! Think of this [network playback] as a form of isolation, similar to other steps you might take to isolate your rig from potential sources of noise.
Well said! Agreed 100%.

Best regards,
-- Al
DTC, thanks for the good response, with which I am in essential agreement.

I would just like to make sure it is clear to everyone that under my hypothesis, cpu utilization that is low but non-zero may actually be WORSE with respect to noise generation than, for instance, 100% utilization would be. The noise transients I am envisioning are associated with the abrupt SWITCHING of cpu clock rate, and in some cases voltage as well, that will occur as processing tasks intermittently start and stop, unless the user disables it.

That switching involves LARGE changes in cpu current draw, which happen quickly, although I don't know exactly how quickly. Current changes that are both large and fast = large noise transients.
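To put very rough numbers on that, here is a back-of-the-envelope calculation; all three figures below are assumptions chosen only to illustrate the scale of the effect:

    # v = L * di/dt across a stray inductance in the cpu's power
    # delivery path. The values are illustrative assumptions.
    L_stray = 1e-9   # 1 nH of stray inductance in the supply path
    delta_i = 20.0   # a 20 A step in cpu current draw
    delta_t = 1e-6   # occurring over 1 microsecond

    v = L_stray * delta_i / delta_t
    print(f"induced transient: {v * 1000:.0f} mV")  # -> 20 mV

Twenty millivolts of transient riding on a supply or ground plane is trivial for digital logic, but it is very large compared to the low-level analog signals elsewhere in an audio system, which is why the coupling paths matter.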

For those who may be interested, utilities such as the Windows-based program CPU-Z allow those changes in clock rate and voltage to be observed as they happen. It should be kept in mind that cpu current draw is highly dependent on clock rate.
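For those who prefer logging to watching a GUI, a few lines of Python will capture the same behavior; this sketch assumes the third-party psutil package:

    # Log cpu clock rate and utilization once per second, so that
    # SpeedStep-style transitions can be watched while music plays.
    import psutil

    for _ in range(30):                          # watch for ~30 seconds
        load = psutil.cpu_percent(interval=1.0)  # blocks ~1 s while sampling
        freq = psutil.cpu_freq()                 # current/min/max, in MHz
        print(f"{freq.current:7.0f} MHz   {load:5.1f}% load")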

Happy Thanksgiving to you and yours!

Best regards,
-- Al
Steve,

I don't necessarily disagree with any of your comments, and I don't assert that the hypothesis I offered is anything more than speculative, but the "huh" in your last response tells me that my hypothesis may not have come across clearly.

The OP in the thread I linked to early in this thread was using a newly purchased Windows 7 laptop, and experiencing severe distortion, and also intermittent skipping, when outputting audio via USB into a DAC. The same setup had worked fine previously, with a different laptop running XP.

The problem was fixed when at my suggestion he changed the power management settings within the Windows 7 control panel such that the MINIMUM (as well as maximum) "processor state" was set to 100%, instead of the default 5%.

That change in effect disables SpeedStep, causing the cpu to run at its maximum speed all the time. As I say, it fixed the OP's problem with distorted USB audio. Therefore it seems to me at least a semi-plausible hypothesis that SpeedStep could, with some computers, in some setups, and with some DACs, cause noise and/or jitter issues that would be sensitive to processing requirements, and therefore conceivably to data format, particularly if those processing requirements load the cpu lightly and therefore intermittently.

Again, that is just a speculative hypothesis, but in the absence of evidence to the contrary one which seems to me to have at least some degree of plausibility.

Best regards,
-- Al