WAV vs. FLAC vs. AIFF


Hi, has anyone experienced any sound quality difference between the three formats? Unfortunately I have been using only WAV among the lossless formats, so I have no experience with the other two. If you have experience with all three, which one do you prefer and why? Thanks and happy listening
highend64
""Malfunction"? Are you saying that the 16/24 bit data for each sample is incorrect? That the data that comes out of a flac decode is different that the wav data?"

That is exactly the point. There are many reasons why the data delivered from each format might end up different, even when both originate from the same CD; things can and do go wrong in computers and their software at any stage. But the one thing that does not differ is the ability of each format to store the exact same digital representation, bit for bit.

That is why the format itself is not the issue. The issues, if any, arise in everything that happens during the rip and during playback/streaming of the digital data, at each stage of processing before it reaches the DAC and is converted to analog.

So the bottom line is that each format may lead to different decisions about how to minimize the risk of all the gadgets involved in ripping and playback doing more harm than good. I agree that network audio streaming is one of the simplest, least expensive and most practical ways to help accomplish this.

Most general purpose computers have no business being connected directly to your high end audio gear! Think of this as a form of isolation, similar to other steps you might take to isolate your rig from potential sources of noise.

Also think of network players as a specialized type of computer that is designed to stream audio effectively to your rig. Although this is still an emerging audio solution, it is one that lends itself well to solving the problems using technology that is readily available and affordable TODAY.
The idea that the FLAC decoder produces wrong numbers is just not credible. People have repeatedly shown that the compression/decompression algorithms work, and computers very, very seldom make computational mistakes. If a spreadsheet produced different results every time you opened it, nobody would use spreadsheets. If there is one thing a computer can do, it is perform computations correctly. If people think the computer is regularly doing the FLAC computations incorrectly, and in a random manner, then I would love to see some actual proof of that. I just do not think it happens.
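
For anyone who wants to check this on their own rips, here is a minimal sketch, assuming Python with the soundfile and numpy packages installed and using placeholder file names; it decodes a WAV rip and a FLAC rip of the same track to raw PCM and reports whether every sample matches bit for bit.

import numpy as np
import soundfile as sf

def decoded_pcm_matches(wav_path: str, flac_path: str) -> bool:
    # Decode both files to raw integer PCM and compare every single sample.
    wav_data, wav_rate = sf.read(wav_path, dtype="int32")
    flac_data, flac_rate = sf.read(flac_path, dtype="int32")
    if wav_rate != flac_rate or wav_data.shape != flac_data.shape:
        return False
    return np.array_equal(wav_data, flac_data)

if __name__ == "__main__":
    # Placeholder file names - substitute a WAV and a FLAC made from the same rip.
    print(decoded_pcm_matches("track01.wav", "track01.flac"))

If both files came from the same error-free rip, this should print True; a False result points at the rip or the files, not at the FLAC format itself.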

So, the other issues for audio seem to be electrical noise and timing. Electrical noise, for example from ground loops, can potentially be an issue. That is why people are building galvanic isolation into higher end devices - to break the electrical connection between the PC and the DAC. Of course, electrical noise is also present in network players; it just is not tied to the PC.

That leaves timing. Digital audio depends on precise timing of each sample. Before asynchronous USB, the timing was problematic and jitter was a real issue. That is why I keep coming back to async USB. If it works as advertised, the jitter should be very low and independent of the source format. If someone can explain why the source format processing influences the final timing in an async USB device, then I am all ears. I admit to not knowing the exact inner workings of the async code (very few people do). But if it works as advertised, then FLAC decoding should not be an issue with respect to timing.
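
To make the "works as advertised" part concrete, here is a toy model in Python (not the real USB Audio Class feedback mechanism, and all the numbers are made up) of why, in an asynchronous design, the host's delivery timing should not matter: the DAC's own clock paces every output sample, and the only way the source side can disturb playback is by letting the buffer run dry.

import random

SAMPLE_RATE = 44_100   # samples per second, paced by the DAC's local clock
PACKET_SIZE = 441      # samples per host packet (nominally one every 10 ms)

def simulate(seconds: float = 1.0) -> int:
    buffer = 2 * PACKET_SIZE   # the DAC buffers a little audio before starting
    underruns = 0
    host_due = 0.0             # when the next host packet arrives
    for n in range(int(seconds * SAMPLE_RATE)):
        now = n / SAMPLE_RATE  # output instants fixed by the DAC clock alone
        # Host delivery wanders up to +/-20% around its nominal schedule.
        while host_due <= now:
            buffer += PACKET_SIZE
            host_due += (PACKET_SIZE / SAMPLE_RATE) * random.uniform(0.8, 1.2)
        if buffer > 0:
            buffer -= 1        # one sample clocked out, exactly on time
        else:
            underruns += 1     # an empty buffer is the only way timing breaks
    return underruns

if __name__ == "__main__":
    print("underruns:", simulate())

In this model the host packets arrive with considerable timing error, yet every sample is clocked out at exactly n / SAMPLE_RATE; which format was decoded upstream never enters into it.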

I agree that networked solutions can provide better isolation than direct connections. Remember, I am not talking about audio streams in general, but about the difference between FLAC and WAV files. I am not willing to say that computers routinely make computational errors when compressing and decompressing FLAC files and that WAV files are therefore better. If people think they hear a difference, that is up to them. But I have yet to hear a detailed explanation of why that happens that makes sense.

Time to get ready for Thanksgiving.
I just did an A/B comparison with FLACs and WAVs of the same tracks in the same playlist, allowing me to switch. I heard no discernible difference.
When I first started integrating computer audio with my audio system, I used an analog stereo-to-RCA Y interconnect (AudioQuest) from my old laptop's headphone jack to an aux input on my preamp at the time. I recorded several CDs of music with RealPlayer. I have since ripped these back to my current music server setup and play them via Squeezebox Touch along with the rest. The sound quality after all that is still quite good, but there are some noticeable deficiencies, mostly in dynamics, compared to other very good recordings. Still "hi-fi" I would say, and quite listenable (nothing offensive, mostly just a bit of omission). It's much better than most home cassette recordings I have heard over the years, but not current SOTA. So in many cases with computer audio I think the glass is still significantly more than half full, even in less than ideal circumstances compared to past options, unless something is flat out not working properly as designed.
Excellent comments by all, IMO, on an issue that by its nature is highly speculative.
11-23-11: Dtc
I agree that networked solutions can provide better isolation than direct connections. Remember, I am not talking about audio streams in general, but about the difference between FLAC and WAV files. I am not willing to say that computers routinely make computational errors when compressing and decompressing FLAC files and that WAV files are therefore better. If people think they hear a difference, that is up to them. But I have yet to hear a detailed explanation of why that happens that makes sense.
What about my hypothesis: that differences in the processing performed when playing the different formats result in differences in when and how often "Speed Step" and related power conservation features come into play (unless the user takes the steps necessary to disable them), in turn resulting in significant differences in computer-generated noise transients, in turn resulting in differences in jitter and/or noise coupling?
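
One way to put a rough number on the first link in that chain, the "differences in the processing" part, is to measure how much CPU time each format costs during block-by-block playback. This is only a sketch, assuming Python with the soundfile package and placeholder file names; it measures decode effort only, and whether that extra, bursty load actually translates into noise transients or jitter is exactly the open question.

import time
import soundfile as sf

def decode_cpu_seconds(path: str, blocksize: int = 4096) -> float:
    # Read the file in playback-sized blocks and total the CPU time spent.
    start = time.process_time()
    for _block in sf.blocks(path, blocksize=blocksize, dtype="int32"):
        pass               # decoding happens as each block is delivered
    return time.process_time() - start

if __name__ == "__main__":
    # Placeholder file names - use a WAV and a FLAC of the same track.
    for name in ("track01.wav", "track01.flac"):
        print(name, round(decode_cpu_seconds(name), 3), "CPU seconds")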

Even if an asynchronous USB DAC is being used, conceivably high frequency noise transients riding on the USB signal pair and/or the associated power and ground lines could couple past the DAC's input circuits to internal circuit points, where they could affect timing of the DAC chip itself, and/or subsequent analog circuit points. Galvanic isolation would help in that regard, as you noted, but it is not always employed, and who knows how effective it is in any given situation?

And then there is the possibility, perhaps more remote but conceivably still real, of differences in RFI resulting from those format-sensitive noise transients, the RFI perhaps bypassing all of the digital circuits involved and coupling onto sensitive analog points elsewhere in the system.
11-22-11: Mapman
But the format itself does not correlate to sound quality in general though. Lots of other crap can go wrong and chances are it does so differently because of different hardware and software processing scenarios for different formats. The devil is all in the details. But not in the source format itself. If processed properly, the results are the same. That can be a big if though.
11-23-11: Mapman
Most general purpose computers have no business being connected directly to your high end audio gear! Think of this [network playback] as a form of isolation, similar to other steps you might take to isolate your rig from potential sources of noise.
Well said! Agreed 100%.

Best regards,
-- Al