No, nothing wrong with them... and believe me, I wouldn't have tolerated a sonic compromise (and I was absolutely expecting one). Further, with the added benefit of using the USB connection (if you are using an iMac, don't use the headphone/optical route), the sound quality is outstanding. There are reportedly technical reasons for the USB superiority, to the point that some feel an iMac with a decent USB-capable DAC could eclipse the performance of all but the most expensive stand-alones.
In my opinion, totally hi-fi approved...I will never go back to dedicated CD players...never!
Check out Wavelength Audio and Ultra Fi Audio's websites. Both of these guys are super high-end audio manufacturers of SET amps and USB DACs (interestingly they both live here in Cincinnati).
Both of them use simple Apple Lossless ripped to iTunes on a Mac. Both of them could use whatever they want, but they choose the simple Apple Lossless and a Mac connected via USB to their DACs. This is what they use at shows where they are putting their best possible product forward in order to sell their product lines.
I think that is a pretty good endorsement of Apple Lossless via iTunes. BTW, I think they both overwhelmingly prefer Mac over PC for ease of use and sound quality. However I think they also both say the PC sounds good if you use Windows Vista.
Thanks for the replies. I have no trouble believing that hard-drive-based audio can be as good as or better than spinning a disc in real time; I just wanted to make sure that the iTunes importer doesn't have any problems with error correction or anything. I don't want to have to rip twice. Sounds like it's a go!
I like the ritual of playing CDs so I'll continue doing that, but my plan is to rip my CDs (and my vinyl at some point) in the meantime, mainly for my iPod (used in the car and traveling), and to be able to try hard-drive-based hi-fi at home when the time is right (either through a Wavelength DAC, modded SB/Transporter, or Empirical Audio solution).
FYI - I'm actually ripping on a PC (to redundant terabyte drives), but I use iTunes because of its iPod/iPhone integration and clean UI. I use a MacBook laptop also, but that's not where I do my ripping.
As I understand Larry Moore on this subject, he prefers the MacMini over the MacBook, etc., and as best I remember, he says the MacMini has better audio quality than the other Mac models.
Please understand that I am stating this to the best of my recollection... btw, I followed his advice in this regard, and I am very pleased with the result.
He also recommended ripping to iTunes using Apple Lossless, and I am really wondering about my stand-alone CDP... the DAC/MacMini playback is so excellent, easy to use, and offers lots of other multimedia opportunities.
I would rip to a format that's not controlled by one company. FLAC, for example.
Thanks for the comment, but I chose Apple for smooth integration with my iPod, iPhone and iTunes. Currently this is all I am using to play back these files.
If it comes down to it, they can be converted back to wav, and then to another format.
Gordon Rankin has found that the CPU speed affects the playback of Apple Lossless even though the CPU shows minimal use during play. My iMac 2.8GHz Core 2 Duo sounds the same with AIFF and Apple Lossless. My MacBook Pro 2.0GHz Core 2 Duo sounds better with AIFF. You can use the convert feature in iTunes to convert files to AIFF, and it will leave your Apple Lossless files intact. Try it and see how it sounds on your MacMini.
My experience has been different from Ml8764ag's. I prefer the sound of the Toslink out of the Mac Mini rather than USB (use a good Toslink cable). If you are using one of the leading edge USB DACs (Wavelength, Empirical, UltraFi), none of which I have tried, it may be a different story.
Steve, thanks for the thoughts and ideas... I'm just a little confused about what you said (confused, not trying to be contrary!) You wrote:
>>Gordon Rankin has found that the CPU speed affects the playback of Apple Lossless even though the CPU shows minimal use during play. My iMac 2.8GHz Core 2 Duo sounds the same with AIFF and Apple Lossless. My Mac Book Pro 2.01 GHz Core 2 Duo sounds better with AIFF.<<
As I read it (just checking), are you saying that the slower processor sounds better with AIFF? Just for the heck of it, I'll give it a try... converting an Apple Lossless rip to AIFF in iTunes on my MacMini... right?
I'll let you know what happens and my thoughts... thanks for your thoughts on this,
btw, Steve, my Minis are 2.0GHz Core 2 Duos.
Did you mean to suggest converting existing Apple Lossless files to AIFF, or re-ripping into AIFF (which is what I am doing right now with a selected CD)?
Steve... shocking and true.... quite a difference. More air, quicker percussion, and more pinpoint sourcing, wow...
So now... the annoying question... do I have to rerip all of my CDs or is there a magic mouse click hidden deep in the bowels of the iTunes interface?
:0 ! listening
The slower the CPU, the worse Apple Lossless will sound. Faster CPU, no difference between AIFF and Apple Lossless.
You don't have to rip your CDs again. iTunes can convert your Apple Lossless to AIFF. You can then delete your Apple Lossless files if you don't want them. If you are using iTunes 8.0 do the following:
Select Preferences; General; Import Settings; change to AIFF. Select the songs you want to convert. Click the Advanced menu and you will see "Create AIFF Version."
iTunes cannot convert 24-bit files; it changes them to 16-bit. This should be no problem with CDs.
Steve... thanks... I am sure other readers appreciate your suggestions as well.
I noticed that the Apple Lossless rip sounds more "tube-like" on my ss system (and slightly veiled when compared directly to the corresponding AIFF rip).
On my 7591 tube system I noticed a dramatic improvement in clarity, presence, "musicality," as well as speed (go figure!) using AIFF when compared directly to the corresponding Apple Lossless rip.
So now I am wondering if Dave Matthews is right... "too many choices...". Anyway, thanks... thanks very much!
I've had a number of customers report that AL files don't sound as good as .wav or AIFF. Just anecdotal. Mostly when transmitting over WiFi with an AE or AppleTV.
It's the choices that make this fun!
Has anyone else tested out the differences between AIFF and Apple Lossless on Mini-based systems?
If you have, I would be very interested in reading any subjective reports about the differences perceived.
I've just started ripping Apple Lossless to my new iPod. When I play them back (via Apple dock through my hifi) I must say the resulting quality is very impressive, and I'd be happy to listen to music most of the time this way.
That said, when I did a direct a/b comparison against the original CD (Cambridge 640C player), both through a NAD amp, the CD still had the edge in terms of dynamics and detail.
My conclusion is that lossless is fine for all but serious/critical listening.
You may have reached the right conclusion (or not) but for the wrong reasons. There are a lot of variables in play in your test, not just the file format of your rip.
The slower the CPU, the worse Apple Lossless will sound. Faster CPU, no difference between AIFF and Apple Lossless.
How fast is fast? I.e., at what speed do you notice no difference?
Hi Syncrasy... don't know. I'm using MacMinis, each with a 2.0GHz Core 2 Duo and 1GB 667 MHz DDR2 SDRAM on board. No problems. I think (therefore I think I hear) a difference with AIFF files vs. Apple Lossless... but the powers of suggestion overwhelm...
(if you know what I mean)
Hi Ed. Actually, I was trying to get Splaskin's attention, since he's the one who made the claim about speed. (I've got a 1.67GHz PowerBook G4.) I'm not sure I would notice a difference anyway. The PowerBook currently is connected to my system via the 1/8" headphone jack, and I can hear only a slight difference between Apple Lossless and the original CD --and that's when I know which source I'm auditioning. If I were to do a blind test, I doubt I could tell which one was playing. (I guess I'm not quite audiophile material, yet.)
To follow up... I did my own comparison of Apple Lossless vs. AIFF (using my PowerBook G4 1.67GHz, connected via analog 1/8" jack to an Arcam FMJ w/ Dynaudio speakers). I did a blind A/B test and scored 50% (half my guesses were right, half were wrong). In other words, I could not tell the difference.
Hi Syncrasy... how I understand splaskin is within the (his?) context of USB output from computer to DAC to audio preamp etc. That is my context as well.
Because of the DAC being in the computer with the minijack connection in your setup I really can't make an informed evaluation.
It's hard to say just how fast the computer needs to be to hear no difference in Apple Lossless VS AIFF. My 2.8 GHz iMac sounds the same with a Wavelength USB Crimson DAC. My MacBook Pro 2.1 GHz sounds inferior on Apple Lossless.
I now run all my music using AIFF. Hard drives are so cheap, why bother with Apple Lossless?
In my experience, Apple Lossless is inferior to .wav, although it is close. Hard drive storage is so cheap there is really no reason to be using Apple Lossless. You can get a 750GB drive for under $150 these days.
One good reason not to use WAV is that WAV files can't store tags, unlike every other computer-based file format. If you insist on storing files uncompressed, I'd use AIFF, as it stores tags.
There is no reason why Apple Lossless should sound any different than WAV, unless the decoding device does not handle Apple Lossless directly and the PC is too slow to handle the decoding in real time. An Apple Lossless file, decoded, is bit-perfect with the original CD from which it was ripped (just like a WAV file).
I'm ripping to lossless from CDs using error correction during the rip. In my home system I play back through an iPod Classic 160GB / Wadia 170i / Playback Designs MPS-5 coax digital input. When I compare CD playback to iPod playback through the same upsampling DAC, they're the same.
John, you are absolutely right about the tags. That is sort of a big deal, and I don't want to downplay that issue.
However, I personally believe that .wav is the only way to go for ultra high-end systems simply because it removes all doubt that the format is compromising the sound in some unknown manner, however slight. The fidelity of the source material is important enough to me that I am unwilling to accept even the slightest chance that something is being lost for some reason we don't understand. It is simply one variable removed from an equation that already has too many variables.
The problem is one of confidence. With .wav and error correction you can be reasonably certain that you have taken your archive of redbook CD as far as it can go. So when the time comes to audition that next upgrade or tweak, you can be confident your source is as close to reference as possible.
That said, am I confident I could distinguish .wav files from Apple files in a blind test? No, absolutely not. Apple Lossless sounds very good to me. But again, do we even know what we should be listening for? Do the differences, if there are any, manifest themselves on all recordings, at all volume levels, or on all equipment? Will a more advanced system five years from now reveal some distinction that my system today wasn't capable of? I don't know, and I don't have to know, because I use .wav.
Use compression only if storage space is really at a premium.
If I am not mistaken, if you start out with WAV or AIFF in iTunes, on a Mac or PC, and transmit the file over a network to an Airport Express or Apple TV, you are unknowingly transmitting Apple Lossless. I assume this is not the case when connecting your computer to a DAC via USB or Toslink, as Apple Lossless is not supported by any DAC I know of. iTunes does this, I assume, because transmitting a file half the size is faster and easier. Maybe that is why I cannot hear the difference when I A/B AIFF and Apple Lossless in my system. Not to start an argument, but AIFF and WAV are both native CD Redbook formats, AIFF being the Mac and Silicon Graphics native format, WAV being Microsoft's. If WAV doesn't support tags (I'm not sure on this, as I don't use it), you are making a LOT of extra work for yourself.
Does it matter that for FLAC, they can reconstruct a bit-exact copy of the original?
Can the same be said of Apple Lossless? If yes, then they should be 'equal', and equal to an uncompressed .wav or other format.
Magfan, I think that bit-exact IS important. I know that there are large differences when I turn on error correction when ripping to lossless, but I'm not sure if it's bit-exact. I can tell you that it's certainly almost exact, and I DO hear differences when I turn off error correction.
I wonder if anyone here knows how exact error correction is with lossless.
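One way to answer this for yourself is to rip the same track twice (say, once with error correction on and once with it off) and compare the resulting files byte for byte. A minimal Python sketch, with hypothetical file names standing in for your two rips (note that container headers or embedded tags can differ even when the audio data matches, so comparing untagged AIFF/WAV rips is the cleanest test):

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: two rips of the same track.
# If the digests match, the two rips are bit-identical.
# rip_a = file_digest("track01_error_correction_on.aiff")
# rip_b = file_digest("track01_error_correction_off.aiff")
# print(rip_a == rip_b)
```

If the digests match, any difference you hear between the two rips is coming from somewhere other than the data itself.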
Please don't use Apple Lossless--it is by far the worst of the lossless formats and there is a difference. It seems to strangle the life out of my recordings.
However, after retesting on my HD600 headphones and my reference system (dCS Delius + Purcell + B&W N802), I cannot for the life of me tell the difference between a .flac and .wav file. The same goes for .aiff, which is exactly like .wav except you can tag the files. If you want to save space and can do without iTunes, .flac is the only way to go in my opinion. Use Exact Audio Copy. If you have the space, in iTunes .aiff is the best format because it is totally uncompressed but the files themselves can be tagged with track and artist information.
Now with respect to error correction and bit perfect, I am not an expert but I think when you talk about "bit-perfect" error correction and "lossless" you are talking about two different things.
Bit-perfect and error correction deal with the initial rip of the CD. This is the process by which a program reads and rereads suspicious areas that might contain errors from dust, damage, scratches, etc., until it comes up with the right answer. EAC rereads these areas up to 8 times, I believe. Because error correction takes the errors one by one, over and over, it is possible that a disc that would generate audio artifacts when played in real time (due to dirt, dust, scratches, or other imperfections) can be perfectly extracted into an audio file. The difference between a corrected rip and a CD played in real time is that with the rip, the computer has time to think things over, if you will.
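The re-read-and-settle idea can be illustrated with a toy sketch. To be clear, this is not EAC's actual algorithm (real rippers also use drive error flags and caching tricks); it just shows the general principle of resolving a flaky read by repetition:

```python
import random
from collections import Counter

def resolve_sector(read_fn, attempts=8):
    """Re-read a suspicious sector several times and keep the value
    that the majority of reads agree on (toy model of re-read-based
    error recovery, not what EAC literally does)."""
    reads = [read_fn() for _ in range(attempts)]
    value, count = Counter(reads).most_common(1)[0]
    return value

# Simulate a flaky read that returns the correct bytes most of the time.
random.seed(1)
correct = b"\x01\x02\x03\x04"
def flaky_read():
    return correct if random.random() < 0.7 else b"\x01\x02\xff\x04"

# resolve_sector(flaky_read) will usually settle on the correct bytes,
# even though individual reads sometimes come back corrupted.
```

Played in real time, each bad read would become an audible artifact; given time to re-read, the ripper can settle on the right answer.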
Now, after the initial rip, regardless of whether you error corrected or not, the computer generates at least temporarily a raw .wav or .aiff file which is then converted into the lossless format. iTunes does this transparently but if you use a program like EAC, which claims bit perfect rips, you will see that after the initial file is "perfectly" ripped off the CD, EAC actually launches an external program that converts the .wav file into the format you want.
So really there are two issues. First, what impact do error correction and "bit perfect" rippers have on the uncompressed sound files? Second, what impact does converting to a lossless format have on those files?
Here is another question...if there is a difference in sound quality between raw .wav/.aiff and the lossless formats, is the difference the result of the encoding process or decoding process?
It seems to me that if "lossless" is actually lossless, then the following MUST be true: a .wav file, converted to a lossless file, and then converted back into a .wav file, should sound identical to a duplicate of the original .wav file that was never compressed. Try it!
If that is true, and these lossless formats do what they are supposed to do, then the only explanation for any difference in sound quality is that at the time of playback, the process of decompressing the lossless format impacts the sound.
It can't just be a question of CPU power, because I am assuming that there is some kind of memory buffer. Further, I host my files on two machines -- one is a quad core with 2 gigs of RAM at 2.6GHz and the other is a Core 2 Duo at 3.4GHz -- and I can still hear differences in the Apple Lossless files.
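The round-trip test suggested above is easy to run in miniature. Apple Lossless and FLAC are specialized audio codecs, but the defining property of any lossless compressor is the same, so here is a sketch using Python's built-in zlib as a stand-in (an assumption for illustration, not the ALAC codec itself): compress some raw PCM-like bytes, decompress, and confirm the result is bit-identical.

```python
import zlib

# Stand-in for raw 16-bit PCM sample data from a WAV/AIFF file.
pcm = bytes(range(256)) * 512  # 128 KiB of repetitive "sample" data

compressed = zlib.compress(pcm, level=9)
restored = zlib.decompress(compressed)

# The defining property of lossless compression:
# the decompressed data is bit-identical to the original.
assert restored == pcm
assert len(compressed) < len(pcm)  # and it really did shrink
```

If a .wav to lossless to .wav round trip ever produced different bytes, the codec would not be lossless; so any audible difference would have to come from the playback/decoding side, which is exactly the second question raised above.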
Thorough and thoughtful, Blackstonejd. Thanks for those posts. Hopefully we will someday know, definitively, answers to the questions you pose.
Opening another can of worms: an industry insider recently talked about playback software having its own sound. iTunes is said to have a "false transparency," whatever that means. Clearly, bit-perfect is only half the battle.