Isolation/coupling: basics?


I feel I need some education in this regard, and I guess I'm not alone... I read most of the discussions about it, but I couldn't find the basics: why?
Could anyone who understands the physics behind all this explain why those vibrations, resonances, and energies are so bad, especially for components without moving parts, such as amps?
dmitrydr
I don't personally make audio CDs, but audio CDs are often cited as an everyday application of Reed-Solomon error correction. The CD-ROM protocol on your PC may very well be different.

One thing that I learned recently is that in current engineering practice the purpose of error correction is not to correct errors. Rather, error correction is used so that the data transmission can be run at a much higher speed than the hardware could support error-free. Correctable errors are expected to occur. You give up some of your bandwidth to the redundancy of the coding, but you more than make it up in transmission speed.
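Here is a toy sketch of that trade-off in Python (a simple 3x repetition code, not the actual cross-interleaved Reed-Solomon coding a CD uses): two-thirds of the channel is spent on redundancy, but the data survives a raw bit error rate that would otherwise be unusable.

```python
import random

def encode(bits):
    # Toy 3x repetition code: every data bit is transmitted three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, p_flip):
    # Flip each transmitted bit independently with probability p_flip.
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def decode(bits):
    # Majority vote over each group of three received bits.
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

random.seed(0)
data = [random.randint(0, 1) for _ in range(30000)]
received = decode(noisy_channel(encode(data), p_flip=0.02))
errors = sum(d != r for d, r in zip(data, received))
print(f"raw channel error rate: 2.0%, residual after decoding: {errors / len(data):.3%}")
```

With the redundancy in place, a 2% raw error rate drops to roughly 0.1% after decoding, which is the sense in which "correctable errors are expected to occur."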
If error correction and speedy digital data processing are good things, why is it that CD "burns" made at high speed typically sound "inferior" to a burn made at 1X or 2X? At the same time, why do burned CDs have better readability from player to player when dubbed at lower speeds rather than higher speeds? As far as I know, digital is NOT like analogue, where spreading the signal over a wider / longer surface area increases dynamic range. I'm not trying to be a smart-ass here, just trying to better understand what's taking place and interject "real world" situations into this theoretical debate. Sean
sean...When someone asks a question that we can't answer off the top of our head we generally say "send us your data". So: what evidence exists (other than anecdotal) that CDs copied at low speed "sound better" and play more reliably?

In this case I will make an exception and take a stab at it. The Reed-Solomon coding used for CDs (and for any other application) is configured to deal with a certain bit error rate. In addition to the R-S error correction process, I believe that CD players implement a data interpolation process to minimize the impact of errors that are not fully correctable, perhaps because of physical damage to the disc. Discs copied at high speed may have a bit error rate in excess of the design value even when there are no scratches; interpolation may then be invoked, and you can hear it. Warning...this is just my educated guess.
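A rough sketch of what such an interpolation (concealment) stage does, assuming the decoder flags the samples it could not recover; the real player logic is more involved than this straight-line fill:

```python
def conceal(samples, bad):
    # Replace samples flagged as uncorrectable with a straight-line
    # interpolation between the nearest good neighbours on either side.
    out = list(samples)
    i = 0
    while i < len(out):
        if bad[i]:
            start = i
            while i < len(out) and bad[i]:
                i += 1
            left = out[start - 1] if start > 0 else (out[i] if i < len(out) else 0)
            right = out[i] if i < len(out) else left
            span = i - start + 1
            for k in range(start, i):
                frac = (k - start + 1) / span
                out[k] = round(left + (right - left) * frac)
        else:
            i += 1
    return out

samples = [0, 100, 200, 300, 400, 500, 600]
bad     = [0,   0,   1,   1,   0,   0,   0]   # two samples the decoder gave up on
print(conceal(samples, bad))  # the gap is filled with a smooth ramp between its neighbours
```

The concealed output is usually close enough to pass unnoticed, but it is an approximation of the original samples rather than a bit-exact recovery, which is why frequent interpolation could plausibly be audible.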

I will not attempt to explain how error correction codes work because I would not do it very well. I know there is an excellent explanation on the web (and a lot of confusing ones too) and I will try to find it again and point you to it.
Thank you El. I'm not saying that you are wrong, but my experiences and those of many others seem to contradict some of what you are saying. I also know that theory is called "theory" because it is not always "reality". Granted, "theory" is reality in many cases, but the manufacturers' "short cut" approach won't let the design perform as it should. Sean
Guys, extraction/write speed doesn't directly relate to sound quality; it depends on HOW you extract the audio data to the HDD and what CD-R drive you use to write it back.
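One way to check that point is to see whether the extracted data itself changes with read speed: rip the same track twice at different speeds and hash the results. If the hashes match, the speed made no difference to the bits. A minimal sketch (the file names are hypothetical placeholders):

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    # Hash a ripped audio file in chunks so large files don't need to fit in RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical file names: the same track extracted at two different read speeds.
rip_fast = file_digest("track01_extracted_at_40x.wav")
rip_slow = file_digest("track01_extracted_at_4x.wav")
print("bit-identical" if rip_fast == rip_slow else "extraction differs")
```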

Getting back to the subject :)...
Sean, you play a lot with isolation. Haven't you found any satisfying explanation of the phenomenon (the impact of vibration on a CD player, other than just data read errors, which may not be a problem if Eldartford is right) from an engineering perspective?