Isolation/coupling: basics?


I feel I need some education in this regard, and I guess I'm not alone... I've read most of the discussions about it, but I couldn't find the basics: why?
Could anyone who understands the physics behind all this explain why those vibrations, resonances, and energies are so bad, especially for components without moving parts, such as amps?
dmitrydr
Eldartford, just to make sure: are you certain Reed-Solomon error correction is implemented in regular CD players? CD playback and long-distance signal transmission seem to be very different applications; this method is good when you need to recover as much information as possible and retransmission is not an option, but not when you need to get absolutely 100% of the information or to report an error. So I'm not really sure this method is used in data storage, where a re-read attempt doesn't cost much. When a CD-ROM drive reads a data file (I'm not sure whether the data is retrieved using Reed-Solomon error correction or not) and detects a CRC error, it re-reads at a lower speed. That is exactly why it's possible to get a bit-perfect copy of a CD using digital audio extraction, which is generally impossible with an audio CD player.
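The detect-and-retry behaviour described above can be sketched in a few lines. This is only a toy model, not actual drive firmware; the `read_fn` callback, starting speed, and retry count are all made-up for illustration:

```python
# Illustrative sketch: detect a bad block via a checksum, then retry
# the read at a lower speed -- the strategy CD-ROM drives are said to
# use when digital audio extraction hits a CRC error.
import zlib

def read_block_with_retry(read_fn, expected_crc, max_retries=3):
    """read_fn(speed) returns raw bytes; slow down and retry on CRC mismatch."""
    speed = 8  # hypothetical starting read speed (e.g. 8x)
    for _ in range(max_retries):
        data = read_fn(speed)
        if zlib.crc32(data) == expected_crc:
            return data              # clean read, checksum matches
        speed = max(1, speed // 2)   # slow down and try again
    raise IOError("uncorrectable block after retries")

# Toy "drive": the first read comes back corrupted, the second is clean.
good = b"audio-frame-data"
reads = iter([b"audio-frame-dat?", good])
data = read_block_with_retry(lambda s: next(reads), zlib.crc32(good))
assert data == good
```

The key point of the sketch is that a data protocol can afford to trade time (a re-read) for correctness, whereas real-time audio playback cannot.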

Whichever of us is right, it leads to the conclusion that a CD player is error-protected enough to be immune to vibration too. But that conflicts with practical experience: people claim changes in dynamics, at the least, when an isolation/coupling device is used...
I don't personally make audio CDs, but audio CDs are often cited as an everyday application of Reed-Solomon error correction. The CD-ROM protocol on your PC may very well be different.

One thing that I learned recently is that in current engineering practice the purpose of error correction is not to correct errors. Rather, error correction is used so that the data transmission can be run at a much higher speed than that which the hardware would support without errors. Correctable errors are expected to occur. You give up some of your bandwidth to redundancy of the coding, but you more than make it up in transmission speed.
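The redundancy-for-speed trade described above can be shown with the crudest possible error-correcting code. A 3x repetition code is far simpler than the Reed-Solomon coding CDs use, but the principle is the same: spend extra bits so that errors from a fast, noisy channel can be corrected on the fly:

```python
# Toy error-correcting code: each bit is sent three times, so any single
# flipped bit per group can be corrected by majority vote. We give up
# two-thirds of the raw bandwidth to redundancy, as described above.

def encode(bits):
    return [b for b in bits for _ in range(3)]  # repeat each bit 3x

def decode(coded):
    # Majority vote over each group of three corrects one flipped bit.
    return [1 if sum(coded[i:i+3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[4] ^= 1                # the "channel" flips one bit in transit
assert decode(tx) == msg  # the error is corrected transparently
```

Correctable errors like the flipped bit above are expected and invisible to the listener; the coding only fails when the error rate exceeds what it was designed for.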
If error correction and speedy digital data processing are good things, why is it that CD "burns" made at high speed typically sound "inferior" to a burn made at 1X or 2X? At the same time, why are CDs burned at lower speeds more readable from player to player than those burned at higher speeds? As far as I know, digital is NOT like analogue, where spreading the signal over a wider / longer surface area increases dynamic range. I'm not trying to be a smart-ass here, just trying to better understand what's taking place and interject "real world" situations into this theoretical debate. Sean
sean...When someone asks a question that we can't answer off the top of our heads we generally say "send us your data". So: what evidence exists (other than anecdotal) that CDs copied at low speed sound better and play more reliably?

In this case I will make an exception and take a stab at it. The Reed-Solomon coding used for CDs (and for any other application) is configured to deal with a certain bit error rate. In addition to the R-S error correction process, I believe that CD players implement a data interpolation process to minimize the impact of errors that are not fully correctable, perhaps because of physical damage to the disc. Discs copied at high speed may have a bit error rate in excess of the design value even when there are no scratches; interpolation may be invoked, and you can hear it. Warning...this is just my educated guess.
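The interpolation ("error concealment") idea guessed at above can be sketched like this. Real players use more sophisticated concealment than a simple neighbour average, and the sample values and bad-sample marker here are invented for illustration:

```python
# Hedged sketch of error concealment: when a sample is flagged as
# uncorrectable, estimate it from its neighbours instead of outputting
# garbage. A click becomes a tiny, often inaudible, smoothing error --
# but if it happens often enough, it becomes audible.

def conceal(samples, bad_indices):
    out = list(samples)
    for i in bad_indices:
        left = out[i - 1] if i > 0 else out[i + 1]
        right = out[i + 1] if i < len(out) - 1 else out[i - 1]
        out[i] = (left + right) // 2  # linear interpolation of neighbours
    return out

pcm = [100, 200, -9999, 400, 500]   # -9999 marks a corrupted sample
assert conceal(pcm, [2]) == [100, 200, 300, 400, 500]
```

This would fit the practical observation in the thread: a disc whose raw error rate is pushed past the correctable limit doesn't fail outright, it just degrades subtly as concealment kicks in more often.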

I will not attempt to explain how error correction codes work because I would not do it very well. I know there is an excellent explanation on the web (and a lot of confusing ones too) and I will try to find it again and point you to it.