mV output from cartridge, please explain


I understand that cartridges are rated by how much output voltage they generate when using certain standard test records. How does this relate to real life? If a cartridge is rated at 1 mV then is that the average level at an average volume? Or is it a maximum? Or what?

The reason I ask is I am looking at the overall dB gain of several phono preamp---line preamp---power amp combos and I'm trying to decide if I will have enough gain to drive my power amp to full power.
herman
Most phono stages will accommodate this cartridge. It is only when you get down to 0.3 mV or lower that you will need a "deluxe" phono stage or step-up transformer.
The rated output of a cartridge (such as 3.0 mV or 0.5 mV) is the voltage of the output signal, usually based on the stylus tracking at a specific velocity (for example, 5 cm per second). Moving coil cartridges usually have a voltage output of less than 1.0 mV (although there are a few "high output" MC cartridges with output of 1.5 mV or so). Moving magnet cartridges usually have output levels in the 3-6 mV range (although a few may run higher).
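Since the rating is tied to a reference velocity, the actual output at any moment scales roughly linearly with how fast the stylus is moving. A minimal sketch of that idea (Python; the 5 cm/s reference and the example numbers are assumptions for illustration, not from any particular spec sheet):

```python
# Estimate cartridge output at an arbitrary stylus velocity, assuming
# output voltage scales linearly with velocity (the usual approximation).

REF_VELOCITY_CM_S = 5.0  # typical reference velocity used for the rating

def output_at_velocity(rated_mv, velocity_cm_s, ref=REF_VELOCITY_CM_S):
    """Scale the rated output (mV at `ref` cm/s) to another velocity."""
    return rated_mv * velocity_cm_s / ref

# A 5 mV moving magnet cartridge on a loud 10 cm/s peak:
print(output_at_velocity(5.0, 10.0))  # 10.0 mV
```

So the rated figure is neither an average nor an absolute maximum; loud passages swing above it and quiet passages sit well below it.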

The output level of the cartridge determines how much gain, measured in dB, must be provided by the phono preamp or step-up transformer. Usually, moving coil cartridges need about 60 dB of gain from the phono preamp (before the signal goes to the main preamp), and moving magnet cartridges usually need around 35-40 dB of gain from the phono preamp.
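Those gain figures follow from the standard voltage-ratio formula, gain(dB) = 20 * log10(Vout / Vin). A quick sketch (Python; the ~300 mV target line level is an assumed example, not a universal standard):

```python
import math

def required_gain_db(cart_output_mv, target_mv):
    """Gain in dB needed to raise cart_output_mv up to target_mv."""
    return 20 * math.log10(target_mv / cart_output_mv)

# 0.5 mV moving coil brought up to a ~300 mV line level:
print(round(required_gain_db(0.5, 300.0), 1))  # 55.6 dB
# 5 mV moving magnet brought up to the same level:
print(round(required_gain_db(5.0, 300.0), 1))  # 35.6 dB
```

Those results land right in the ~60 dB (MC) and 35-40 dB (MM) ranges quoted above.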

The voltage needed to drive your main power amp is determined by the output voltage of the linestage preamp, not the output of the cartridge / phono preamp combo.
How it relates to real life is the output voltage times the amount of gain it is subjected to all along the signal path. The signal starts out at, say, 1 mV, and then goes into a phono section. It then goes through the gain stages of amplification, and the voltage is increased by whatever gain those amplification stages have.

That's where the rubber hits the road. You have to have enough gain to bring the voltage up to the point where you can drive the input of the amplifier to full volume. This spec is called the input sensitivity of the amp. If the input sensitivity of the amp is 2 V, then the maximum output of the preamp must be at least 2 V in order to drive the amp to maximum output. If the gain in the phono stage and linestage combined cannot raise the signal to the full 2 V, then the preamp will not be able to drive the amp to its maximum level. This can happen when the cartridge output is too low for the available gain stages to bring it up to the 2 V level.

So you have to have enough cartridge output for the preamp to amplify it enough to drive the amp to max. If it is too low, then you have to crank the volume wide open to get decent listening levels. If it is too high, then you can barely crack the volume control and it gets really loud. Also, if it is much too high, then you can overload the phono section inputs and cause distortion.

The reason it is hard to give a flat figure for this is that all phono stages and linestages have different gain specs. Some are high, and some are low. In addition, many amps have different input sensitivities, so a front end that works perfectly with an amp with 2 V input sensitivity would be way too hot for an amp with 0.3 V input sensitivity.

The best way to calculate this is to take your cartridge output voltage and figure out how much gain is needed to match, or slightly exceed, the input sensitivity voltage of your amp. That way you will know you can drive your amp to full volume with the voltage that comes from your preamp output. Then compare this gain figure with the combined gain specs of your phono stage and linestage.
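That calculation can be sketched as follows (Python; the cartridge output, gain figures, and amp sensitivity are hypothetical example numbers):

```python
def max_output_volts(cart_output_mv, total_gain_db):
    """Peak preamp output voltage for a given cartridge output and total gain."""
    return (cart_output_mv / 1000.0) * 10 ** (total_gain_db / 20.0)

def can_drive_to_full_power(cart_output_mv, phono_gain_db, line_gain_db,
                            amp_sensitivity_v):
    """True if the combined gain can reach the amp's input sensitivity."""
    volts = max_output_volts(cart_output_mv, phono_gain_db + line_gain_db)
    return volts >= amp_sensitivity_v

# 0.5 mV MC cartridge, 60 dB phono stage, 14 dB linestage, 2 V amp:
print(can_drive_to_full_power(0.5, 60.0, 14.0, 2.0))  # True (~2.5 V available)

# Same cartridge with only a 12 dB linestage falls just short (~1.99 V):
print(can_drive_to_full_power(0.5, 60.0, 12.0, 2.0))  # False
```

Note how close the margins can be: two dB of linestage gain is the difference between reaching full power and falling just short, which is why it pays to run the numbers rather than guess.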