Major help needed with input and output impedance.


Okay, so if an amplifier has a 10 kohm (10,000 ohm) input, what should the output impedance of the preamp be? I see anything ranging from 50 ohm to 600 ohm on preamp outputs, and amplifier inputs anywhere from 10 kohm to 20 kohm, 30 kohm, 50 kohm, etc. I know there is a rule of thumb, but what is weird is that most of the manufacturers I see building amps with 10 kohm up to about 50 kohm inputs are building preamps with only about 50 ohm outputs... nothing near what some other manufacturers use, between 220 ohm and about 750 ohm.

I will soon have a preamp with a 600 ohm output, and my amps have only 10,000 ohm inputs. So what does this mean? It almost seems that the 600 ohm output may not be compatible, judging by the matching most manufacturers use, since their output impedances are generally much lower than my preamp's. So what happens: do I get less gain, more noise? What can go wrong if the impedances of these two components are mismatched? Or is all this irrelevant, and should I just get the preamp, listen, and not worry about it? I swear I read something about this topic at some point and really want to make sure I know what I am doing.

Thanks. P.S. In my post I used 'k' for multiples of 1,000, in case anybody got confused when I wrote out the whole number and then switched to the abbreviation.
matrix
You optimize voltage transfer if the preamp's output impedance is minimal and the amp's input impedance is maximal. However, there are other problems when these are pushed to extremes (noise/RF pickup, etc.) and, once you get near two orders of magnitude difference, further improvements are asymptotic. (600 ohms into a 10k load should be OK as long as the preamp has adequate current.)
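Here's a quick back-of-the-envelope sketch of the voltage-divider math being described (a Python sketch, treating the preamp output and amp input impedances as simple resistances, which is only an approximation at audio frequencies):

import math

# The preamp's output impedance (Zout) and the amp's input impedance (Zin)
# form a voltage divider, so only a fraction of the preamp's open-circuit
# output voltage actually appears at the amp's input.
def voltage_fraction(z_out_ohms: float, z_in_ohms: float) -> float:
    return z_in_ohms / (z_in_ohms + z_out_ohms)

def loss_db(z_out_ohms: float, z_in_ohms: float) -> float:
    """The same fraction expressed as a level loss in dB."""
    return 20 * math.log10(voltage_fraction(z_out_ohms, z_in_ohms))

# The case in this thread: a 600 ohm preamp output into a 10 kohm amp input.
print(voltage_fraction(600, 10_000))  # ~0.943 -> about 94% of the voltage arrives
print(loss_db(600, 10_000))           # ~ -0.5 dB, a very small level loss

(This ignores cable capacitance and the preamp's current limits, which is where the extremes mentioned above come in.)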

Some manufacturers don't rely on voltage transfer but, instead, are current-based and these depend on matching the input and output impedances.

Kal
I could be wrong, but I'm more concerned with the actual gain and output voltage of the preamp. The low output impedance seems normal to me; your preamp should do just fine. I always equated the preamp's output impedance with how well it can handle the impedance of the interconnects and the load of the amplifier's input. The lower the output impedance, the less the length of the interconnect becomes a factor in how it performs. I would think too much gain or output voltage could be more of a problem: you end up with either a hair-trigger volume control or one that has to be turned up more than halfway before you reach a comfortable level.

If my amplifier has high gain, say 25 or 30 dB, and it only takes 1 volt to reach full power, I wouldn't want a preamp with 30 dB of gain that claims 15 volts of output. That would be like using a megaphone while talking into a microphone... I think you get my drift ;-). Like I said, I could be wrong, but this is what I look at. Someone more technically inclined than myself can put it in engineering terms.
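Just to put rough numbers on that megaphone analogy (a sketch only; the 2 V source, 30 dB preamp gain, and 1 V amp sensitivity are hypothetical figures from the post above, not anyone's actual gear):

# Convert a voltage gain in dB to a plain voltage ratio: ratio = 10^(dB/20).
def db_to_ratio(gain_db: float) -> float:
    return 10 ** (gain_db / 20)

source_v = 2.0                                # hypothetical 2 V source
preamp_out_v = source_v * db_to_ratio(30.0)   # 30 dB of gain is roughly x31.6
print(preamp_out_v)                           # ~63 V, far beyond any real preamp's swing

# Against an amp that reaches full power at 1 V in, the volume control would
# have to live near the bottom of its range -- the "hair trigger" problem.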
Hmm, well, the amp has no input gain spec shown, but the preamp is rated from 2 volts to 10 volts of output. So if I had a 2 volt CD player, I would guess I'd get 2 volts out of the pre, with a maximum output of 10 volts. In my case I have a 4 volt CD player, so I assume it will pass 4 volts down to the amp... I would guess the only way to drive the pre to its full 10 volt output would be something like a 4 volt CD player feeding an equalizer or something else that boosts the signal further, and then you would be at the max. So I am not really worried about the preamp being overdriven, because no source I own, or have even seen sold, puts out that kind of voltage in general (maybe some phono stage?). The gain spec isn't shown on the preamp's papers either, but I would guess it is a 20 dB gain stage.

It does have a signal-to-noise ratio of 112 dB, which is pretty good from what I understand, but I don't know whether that spec tells us anything. It is also rated for a frequency range of 10 Hz to 200,000 Hz, so it's a high-end piece, not the standard 20 Hz to 20,000 Hz most gear is spec'd at. Again, this all may mean nothing?
Voltage-gain/sensitivity issues are different from impedance 'matching'. The rule of thumb on output/input impedances, the original issue, is 1:20. The 'rule' says that if the two impedances are at a ratio of at least 1:20, the preamp will be able to deliver full voltage into the amp. If the ratio is less than 1:20, the preamp MAY not be able to deliver full voltage and/or the preamp/amp system may be more susceptible to the characteristics of the interconnect cables than we'd like.

600 ohms into 20k ohms is a ratio of 1:33 and should present NO voltage-delivery or cable-dependency problems.
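For what it's worth, here is the same arithmetic run for both the 20k example above and the original poster's 10k amps (a sketch, using the simple resistive-divider approximation from earlier in the thread):

import math

# The "1:20" rule of thumb is just the amp input impedance divided by the
# preamp output impedance; the divider loss shows why large ratios barely matter.
for z_out, z_in in [(600, 20_000), (600, 10_000)]:
    ratio = z_in / z_out
    loss = 20 * math.log10(z_in / (z_in + z_out))
    print(f"{z_out} ohm into {z_in} ohm: ratio 1:{ratio:.0f}, level loss {loss:.2f} dB")

# 600 into 20k -> 1:33, about -0.26 dB
# 600 into 10k -> 1:17, about -0.51 dB (the original poster's case)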

GENERALLY, one would like very low output impedances and relatively high input impedances, but there are tradeoffs to both. Tubes, generally, are high-impedance devices, and some designers and audiofools believe there are compromises inherent in the use of cathode followers, the most-common method of reducing output impedance.
.
Not necessarily so, Matrix. The standard output voltage of most CD players is 2 volts, and the standard input sensitivity on most amplifiers is 1.5 to 2 volts. I've seen amplifiers that need 4 volts to reach full power (the designer claimed quieter operation this way). You should be able to get 10 volts from the preamp using a 2 volt source without problems. The preamp may start at just a tenth of a volt and go up from there on the volume control. You really should be fine with the preamp. As far as the crazy specs go, IMHO the 200 kHz is pretty meaningless. Most of us male humans can't hear past maybe 18 kHz... I don't know, though, some audiophiles do claim to hear like bats... LOL.

The variation in dB from point to point would also have more meaning for me. For example, a frequency response of 10 Hz to 20 kHz +/- 3 dB or better is very good. Some may claim +/- 1 dB variation... it doesn't get much better than that. Most high-end speakers don't achieve such a flat response anyway. The only commercial speakers I've seen claim anything close are the Green Mountain Audio speakers (after hearing them, I believe it too). Even then, once you get into the lower bass region the variation still widens to +/- 3 dB... which ain't bad at all. There should be a certain amount of rolloff to get the correct decay and timbre of certain instruments, IMHO.
Gmood1, thanks. I do understand the standard is 2 volts, but I can assure you that my Wadia CD player in fact puts out the pro-audio standard of 4 volts at the RCAs as well as the XLRs, which is why it sounds much fuller and more powerful than most players. As for speaker frequency range, mine go 16 Hz to 25,000 Hz, so no problem there either; whether it can be proven that we hear or feel it is another story, but I guess the frequencies exist, and that's why they build products beyond the call of duty. Not sure why, but all the top manufacturers' flagship preamps are spec'd from 10 Hz or even 5 Hz up to 100 kHz or 200 kHz. Is it necessary? Probably not, but I guess those are the flagship specs, because yes, the run-of-the-mill and even fairly high-end pieces only claim 20 Hz to 20 kHz... Who knows. Thanks, I will see what this preamp does. I was just concerned about the impedance match, but it should be okay.
Hi Matrix,
I have no doubt the Wadia puts out up to 4 volts; I was more or less talking about why 2 volts is more than sufficient in real-world applications. Sorry, you may have misunderstood me. As far as those flagship specs go, look a little deeper and you'll see equipment that isn't held up as a flagship (because of price) that has the same qualities. IMHO it has more to do with well-engineered gear than with a flagship stamp. Not sure if the Wadia has this feature, but if you want fuller sound and dynamics, have a modifier install inline output transformers (bypassing the caps or op-amps on the analog output of the player). That's if it uses dual differential DACs like some Denon, Pioneer, and Esoteric players. This is what gives you the full potential of the player's dynamics and transparency. Most stock equipment, regardless of cost, doesn't have this. You will find it in some DIY gear, though, since manufacturing cost isn't an issue there.

Well enough babbling on my part... enjoy the new preamp.

Good listening
I know what you're saying; I was just reiterating that, for some reason, the "flagship" pieces seem to spec frequency response that's out of this world, and whether it helps anything is another question. Actually, that's funny: I did just have the Wadia modded, with a new clock, all caps replaced with Rubycon CL/CZ (supposedly better than Black Gates, but Rubycon makes the Black Gates, so I don't know), WBT NextGen jacks, silver rectifier bridges, etc. They told me the output stage in the Wadia is about as simple as it gets, with one op-amp and that's it, and they believe it is a very good analog output stage already. I could get the silver transformers put in, but we are talking a $1,000 mod, and I am skeptical about going that far, to be honest.
Thanks for the tips. I hope the preamp works out too!
Jeffreybehr, I do have a tube hybrid preamp, but you lost me a little... So I should be capable of full output with 600 ohm pre-outs and 10,000 ohm inputs on the amp? Sorry, you used a spec double that as an example and it got a little confusing. Thanks.
Jeffreybehr, what I got from what you are saying is that my ratio would be about 1:17? That seems to mean I won't get full output then, but maybe I am not getting it. Thanks again.
Matrix, the 1:20 ratio is just a rule of thumb, and I suppose there are almost as many systems with ratios under that number that operate and sound great as there are systems with ratios over that number that don't operate perfectly and don't sound great. I suppose your 1:17 is marginal.

I can't do more than speculate from here, but if I owned and loved those pieces and could configure my system with SHORT interconnects and long speaker cables, that's what I'd do.

How long are the interconnects?
.
Matrix, unless you sense a slight loss of dynamics and/or a softening or absence of low frequencies, you are driving your amp OK.
The impedance "matching" Jeffreybehr refers to applies in cases where you need to transfer maximum power -- in which case the source and load need the same impedance (typically 600 ohms). Here you're transferring voltage, not power.

On the subject of high-frequency reproduction: equipment that can play, say, 100 kHz +/-0.1 dB is useful for its apparent speed, not because we can hear those frequencies. It just shows how linear the reproduction can be across a wide frequency band. As an example, my amp specs -0.1 dB at 100 kHz, -3.1 dB at 500 kHz, and -12 dB at 2 MHz. It's not quite linear; the losses accelerate as you go up.
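A small numeric illustration of that power-versus-voltage distinction (again just a sketch, treating source and load as plain resistances): power into the load peaks when the load impedance matches the source impedance, but the voltage at the load keeps rising as the load impedance climbs, and voltage is what a line-level preamp-to-amp connection cares about.

# A 1 V source (arbitrary) with a 600 ohm output impedance driving various loads.
V_SOURCE = 1.0   # volts
Z_OUT = 600.0    # ohms

for z_load in (150.0, 600.0, 2_400.0, 10_000.0, 100_000.0):
    v_load = V_SOURCE * z_load / (z_load + Z_OUT)  # voltage divider
    p_load = v_load ** 2 / z_load                  # power dissipated in the load
    print(f"{z_load:>9.0f} ohm load: {v_load:.3f} V, {p_load * 1e6:6.1f} microwatts")

# Power peaks at the matched 600 ohm load (~417 uW); voltage keeps improving
# right up through the 100 kohm load, which is why line-level gear aims for
# low output and high input impedances rather than a 600 ohm "match".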
I got you, just need to experiment, use the best equipment possible to get the job done, thanks guys.
They are 6 ft... they could possibly be shortened a bit, maybe to 4 ft if necessary, but they go to monoblocks, so there isn't much room for adjusting the distance.
Okay guys, here's what I found:

Balanced Audio Technology in fact builds a preamp with 750 ohm outputs, and their chart recommends not going below a 10,000 ohm amp input... they have models capable of driving even some ridiculously designed amps with 3,000 ohm inputs, but those need their optional "Bat Pak".

I also called McCormack Audio. I do not own their equipment, but their flagship amplifier, the DNA-500, has a 10,000 ohm input... In the words of their engineers, anything over 10 times your preamp's output impedance is perfectly fine. So a 600 ohm preamp is good enough to drive even a 6,000 ohm input impedance amplifier, which is rare to find. They even went on to say the weirdest and hardest-to-drive amp they have seen to date (he could not remember the name) had a 2,000 ohm input, and they believe the only preamps they would have available to drive it are from their sister company Conrad-Johnson, with something in the 50 ohm output range...

So from what I found, the 1:20 ratio is not the rule; it's more like 1:10, I guess. Thinking back, I have heard that a multiple of 10 was always the rule. So I guess we are okay. Thanks guys, it was good info.