A 'good' answer is in the mind of the beholder.
I suspect there's no difference in conductivity after conditioning. What gets conditioned, mostly, is the insulation.
Jefferybehr is correct. In addition, at least according to the transcript I read of a forum of power cable/interconnect manufacturers, the consensus seems to be that the conductor/insulation interface goes back to its pre-break-in state after a period (several weeks) of non-use. Also, if ICs are reversed in the system, they have to be re-broken-in so that the interface can be (molecularly) reoriented. At least that's what I read, FWIW.
Nsgarch's additional info is consistent with my understandings.
The speed of insulation polarization is comparable to the speed of light, so a cable "breaks in" as soon as you connect it to the signal.
"The speed of insulation polarization is comparable to the speed of light, so a cable 'breaks in' as soon as you connect it to the signal."
Really? Nah...come on...this is a joke, right?
Not a joke.
Read any introductory book on electrostatics and you'll find the answer there.
The break in period depends on the conductor, conductor geometry, and dielectric. Ask any good cable designer; pay very little attention to free advice offered by those who quote introductory books as their source.
Marakanetz: I haven't the knowledge to debate what you say, but again, the cable manufacturers state that it takes a while (the break-in time) for the insulation/dielectric to react/respond to the current flowing in the conductor. And that after a time, the molecules in the insulation material (usually foamed Teflon these days) at the interface with the conductor are altered by the current flow in a way that creates even better dielectric properties than when new.
What I don't understand (assuming that's all true) is why it makes any difference to the Teflon which way the current is flowing, since it's non-metallic. Probably has to do with quantum theory, of which my knowledge is rather shallow.
If you have a high-resolution system, you can definitely hear a difference in brand new, unbroken-in cables and cables that have been properly broken in. I don't presume to be able to explain the physics of exactly what's going on here, and it may vary considerably from cable to cable, given the great diversity of designs, but anyone who has had much experience in this realm can confirm this; it's definitely not audiophile neurosis or hocus-pocus.
Within the past year I had the experience of ordering a complete new set of Purist Musaeus cables (two sets of interconnects and one set of speaker cables). When I initially hooked them up, I was somewhat disappointed. The manufacturer stated that the cables required a minimum of 100 hours of burn-in. Well, I might once have been skeptical of such a statement, but I can confirm from personal experience that after 100 hours, this set of cables sounded quite audibly better--smoother, better balanced, with better resolution of detail--than it did when brand new. Later I ordered an additional pair of new Purist Audio Musaeus interconnects, and the same experience was repeated: they sounded quite audibly better when they had completed break-in. This experience has been repeated by so many audiophiles that it is commonplace and well accepted by those who aren't so blinded by theory or what some textbook told them that they won't believe what their ears tell them.
Cable manufacturers, as well as dealers, are interested in pitching their product. Thus, to explain ridiculously high prices, "the sound becomes better after a few hundred hours of use" is exactly what you need in order to psychologically train yourself to hear it.
I may recommend non-introductory books as well, but they require basic prior knowledge to understand all the differences between a conductor, a dielectric, and a semiconductor.
No doubt about it Texasdave. Way too many people have shared your experience. My Stealth Indra took over 200 hours to really hit their stride. Could be Marakanetz just doesn't have the gear or ear to hear the difference. Nice rhyme huh?
Texasdave and Stanhifi,
I'm with you guys on this one, and I think most with high-end systems would agree. Burn-in time varies with different cables. Also, some cables are pre-cooked somewhat by the manufacturer. This holds true for components as well. I don't know where Marakanetz is coming from. He's entitled to his opinion, of course, but it doesn't conform to my experience.
I also think that break-in time is inversely proportional to the current through the cable, which is why PCs break in faster than speaker cables, which are faster than ICs. Tonearm ICs are the slowest, and in fact I'm thinking of making an RCA-to-DIN adapter so I can use my tonearm IC somewhere else in the system for a while.
Marakanetz, I'm sure you're right re: the physical properties of the materials, however then you should look for another reason for the phenomena we hear. It can't all be marketing hype or auto-conditioning. After all, ten thousand pairs of ears can't be wrong.
Stan, I'm sure you're really a sweet guy in person, oh and you should add Siltech and Magnan to your list of great cables :~))
Folks, the bottom line is (no-brainer, really): listening to cables is certainly different from listening to music. In other words, all these phenomena can only occur when you redirect your attention from the tunes to the wires.
How do you listen to wires?
Marakanetz was speaking for himself, of course, when he said "no brainer". That is quite accurate.
Maybe he meant no brain here.
NSGARCH, are you sure that the current in speaker cables is lower than PCs?
If one is feeding 16 watts to a 4 ohm speaker, you have 2 amps of current. The power amp feeding the speaker, even at 75% efficiency is using about 21 watts, therefore barely 0.2 amps are running through the PC.
I doubt that you use power amps at full power to break in the PC and even then a 120 watt amp would have only 1 amp (2 amps if both channels running) as current, certainly less than a speaker cable running at only 16 watts.
I think the statement that PCs break in more quickly than speaker cables due to higher current needs to be re-examined.
Salut, Bob P.
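Bob P.'s figures can be sanity-checked with a short sketch. This is a simplified model, assuming a purely resistive speaker load, a 120 V mains supply, and the 75% amplifier efficiency mentioned in the post; the function names are mine, for illustration only.

```python
import math

def speaker_current(watts, ohms):
    """Current into a resistive speaker load: I = sqrt(P/R)."""
    return math.sqrt(watts / ohms)

def wall_current(output_watts, efficiency=0.75, mains_volts=120.0):
    """Current drawn through the power cord: I = (P / efficiency) / V."""
    return (output_watts / efficiency) / mains_volts

i_spk = speaker_current(16, 4)   # 2.0 A through the speaker cable
i_pc = wall_current(16)          # ~0.18 A through the power cord
print(f"speaker cable: {i_spk:.2f} A, power cord: {i_pc:.2f} A")
```

On these assumptions the speaker cable carries roughly ten times the current of the power cord at this modest output level.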
Why is it that you cannot have a civil conversation with anyone? In every thread, you seem to want to forgo the discussion, and head straight for the insults.
Obviously, you have a lot of experience in this hobby. You could certainly opt to share it with this community in a respectful manner. Please consider adopting such a tone. Thank you.
Boa2, take care of your own house, if the wife allows it.
Bob, let's say you've got a SS amp which draws 225W at idle but draws 1200W at rated output (200W/400W into 8/4 ohms). So at, let's say, a nominal (audio) output of 100W/200W into 8/4 ohms, it's drawing around 600W from the wall. W/V = A, or 600/120 = 5 amps.
Typical amplifier nominal output voltage is around 50V for a 20dB voltage gain (over the preamplifier output), which is a pretty loud listening level if the speakers are reasonably efficient. Again using W/V = A, you get 2A (@100W) for an 8 ohm speaker and 4A (@200W) for a 4 ohm speaker, unless I'm way off somewhere.
An example would be my Levinson amp which will provide 400W/ch into my 4 ohm (nominal) electrostats, but at the loudest listening levels I can stand, it's only drawing 400W from the wall (or 3.3A) and it's only putting out around 150W rms of audio power, which at its 67V (26dB) gain, is only around 2.2A to the speakers (vs. 3.3A from the wall.)
The example you gave assumes an 8V output voltage which would be only about a 3dB voltage gain for the amp. Not very loud, even with a super-efficient speaker.
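The dB-gain arithmetic in this exchange can be sketched as follows. Assumptions: voltage gain in dB follows 20·log10(Vout/Vin), and the 2 V preamp output level is my illustrative figure, not one stated in the posts.

```python
def voltage_ratio(gain_db):
    """Convert a voltage gain in dB to a plain voltage ratio: 10**(dB/20)."""
    return 10 ** (gain_db / 20)

vin = 2.0                        # assumed preamp output voltage
vout = vin * voltage_ratio(26)   # ~40 V output for a 26 dB gain
i_load = 100 / vout              # A = W/V at 100 W output
print(f"Vout = {vout:.1f} V, current at 100 W = {i_load:.2f} A")
```

A 20 dB gain is a 10x voltage ratio, so small differences in the assumed gain or input level shift the output voltage figure considerably, which is part of why the posters' numbers diverge.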
so you guys agree there is no change in conductivity? only a change in the insulation?
Hemi: There are three things I'm aware of that affect or improve the conductivity of a given piece of wire. Two have to do with the crystalline structure of the metal:
1.) Working with the direction of the "wire draw", and honestly I don't know which direction has the better conductivity -- in the direction of the draw or against it.
2.) Cryo treatment, resulting in a more compact crystalline structure which improves electron flow through the metal.
3.) The third has to do with the cross-sectional geometry of the conductor -- ribbon vs. square vs. round, etc. and I don't think there is conclusive evidence regarding this issue.
Thanks, Nsgarch. My friend thinks he knows it all, and he will be in town this weekend. This will give me a little better idea (this being a little over my head) of how full of crap he is or isn't. Thanks, guys.
Nsgarch, I don't know how you calculate current to a speaker for a known power consumption, but your example using your Levinson is incorrect.
"An example would be my Levinson amp which will provide 400W/ch into my 4 ohm (nominal) electrostats, but at the loudest listening levels I can stand, it's only drawing 400W from the wall (or 3.3A) and it's only putting out around 150W rms of audio power, which at its 67V (26dB) gain, is only around 2.2A to the speakers (vs. 3.3A from the wall.)"
If 150 watts are being fed to a 4 ohm speaker, then I² = 150/4 = 37.5. Therefore I (amperage) = 6.12 amps.
Clearly that is higher than the 3.3A pulled from the wall.
At any rate, the amperage to the speaker will always be higher than the amperage from the wall to the amp, because the voltage to the speaker, for the same wattage as pulled from the wall, is lower than the wall voltage; therefore the amperage must be higher to give the same wattage.
inpep: You are using the formula A = √(W/R).
I am using the formula A = W/V. There is also another formula (Ohm's law), A = V/R.
They should all yield the same result, so perhaps we're just plugging in the wrong numbers? Additionally, there are power-factor considerations when using AC, although I'm pretty sure the output of an amp has no phase angle.
There's a neat formula wheel at:
and on the following page.
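The three formulas do agree whenever P, V, I, and R describe the same resistive load. A quick consistency check, using Bob P.'s 150 W into 4 ohms example:

```python
import math

P, R = 150.0, 4.0
I = math.sqrt(P / R)   # A = sqrt(W/R)   -> ~6.12 A
V = I * R              # from Ohm's law  -> ~24.5 V

# The other two formulas return the same current:
assert abs(P / V - I) < 1e-9   # A = W/V
assert abs(V / R - I) < 1e-9   # A = V/R
print(f"I = {I:.2f} A, V = {V:.1f} V")
```

So the disagreement is not between the formulas but over which voltage to plug in: the ~24.5 V implied by 150 W into 4 ohms, or a fixed rail-voltage figure.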
Nsgarch, I am plugging in the correct numbers. I don't think the V you are using is correct (I don't know where you get 40 volts). Why use V at all, when we have R and P and a formula to calculate I?
24.5 V is the correct voltage at the amp to give 150 watts into 4 ohms.
As I have stated before, the amperage from the amp to the speaker will always be higher than the amperage to the amp from the wall, since the voltage applied to the speaker, for the same wattage, will always be lower than the wall's 120 V. I don't know of any speaker requiring more than 120 volts to work!
With respect, Bob P.
Bob, I think what you're forgetting is that the voltage output of a given amp is a constant, just like the voltage in a wall socket. The exact amount of voltage is a function of the gain multiplier the amp is designed for, which for most amps (regardless of output capacity in watts) is about 25dB +/-, which translates into about 60V or more.
Amp output in watts is determined by the strength of the input signal as you turn the volume on the preamp up or down. And the current (which varies with the amount of watts the amp is putting out at different volume levels) is a function of the impedance of the load being driven.
So what I'm trying to say is it's the watts that an amp puts out that changes with the volume. And since the load (usually) and the voltage are constant, the only other variable is the current. That's why little amps run out of gas (clip) when trying to drive current-hungry speakers (like big multiple driver boxes or stats with low impedance) because they can't get the watts/current they need to produce decent sound pressure levels.
So thinking of an amp as a great big "equal sign" with Ohm's law on each side of a balanced equation is not how things actually work; plus there's also the additional issue of amplifier inefficiency to take into account.
Nsgarch, a speaker's volume output varies with voltage, and the power consumed is a function of the current and the voltage at the speaker.
The amplifier takes in a signal of a certain voltage and increases the voltage to a level the speaker can respond to. Of course the voltage at the output of the amplifier varies and isn't constant (otherwise the speaker would not get louder if the voltage did not increase), and the speaker draws whatever amps it needs at its specific impedance to produce sound at the given level. So a speaker with a 2.83 V / 8 ohm / 90 dB sensitivity figure will need about 8.95 V for a 100 dB output and 10 watts consumption. The amp will be supplying about 1.1 amps to the speaker while itself pulling about 0.12 amps from the wall, still lower than the current to the speaker.
Your figure of 40 V applies only when the amp is at full power, at whatever wattage it is capable of supplying into the speakers at the speakers' impedance. If that power is 400 watts into 4 ohms, then the current equals 10 amps. The current drawn from the wall at 400 watts output is, of course, somewhat higher than 3.33 amps, but still lower than the current going to the speakers.
With respect, Bob P.
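Bob P.'s sensitivity example can be worked through numerically. This sketch assumes the standard convention that 2.83 V into 8 ohms is about 1 watt and that each +10 dB of SPL requires 10x the power; the 70% amplifier efficiency is my assumption, chosen to approximate the wall-current figure in the post.

```python
import math

sens_db, ref_volts, ohms = 90.0, 2.83, 8.0   # 90 dB @ 2.83 V / 8 ohm
target_db = 100.0

p_ref = ref_volts ** 2 / ohms                          # ~1 W at the 90 dB reference
p_needed = p_ref * 10 ** ((target_db - sens_db) / 10)  # ~10 W for 100 dB
v_needed = math.sqrt(p_needed * ohms)                  # ~8.95 V at the speaker
i_speaker = v_needed / ohms                            # ~1.12 A to the speaker
i_wall = (p_needed / 0.70) / 120.0                     # ~0.12 A from the wall
print(f"P={p_needed:.1f} W, V={v_needed:.2f} V, "
      f"speaker I={i_speaker:.2f} A, wall I={i_wall:.2f} A")
```

Even at this modest 10 W level, the current to the speaker (~1.1 A) is roughly ten times the current through the power cord (~0.12 A), which is the point being argued.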
Break-in takes about 2 weeks of average playtime. The biggest differences occur around 48 hrs. By the way, I've found MIT Shotgun or better level cables have always outperformed other cables... I've owned a lot... I mean A LOT!!
>>Break in takes about 2 weeks average playtime<<
NO NO NO. There is no "average" time for cable break-in. It depends on the conductor, conductor geometry, and dielectric. It's different for all cables. Same thing with "biggest differences"; some are linear, others are not. Ask any cable designer.
Your opinion is erroneous and misleading. Readers should take it as your opinion as it is not based on any empirical data.
Stanhifi, you can also say that any break-in time is entirely an opinion and not based on any empirical data, as you put it.
Dave B's opinion of 2 weeks average time is no more erroneous than your opinion that there is no "average" time, and no more misleading than yours.
Salut, Bob P.
It is not my opinion sir; it is based on conversations with a number of cable designers. Thanks for your informative contribution.
Stanhifi... do you listen to music, or just measure it and talk about it? I've most likely owned more gear and cables than most. I have also talked to many learned audio gurus/owners of cable companies... most would not agree that over 2 weeks of 24/7 break-in is beneficial. Most cable designs offer mostly inconsequential differences in the sonic presentation... not overall fundamental improvements. Every time I have heard a properly set-up system with MIT cables, the experience was nothing short of breathtaking.
"Most cable designs offer mostly inconsequential differences [after break-in] in the sonic presentation . . . ."
That has not been my experience, and it has not been the experience of other audiophiles I know.
>>I've most likely owned more gear and cables then most.<<
Maybe most but not me. My post is correct; whether you agree with it or not is due to your lack of experience.
Oh, Stanhifi, it's the "cable designers'" opinions, then, and other "anecdotal" evidence or experiences that render others' opinions wrong? Interesting, but not very informative, in my opinion, of course!
What I said is that compared to a well-engineered cable like MIT, most other cables offer inconsequential differences. Relativity, my dear Stanhifi! I also thought that my newest cable was THE ONE on several occasions. Over time you hear the problems and the similarities between cables... most are operating in 2 dimensions compared to MIT!! Who has the patents??? Did you ever have a cable rig for over $20K and dare to admit it was not the best thing going? I have! I've also owned just about any major cable you can mention... and a lot of GIANT KILLERS... all do something... none do it all... MIT deals with all aspects of sound reproduction.
Stanhifi...I must admit that after A/Bing Monster cable MCX-1 Bi-wire speaker cable to my MH750Shotgun speaker cable..well, the Monster cable was SOOO MUCH BETTER!! Price $60 vs $1200 retail!!