Try comparing your line stage inverted or not inverted. You may not hear a difference, and then it would not be a problem. Many people cannot hear absolute phase differences.
My only idea so far is, seeing that I only have one line input (cd), is cutting open the RCA cable and switching + and - of the line, and connecting the speakers to the power amp the correct way. So, I'll do just that.

Don't! If the cd player and the preamp both have 3-prong power plugs, or if any other connection path were to be present between the chassis of the two components (now or in the future), doing that would short the cd player's output to ground.
Not sure why you would have a hum as a result of swapping cartridge leads. Are you sure you did that correctly, which usually means interchanging red and green with each other, and interchanging blue and white with each other?
Also, I'm assuming that when you say "phase correct for the phono stage, phase inverting for the line stage," you mean that the preamp output is phase correct with respect to the phono input, and that the preamp output is phase inverting with respect to the line level input. That would mean that the phono stage section of the preamp, in itself, is phase inverting. Otherwise you could simply switch the speaker connections and both sources would be phase correct, since the phono stage output goes through the line stage.
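Almarg's reasoning can be sketched with a toy model: treat each stage as a polarity factor of +1 (non-inverting) or -1 (inverting) and multiply the factors along the signal path. The stage values below are assumptions for illustration only, not the PV12a's actual topology.

```python
# Model each stage in the signal path as a polarity factor:
# +1 = non-inverting, -1 = inverting.
def net_polarity(*stages):
    """Overall polarity is the product of the per-stage factors."""
    result = 1
    for stage in stages:
        result *= stage
    return result

# Assumed values for illustration only:
PHONO_SECTION = -1  # phono section itself inverts (Almarg's inference)
LINE_STAGE = -1     # line stage inverts (per the manufacturer's note)

# A CD passes through the line stage only: net inverted.
cd = net_polarity(LINE_STAGE)

# Vinyl passes through the phono section AND the line stage:
# the two inversions cancel, so the phono path is phase correct overall.
vinyl = net_polarity(PHONO_SECTION, LINE_STAGE)

print("CD path:", "inverted" if cd == -1 else "correct")
print("Vinyl path:", "inverted" if vinyl == -1 else "correct")
```

This matches the manufacturer's wording only if the phono section is itself inverting; if it is not, both paths would come out inverted together and a single speaker-lead swap would fix both.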
Definitely don't switch the interconnect wires. Way too many risks in doing that. The only way I can think of to do this without getting a different preamp would be to add a switch to each speaker wire. That way, you can flip a pair of switches and reverse the speaker polarity. It can be done with a pair of double pole, double throw switches.
Phase inverting... whose idea was that? Clearly, I'm not an electrical engineer, but who in their right mind wants to reverse leads on their cartridge or change the wires on their speakers just to play a record? If phase inversion can't be done with the flip of a switch, that component (no matter how good) will never find its way into my system.
Thanks for the responses.
Can I tell? No, actually I can't. But I just want to do it as correctly as I can. You never know. I don't want to find out in five years I've been listening to a sound that could have been a lot better.
Almarg makes a good point - the phono stage does go through the line stage. I don't have a manual, and the manufacturer's site just says: "Line Stage phase inverting, Phono Stage phase correct," which sounds pretty ambiguous to me. I've sent them an email. Unless someone here knows the Conrad-Johnson PV12a and knows what it does?
Why does it hum when reversing the wires? No idea. I use a Technics SL1200MK2 with the standard tonearm, if that helps. I see a black ground wire going into the bottom of the tonearm, but I don't know where it is connected. I suspect the ground is connected to the inside of the tonearm (I can't check, since the outside of the tonearm isn't conductive). The green wire (R-) is connected to the cartridge body, I can see that. But as far as I can make out, the cartridge body is not connected to the tonearm / ground. I can't figure out why switching the red/green and blue/white would cause a big hum, but it does.
This is simple:
1- Your PV-12A is phase inverting. Reverse the red/black connections on the back of EACH speaker. DO NOT try reversing ICs.
2- Your phono stage is phase correct - so leave the cartridge headshell leads alone (wired normally).
FWIW, I used to own a PV-12L and couldn't tell a big difference whether the speakers were wired in or out of phase. I believe that Magfan is correct - your recordings are going to be both in and out of absolute phase, sometimes from track to track. My advice is to listen to your system connected both ways, see which one works best for you, then just leave it alone.
The only way you are going to be able to hear absolute phase is if the recording is done with 2 microphones only.
The vast majority use more than this. As soon as other phase relationships are introduced by any additional mics, the delicate information that allows you to hear absolute phase is destroyed.
On top of that, as pointed out earlier, 1/2 of your recordings are out-of-phase anyway! Unless they are done with 2 microphones, you will not be able to sort out which ones are which.
While I would quibble with Atmasphere's and Geoffkait's 50% figure as just a guess that recording engineers are largely indifferent to absolute phase, I would say that, having the capability to change absolute phase, I hear very few instances where there is a clear choice. Nevertheless, I understand the nagging concern about having both vinyl and digital in the same phase. It is common for each stage of amplification up to the amplifier itself to invert.
I even went so far once to install double pole/double throw switches on my speaker wires to allow switching of absolute polarity there.
Recording engineers do not take note of whether their mixing board or recorders are inverting or not. Nor does it show up in the manuals for such.
The simple fact is that a recording may or may not be absolute phase inverted. There is no coin, no decision; the issue never comes up. When you send the recording out to be mastered (in the case of LP) you have no idea if the result will be inverted or not. For example, the manual for my Westerex stereo cutter system does not mention a thing about it. Neither does the manual for the Masterlink (a digital setup that can do mastering) or the various recording programs we have on computer. It's not a consideration in the studio. Neither does it show up in the manuals for our analog recorders. It's just not a consideration.
So you can be assured that 50% of all recordings are in phase and the other half are not.
So you can be assured that 50% of all recordings are in phase and the other half are not.

Ralph, to what extent do you feel that holds true for simply mic'd classical recordings that are recorded in halls, and are produced by labels that are audiophile oriented and/or high quality?
Actually, the question of absolute phase is a very big consideration in recording and mastering.
First, every professional microphone specifies its wiring polarity referred to sound-pressure polarity - this is crucial for any kind of consistency in application of microphone techniques. Also, keep in mind that a large percentage of microphone models are used in both a recording and a sound-reinforcement context, and in the latter case it's extremely important to keep track of absolute phase as the sound of the instruments (or instrument amps) themselves interact directly with the front-of-house and monitor loudspeakers. (Anybody who's worked with older JBL stuff should be familiar with these phasing issues . . . the driver labelling is reversed.)
Second, in either the live or studio context, a great number of sound sources and equipment processing loops are ultimately mixed together, and at the very least the relative phase is inarguably critical. So in practice, the correct connection of all equipment, wiring, and patchbays (observing individual TRS and XLR pinouts, etc.) is a cornerstone of good workmanship in professional practice.
So unless somebody's made a mistake, absolute phase should indeed be preserved all the way through the studio recording chain, and also through the chain at the mastering studio, especially if it's digital. For record lathes, I know that the Neumann and Ortofon cutting amplifiers are very clearly specified as far as their absolute input phase, and of course the cutting-head MUST be properly phased to the amplifier or it will oscillate and destroy itself.
Now there are generally three places in the recording chain where phase can be deliberately manipulated - when tracking, on mixdown, or in mastering . . . through the use of a phase-inversion switch on the mic preamp, console channel strip, or mastering console. In practice, all of the switches start out "non-inverted", and the phase of a particular channel/microphone is inverted only when necessary for specific interactions . . . for example, when two mics are used for the top and bottom of a snare drum. Inverting a single microphone during tracking is generally frowned upon; keeping the audio as un-molested as possible until mixdown is the usual goal. For the overall absolute phase, the mastering engineer usually makes a final decision.
As for Al's question, most of the guys I've met who record on-location with minimal microphone techniques pay very close attention to absolute phase, especially with M/S and Decca Tree configurations. If there are any distant "hall" omnis, they'll usually phase these to preference, while keeping the primary mics uninverted. They also almost always track directly to digital, making it easy to keep everything the same through mastering.
Now after all this blathering, I agree with Atmasphere as to the general utility of an absolute-phase switch in a home reproduction context, if maybe for different reasons. I feel that there are very few reasons not to design equipment or wire a system so that absolute phase is maintained. But how much it matters is one of those old, unsolvable audio debates.
The absolute phase of any loudspeaker through the midrange and treble can be thought of as arbitrary, as connection polarity, driver response, and crossover designs vary considerably. And in the lower frequencies, the interaction between the cabinet/driver/port tuning and the room trumps everything else. Maybe for headphone listening there's some validity to phase absolutism, but through loudspeakers it's a "flip to your taste" kind of thing, if you care to take the time . . . which I personally don't.
Kirkus is right in that in some ways the phase is preserved - but I think if one looks into the equipment, the designer will not be seen to have much interest in preserving the absolute phase of the inputs.
I've used a lot of boards over the years and serviced them as well. What I have found over and over is that while they maintain certain standards, for example that pin 2 of the XLR might be non-inverting, they are not so interested in what the ramifications of that fact are, beyond the idea that all the channels get the same treatment.
IOW the unit may well be phase inverting from input to output; nothing guarantees otherwise just because pin 2 of the XLR is indeed non-inverting relative to the input. It can only be assumed that the relationship will be the same with *all* the inputs. It's a tricky nuance!
Some equipment uses a modification of the original balanced standard, in which pin 2 is non-inverting. Sometime in the 70s or 80s, European equipment went to pin 3 non-inverting. This practice has shown up in some Japanese equipment as well. This stuff is all over the industry! Unless someone has taken the time to make special cables that convert from the pin 2 convention to the pin 3 convention, the result is there is simply no way to know what is up.
Since it is reasonable at this date to assume that this equipment is peppered everywhere through the industry, it's very safe to assume that 50% of all recordings are out of phase and the other half are in phase.
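Atmasphere's 50% conclusion can be illustrated with a quick Monte Carlo sketch (illustrative only): give each piece of gear in the chain an independent coin-flip chance of inverting, and look at the parity of the total number of inversions.

```python
import random

# Simulate many recordings, each passing through several pieces of gear.
# Each piece independently may or may not invert polarity (coin flip),
# modeling the point that nobody in the chain keeps track of it.
random.seed(0)

def final_inverted(n_stages):
    """True if an odd number of stages inverted (net polarity flipped)."""
    inversions = sum(random.random() < 0.5 for _ in range(n_stages))
    return inversions % 2 == 1

trials = 100_000
inverted = sum(final_inverted(5) for _ in range(trials))
print(f"{inverted / trials:.2%} of simulated recordings end up inverted")
```

Notably, even if the per-stage inversion probability were not exactly 0.5, the parity of the chain still converges toward 50/50 as more potentially inverting stages are added, which is the nub of the argument.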
Having been a recording engineer for many years, we very carefully checked all wiring of microphones and consoles to be phase correct. But we are really talking here about keeping relative phase consistent. The absolute phase of the end product can be reversed and will in all likelihood be undetectable. Remember, two mics wired for correct phase, when placed near each other, will produce out-of-phase information due to acoustical leakage. In mastering we used oscilloscopes to check for any severe out-of-phase information, particularly in the bass, for LP tracking issues. Relative phase and absolute phase are two different things.
Some equipment uses a modification of the original balanced standard, in which pin 2 is non-inverting. Sometime in the 70s or 80s, European equipment went to pin 3 non-inverting. This practice has shown up in some Japanese equipment as well. This stuff is all over the industry! Unless someone has taken the time to make special cables that convert from the pin 2 convention to the pin 3 convention, the result is there is simply no way to know what is up.

Atmasphere raises a very important point here, that is, the difference between the "American" (pin 2 hot) and "European" (pin 3 hot) XLR pinouts. The "American" pinout is the EIA/AES specified professional standard, but this is of course not at all consistent across different types of gear, i.e. microphones are virtually always wired with pin 2 hot.
I didn't mean to imply that recording and mastering engineers in general give any special attention to absolute phase, but as others have pointed out, relative phase is absolutely critical. So in practice, when outboard gear in the studio (i.e. a compressor, mic preamp, effects unit, etc.) is wired to the patchbay or console, any potential polarity reversals *should* be corrected. Otherwise, this can cause some really weird issues when the output is brought back into the console for mixdown, or routed to headphones during tracking.
So thorough attention to relative phase is necessary, and absolute phase much of the time just comes along for the ride. This is especially true given the ubiquity of Pro Tools . . . when one purchases a CD that's digitally recorded, mixed, and mastered, it's a pretty good bet that a positive pulse at the vast majority of input channels' (microphones') ADCs corresponds to a positive pulse on the output of your CD player.
Unfortunately, some CD players invert polarity and some don't. Some other electronic components invert polarity and some don't. Some speakers have improperly wired driver(s). And cabling is sometimes connected incorrectly, L to R.
Thus, even if a given CD has "correct polarity," the resulting sound can be either correct or reversed polarity. It all depends. Just one more reason why, for a given disc, there's a 50% chance the sound out of the speakers will have polarity reversed.
The best laid plans of mice and men.....
Geoffkait, which cd players invert? Why? What speakers are improperly wired? I know some inexpensive speakers invert the midrange driver.
Why would hearing a difference not be the ultimate test?
With all the testimony that recording engineers do pay attention to polarity, why would your chances be 50/50 of having reversed polarity, and why would this matter in seeking to find the proper polarity?
Tbg - dunno how many or which players invert polarity. I dunno why either, but I don't think they do it on purpose. Maybe they are like recording engineers and just don't think about it.
Some speakers must be made on Monday, since occasionally you find a speaker with one or more drivers miswired. I strongly suggest checking for correct polarity of drivers, especially if you exhibit any equipment at a show.
"All the testimony regarding recording engineers?" You mean the same meticulous recording engineers that compress the sound?
Tbg - dunno how many or which players invert polarity. I dunno why either, but I don't think they do it on purpose. Maybe they are like recording engineers and just don't think about it.

There are three reasons why a consumer audio product would invert phase:
1. Ambiguous or no standardization on hookup of balanced interconnections (i.e. the XLR pinout conventions that Atmasphere mentioned)
2. Error in design, manufacturing, or nomenclature
3. Preference for circuit design topologies/combinations for which inversion is a side-effect. An example would be a two-tube line preamp where the first stage is a plate-loaded voltage amp, and the second is a cathode follower . . . the signal will come out inverted, unless another stage is added to flip it back around. These types of circuit designs are increasingly eccentric and anachronistic in modern equipment . . . and pretty unusual even in the field of enthusiast audio where eccentricity and anachronism is widely respected.
Reason #3 is especially uncommon in CD players and DACs, given that the overwhelming majority of high-quality DAC chips have balanced current outputs and/or on-chip signal inversion capability . . . meaning that the designer can just as easily preserve signal polarity regardless of the design of the output stage. That's why I used a CD played from a Pro Tools recording/mastering process as my "good bet" example.
I have a Manley Steelhead which I run full tilt into a Lightspeed Attenuator and then into a pair of mono blocks.
I assume everything is phase correct.
Cd is run directly into the LSA and to the amps.
To my ears the cd sound is much fuller, which bothers me, because the lp sound is thinner by comparison.
I am thinking there may be a phase issue with the phono stage added to the mix. My memory is going, but I seem to remember somewhere in the past that when you have 3 stages, it puts the system out of phase and you need to switch at the speakers. In my case I will try reversing the leads on my cartridge, as it is less full and perhaps out of phase.
Any thoughts on this?
Lacee - I'm not able to answer your technical question about whether the number of stages causes some sort of phase inversion, but I can tell you, based on my experience running a Steelhead for a number of years, that:
1. It sounds better if power is always on to the power supply and you are simply switching off the active circuits using the standby switch on the main unit.
2. I preferred the sound of the unit through a separate active line stage, rather than running it with its own buffered volume control. Whether that's directly comparable to what you are doing by running the Steelhead fixed outputs into a passive line stage, I dunno, but I offer it for what it's worth. The unit sounded very 'clear' but a bit threadbare running it on its own (without the addition of the separate line stage, which added more meat - call it 'warmth,' 'richness' or 'euphonic coloration' - it had a less 'forced,' more relaxed sound). At the time I was running a Lamm L2, which may have been very complementary to the Steelhead, b/c the Lamm had a very lucid, if somewhat 'dark' sound and rolled off a bit at the frequency extremes.
3. Using a Lyra Titan i and later an Airtight PC1, I preferred the sound through the MM inputs run at 47k. The MC inputs sounded uneven across the spectrum. I did fiddle with the loading and other settings, but that's where I ended up.
4. I think the gain setting on the unit was at 55.
5. I found that tube rolling made a pretty significant difference - my preference being the NOS Tele equivalents and running other NOS for the 7044 slots. On that note, it may be worth changing out the tubes to see what difference it makes.
I did a fair amount of fooling around with the unit when I owned it - and I think that summarizes where I came out.
I see a black ground wire going in the bottom of the tonearm, but I don't know where it is connected to.
The black wire is connected to the aluminum arm wand by the screw that holds the arm wand onto the pivot assembly. At the arm board PCB it is connected to the green tonearm wire, the ground return of the red interconnect, as well as the ground wire. The blue wire is only connected to the ground return of the white interconnect and nothing else.
I hope this helps. If needed I can upload a photo of the arm board to my system page. Let me know.
I sort of figured things out.
Reading past reviews, all seemed to say that running the Steelhead in the fixed output mode into a preamp (LSA in my case) would be better sounding, so using it as a phono stage minus the volume control is what I've settled on also.
I was running it the other way, because you can't access the mono feature in fixed mode, and I occasionally listen to mono recordings.
Also, running the cartridge (Clearaudio Talisman V2 gold) through the MM stage and not the transformers in MC also has sonic merits. Experimenting with loading; so far 100 ohms sounds good, less cartridge ringing.
So now I am quite happy, thanks for the input.
Also, running the cartridge (Clearaudio Talisman V2 gold) through the MM stage and not the transformers in MC also has sonic merits. Experimenting with loading; so far 100 ohms sounds good, less cartridge ringing.
I guarantee that the load had no effect on ringing! If you look at the inductance of the cartridge, you will see that it is so slight that the load could not possibly affect it at audio frequencies.
The reason it is making a difference has to do with the apparent fact that the Steelhead is sensitive to Radio Frequency Interference (RFI) at the input of the phono. The RFI is generated by the resonance of the inductance of the cartridge, in parallel with the capacitance of the cable- the two form a tuned RF circuit. The energy of the cartridge sets this circuit into resonance- and that is the source of your RFI. By adding the load, you are detuning the RF circuit so it can't resonate.
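Two quick back-of-the-envelope numbers make Ralph's point concrete. The inductance and capacitance below are assumed ballpark figures for a low-output MC cartridge and a tonearm-plus-interconnect cable run, not measured values for the Talisman:

```python
import math

# Assumed illustrative values (not specs for any particular cartridge/cable):
L = 25e-6    # cartridge inductance: ~25 uH, typical for a low-output MC
C = 200e-12  # tonearm wiring + interconnect capacitance: ~200 pF

# Inductive reactance at the top of the audio band: X_L = 2*pi*f*L.
X_L = 2 * math.pi * 20_000 * L
print(f"X_L at 20 kHz: {X_L:.1f} ohms")  # only a few ohms

# Resonant frequency of the cartridge/cable tank: f = 1 / (2*pi*sqrt(L*C)).
f_res = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"LC resonance: {f_res / 1e6:.1f} MHz")  # well above the audio band
```

The reactance at the top of the audio band is only a few ohms, so a 100 ohm load cannot be rolling off the audio; the LC tank, however, resonates up in the megahertz range, which is exactly where RFI trouble lives, and the load resistor damps that resonance.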
Thanks Ralph for the clarification.
It's great to have the real reasons behind what we are hearing explained.
Noise is something that you never know you have until you do something that lessens it, hence I will be doing more experimentation with loading and capacitance even though it's an MC.
What I can say about my experience with loading and the Talisman cartridge was that 100 ohms was a bit rolled off but fuller sounding than running it at Clearaudio's suggested 300 ohms, at least with the gear I own.
And that was with low-capacitance Nordost Heimdall cables from the arm and Heimdall from the phono stage.
And things varied from recording to recording.
I'm not about to tally up what settings sound best for each lp I have, so I've settled for compromises - never the best, but unfortunately a fact.