Best Preamp = No Preamp?


I'm currently shopping for a DAC. I'm looking at the Benchmark DAC1, Bel Canto DAC3, Slim Devices Transporter, etc.

I noticed most of these newer high-performance DACs have a built-in volume control with remote.

I'm thinking that I can connect one of these DACs directly to my power amp, skipping the preamp.

Is that the right thinking? Why go through an additional piece of equipment when I can avoid it? Is anybody doing it that way?

What'll be the pros and cons?

eandylee
Most SS sources (excluding tube ones) have output stages that can equal, and sometimes better, many preamp output stages, especially tube ones.
So the myth that a preamp can drive the interconnects to the amp better is a "furphy" started by preamp manufacturers, and Ohm's Law will prove that time and time again.
The only time you may like a preamp in the way is if you prefer the colouration it gives, at a cost of transparency.

The second statement in the above post is not supported by the first. The reason to go with a preamp, or to go without one for that matter, is to get rid of coloration.

So which is it?

The problem is *not* whether a preamp can do a better job driving an interconnect cable than a source can; the problem is whether the source can control the interconnect cable in such a way that the cable *after* the passive volume control (the one that connects to the power amp) won't contribute artifact.

If you have ever wondered why a passive only sounds right with the volume all the way up, this is why. The source can control the interconnects within its abilities if the passive is turned all the way up. As you turn it down, the source impedance is put in series with the volume control impedance (Ohm's Law) and the various construction foibles of the cable after the passive can then come into play. Generally this results in less bass impact and a sense of lacking dynamics. Due to the increased impedance the resulting sound is at the mercy of the interconnect cable.
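As a rough sketch of the Ohm's Law point above (the pot value, source impedance, and wiper positions below are illustrative assumptions, not measurements from any product), the output impedance a passive attenuator presents to the cable can be modeled as the two legs of the pot in parallel, with the source impedance in series with the upper leg:

```python
import math

def passive_output_impedance(r_source, r_pot, wiper_fraction):
    """Impedance looking back into a passive pot attenuator.

    wiper_fraction: 1.0 = volume all the way up, smaller = more attenuation.
    The upper leg (source impedance plus the top of the pot) appears
    in parallel with the lower leg of the pot.
    """
    r_top = r_source + r_pot * (1.0 - wiper_fraction)
    r_bottom = r_pot * wiper_fraction
    if r_bottom == 0.0:
        return 0.0
    return (r_top * r_bottom) / (r_top + r_bottom)

# Hypothetical example: 200-ohm source driving a 10k pot
for frac in (1.0, 0.5, 0.25):
    z = passive_output_impedance(200.0, 10_000.0, frac)
    print(f"wiper at {frac:.2f}: Zout = {z:,.0f} ohms")
```

With the volume all the way up, the cable still sees roughly the source's own 200 ohms; turned partway down, the impedance driving the cable swings up by an order of magnitude (worst case near the midpoint, roughly a quarter of the pot's value plus source contribution), which is exactly why the cable after the passive starts to matter.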

Those successful with using passives will instantly know what I am talking about- as they have found that the choice of cable is pretty important!

For this reason it is better to have a low source impedance driving the cables. This reduces the effect of the cable as the source impedance is a shunt across the resistive/capacitive/inductive effects of the cable and thus reduces their influence. Now if your source has a good quality volume control then you are all set- but if it doesn't, you will need a good volume control to do the job and to do it right, one that is buffered from the interconnect cable such that the volume control's resistive value can't interact with the cable (and the input impedance of the amp).

Another solution is to have the volume control built into the power amplifier.

This is all just Ohm's Law so far.

Now if you happen to have more than one source you have a bit of a problem without a preamp. Some DACs solve the problem by having multiple analog inputs, but if you have been in this sport for any length of time you know how quickly digital technology is changing. So a preamp with a proper volume control and a low output impedance (so it can minimize the effects of your interconnect cable) can be a good investment, and quite often one that reduces coloration rather than increasing it, on account of the reasons listed above, all having to do with Ohm's Law. Unlike human laws, Ohm's Law cannot be violated.

Historically the idea that the interconnect cables can be controlled comes from the recording industry and industrial electronics. It has only been in the world of home audio, where higher impedances tend to be the name of the game, that audible cable differences have cropped up (again, Ohm's Law) due to higher input impedances in amplifiers and often higher output impedances in source components. It should be no surprise that in such situations the addition of a passive control between the source and the amplifier will result in coloration. The way the recording industry got rid of such colorations (and BTW they often deal with very long interconnections, so cable coloration can be a huge problem) is by operating with low impedances, often 1000 ohms or less (600 ohms was for a very long time a standard for line level connections; very few high end audio sources, or preamps for that matter, can drive 600 ohms, although there are a few that can).
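To put rough numbers on why the studios ran low impedances over long cables, here is a minimal sketch using a simple first-order RC model (the ~100 pF/m cable capacitance and the run length are assumed ballpark figures, not data from any specific recording):

```python
import math

def corner_hz(source_ohms, cable_pf_per_m, length_m):
    """-3 dB low-pass corner formed by the driving impedance and
    the total cable capacitance (first-order RC approximation)."""
    c_total = cable_pf_per_m * 1e-12 * length_m
    return 1.0 / (2.0 * math.pi * source_ohms * c_total)

# Roughly 60 m (about 200 feet) of cable at an assumed ~100 pF/m:
for z in (100, 600, 10_000):
    print(f"{z:>6}-ohm source: corner ~ {corner_hz(z, 100.0, 60.0):,.0f} Hz")
```

Under these assumptions, a 600-ohm source keeps the rolloff corner well above the audio band even over a 200-foot run, while a 10k source impedance (easy to hit with a passive control turned down) pulls the corner down to a few kHz, squarely audible. The exact numbers depend on the cable, but the Ohm's Law trend is the point.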

We all listen to LPs and CDs generated by this process; we thus all know that the recording industry's technique works. It's not exclusive to the studio; the same technique works in the home as well.
As the OP's question asks: "Best Preamp = No Preamp?"
"I'm thinking that I can connect these DACs directly to my Power Amp skipping preamp."

Like I said, and I'll say it again: most SS sources in direct connection to the amp(s) these days will drive the interconnects and amp(s) as well as, if not better than, some active preamps, especially tube preamps.
Today's SS sources have output buffers with impedances low enough, and voltage output high enough, to drive any interconnect and amp. It's Ohm's Law.
And then yes, "the best preamp is no preamp".

And not only do you get rid of the colourations of a preamp in the signal path, but also an extra set of interconnects.

Cheers George
George, did you know our preamp can drive 32 ohm headphones directly? My speakers at home are 16 ohms and it can drive them too. Yet it is a tube preamp. The line stage is a miniature power amplifier. It has a low output impedance that is the same at 10Hz as it is at 1000Hz. It is this low output impedance that allows it to control interconnect cable artifact, and is why it is more transparent than passive controls.

During the golden age of stereo, which occurred from about 1958 to about 1963, many of the best jazz and classical recordings were made. Yet at that time, there were no high end audio cables made; there was no high end cable industry at all until Robert Fulton offered his first cables in 1977. Yet these recordings are amazingly transparent. This despite the fact that the interconnect cables were often over 200 feet long! It was the design of the tube electronics involved in the recordings that prevented the cables from imposing artifact.

Such would have been impossible with a passive volume control.
Just to add a different perspective, if I am not mistaken the 600 ohm standard was originated by the telephone industry (good old Ma Bell) back in the day and adopted by the professional audio industry. Seems the phone company knew the benefit of this as it applied to long cable runs as well.

I'll say it again,
Most solid state sources these days have very low impedance output buffers that can drive anything, and are as good as if not better than many preamps, without the colouration, especially tube preamps.
A preamp that can drive 32 ohms? That does not make it a necessity in the signal path; keep it as a headphone amp. Most solid state sources will drive anything they see today.

Cheers George
A lot of the best recordings made in the "golden age" were so in large part due to good engineers using better recording techniques, with little concern about dumbing down the sound to the lowest commercial denominator.
Actually the reason the records produced during the Golden Age sound so freaking good is because they were still recording and mastering with tube electronics, including tube microphones. Hel-loo! It was in the seventies when the industry switched over to solid state that everything went to hell.
The golden age was late fifties right after high fidelity stereo debuted and was a brand new big deal. Like most things it's mostly taken for granted these days. When is the last time anyone ever referred to high fidelity sound? Oh yeah now it's high end and uber expensive. Fact is most audio people listen to today is quite hifi at various price points whereas it took something really special to make it happen with the technology available to consumers in those ancient times.
^One could argue that many engineers go out of their way to make many of today's pop recordings intentionally bad.
When Mercury Records recorded at Northrup Auditorium in Minneapolis, to do so they parked their recording truck behind the building and ran about 200 feet of cable from the mics to the recorders mounted in the truck.

You can read about their truck if you have some Mercury recordings with the original inner sleeves. The reason they were able to do this without loss of fidelity had to do with the balanced line system as I have described.

Certainly the tubes helped the situation, but one can produce excellent recordings using solid state using the same techniques- and in the same way the cables will contribute no artifact.

Cable artifact is a phenomenon of home audio systems, because instead of using known, established engineering technique to solve the problem, most audiophiles simply throw money at it, buying more expensive interconnects and listening to the differences between them to try to find one that works. But what if you had a system wherein the cable really didn't make a difference and instead always sounded like the best cable made (or even no cable at all)? Would that be interesting? That system has been around for over 60 years...
That's what audiophiles do.

Doing it like the pros makes too much sense. It would take all the fun out of things.