Running dCS Vivaldi DIRECT?


Hey Folks,

Anybody out there care to comment on running the Vivaldi DAC direct to the power amp?

Please compare with running through your favorite preamp and elucidate the differences.

Thanks & keep enjoying our hobby!

Why match? To avoid having to remember to add or cut 20-30 steps on the volume control, or risk playing way too loud. As it stands, the two inputs use the same range, which is what I want. Also, we want to avoid at all costs using any gain at all in the preamp, and reducing to 0.2V would mean needing to use almost the full gain stage (12dB) to get to the 0.7V rated output (not that I of course drive the amps that high, but peaks may), while 600mV is much closer to the desired 0.7V output sans gain ... make sense?
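For anyone who wants to check the arithmetic, here's a quick back-of-envelope sketch in Python (the 0.75V amp sensitivity is the figure from my later post; treat the exact numbers as illustrative):

import math

def gain_db(v_out, v_in):
    # voltage gain (dB) needed to get from v_in to v_out
    return 20 * math.log10(v_out / v_in)

amp_sensitivity = 0.75   # V, rated input sensitivity of the power amps
print(round(gain_db(amp_sensitivity, 0.2), 1))   # ~11.5 dB -> essentially the preamp's full 12 dB stage
print(round(gain_db(amp_sensitivity, 0.6), 1))   # ~1.9 dB  -> barely any gain needed from the 0.6 V setting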

In other words, the volume control setting and cuts I describe represent the average; the peaks may be close to unity.


Plus I should add that the test tone I used for level setting is at -20dBFS, so, if you like, the peak you are actually hearing with the 14dB cut is +6dB.

This all gets confusing, which is why I always keep my gain cascade spreadsheet handy and check all the levels, so that I get as close as possible to 0dBFS at the power amp's input sensitivity at the level of preamp gain I prefer.
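For what it's worth, the spreadsheet is nothing more exotic than this kind of arithmetic (a rough Python sketch; the 0.6V full-scale setting, the -20dBFS tone and the -14dB cut are from my posts above, everything else is illustrative):

import math

def db_to_ratio(db):
    return 10 ** (db / 20)

dac_full_scale = 0.6      # V at 0dBFS (Vivaldi 0.6 V output setting)
tone_dbfs      = -20.0    # calibration tone level
cut_db         = -14.0    # one of my preferred volume settings

tone_at_cut   = dac_full_scale * db_to_ratio(tone_dbfs + cut_db)   # ~12 mV at the amp
peak_at_cut   = dac_full_scale * db_to_ratio(0.0 + cut_db)         # ~120 mV at the amp
tone_at_unity = dac_full_scale * db_to_ratio(tone_dbfs)            # ~60 mV reference

print(f"tone with cut: {tone_at_cut*1000:.0f} mV")
print(f"peak with cut: {peak_at_cut*1000:.0f} mV")
print(f"peak vs. tone-at-unity: {20*math.log10(peak_at_cut/tone_at_unity):+.0f} dB")   # the +6 dB above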

If you're attenuating between 9.5dB and 14dB on the Vivaldi with the 0.6V output setting going direct, then running 0.2V into the preamp won't need its 12dB gain at all.

Using your preamp, you first attenuate the signal from the source and then amplify it back to the level required for your listening anyway, since you're forced to apply the preamp gain to the signal no matter what. (The gain stage in your ARC preamp is fixed, not variable like in Ayre's preamps, btw.)

If you have a TT you need a preamp anyway, and what output setting you prefer on the Vivaldi through the preamp is, for various reasons, irrelevant to the discussion; the discussion is about the differences between DAC direct and preamp, and I think the Vivaldi direct should be used in its optimal configuration into the amps in question for the result to be as meaningful as possible.

My take is that if you wouldn't prefer the Vivaldi direct on its 0.2V output setting to the preamp route, you don't really like how the DAC sounds and prefer the coloration/enhancement that the preamp gives you.
I’d try the 0.2V output setting on the Vivaldi if you attenuated between 9.5dB and 14dB; this way you’d be able to run the volume control to max, with no more than 5dB of attenuation.
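To put numbers on that: moving the output setting from 0.6V down to 0.2V is itself a fixed drop of about 9.5dB, so the 9.5-14dB of attenuation shifts to roughly 0-4.5dB. A quick check (Python, just plugging in the two settings):

import math

setting_drop = 20 * math.log10(0.6 / 0.2)        # ~9.54 dB from the output setting change alone
for atten in (9.5, 14.0):                        # current attenuation range at the 0.6 V setting
    print(f"{atten} dB at 0.6 V -> {atten - setting_drop:+.1f} dB at 0.2 V")
# prints roughly 0 dB and 4.5 dB: volume near max, never more than ~5 dB of cut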

Yes, seigen has the right idea; this way you run minimal risk of "bit stripping". Wadia and ML know about it.

Here is what Wadia says about their digital-domain volume control products; Mark Levinson has the same instructions with theirs.
They both have gain-setting links on their analogue output buffers, so the level is preset to allow the source's digital volume control to work at near full volume with no "bit stripping": https://ibb.co/kc4OCo
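For anyone wondering what "bit stripping" means in practice, here is a toy Python illustration (purely conceptual; it says nothing about how Wadia, ML or dCS actually implement their volume controls):

import math

bit_depth  = 16
full_scale = 2 ** (bit_depth - 1) - 1            # 32767 for 16-bit audio

# Attenuating in the digital domain pushes the signal down toward the
# least-significant bits, so heavy cuts throw real resolution away.
for cut_db in (0, 14, 30, 48):
    peak = int(full_scale * 10 ** (-cut_db / 20))
    usable_bits = math.floor(math.log2(peak)) + 1
    print(f"{cut_db:2d} dB cut -> peak sample {peak:5d} -> ~{usable_bits} usable bits")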

Also a very true and correct statement by seigen with a band-aid fix at the end.
My take is that if you wouldn’t prefer the Vivaldi direct on its 0.2V output setting to the preamp route, you don’t really like how the DAC sounds and prefer the coloration/enhancement that the preamp gives you.


Cheers George
Will you guys please get off your hobby horses and do some simple maths.

My amps' rated sensitivity is 750mV. The 600mV (0dBFS) output of the DAC preserves as much of the potential dynamic range in the signal as possible without needing either attenuation (acceptable but undesirable) or amplification (always and in every case undesirable).

When using a test tone at -20dBFS, my preferred volume settings were in the range of -9 to -14dB. Now, no one listens to test tones; we listen to music with dynamic range. According to the dynamic range database, the peak-to-median level for the Stacey Kent track I use is 14dB (which, by the way, I play at the -9dB setting). If we assume (which is wrong) that the test tone represents the median (it doesn't, as test tones tend to need to be played lower than music), then the setup I have corresponds to the peaks almost exactly hitting unity gain.
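Spelling out that last step (rough numbers only; the 14dB peak-to-median figure is from the dynamic range database, the rest is from my posts above):

import math

dac_full_scale  = 0.6     # V at 0dBFS on the Vivaldi
amp_sensitivity = 0.75    # V, rated input sensitivity of the amps
peak_to_median  = 14.0    # dB, Stacey Kent track per the DR database

# If (wrongly) the test tone stood in for the median, peaks sit 14 dB above it,
# and the 9-14 dB of cut cancels most or all of that: during peaks the volume
# control is effectively at or near unity.
for cut in (-9.0, -14.0):
    print(f"cut {cut:+.0f} dB -> peaks land {peak_to_median + cut:+.0f} dB vs. the median at unity")

# And 0dBFS out of the DAC is already within ~2 dB of the amps' sensitivity:
print(f"{20 * math.log10(dac_full_scale / amp_sensitivity):.1f} dB")   # ~-1.9 dB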

I know you people believe that in all circumstances a preamp is bad, but none of you have offered any explanation as to why the direct connection appeared to limit dynamic range and cause the peaks in the tracks (especially the vocals) to harden.

PS Interesting that only now do you talk about bit stripping ... of course by using the dCS at 0dB this problem is avoided, plus I understand that the implementation of the Vivaldi DAC avoids this issue ...