First Tape Project by Bottlehead RS1500
classicrecordings:
Is there enough gain in the output of the head amp to drive power amps without having to go into a line stage?
David
docb:
The single ended output of the Tube repro is the -10dB standard for home audio systems and the balanced output is studio standard +4dB. The built in volume trim only affects the balanced output, i.e., there is no built in level control from the single ended output.
classicrecordings:
Thank you for the reply, but I read the same information in your previous post, which is why I asked the question.
I am not a technician, but to me the dB ratings are attenuation ratings without an output reference point. I would like to know whether the head amp will drive my present power amps to full output when they are connected directly to it. The only way I can figure this out is to know what the voltage output from the tape head amp is.
I hope I am asking this correctly.
Can you tell me:
* what the rated voltage output of the tape head is in mV?
* does this signal drive the tape head amp to full output?
* what is the single ended output in mV of the head amp at that rating?
* what is the balanced output in mV of the head amp at that rating?
And lastly, can the output trim pots be rewired to operate the single ended outputs?
Thanks in advance for your patience.
David
http://www.classicrecordings.on.ca/
docb:
Hi David,
Basically what you are asking is if the tape repro amp can also be used as a control amp (line stage) for consumer gear through the single ended output. The simple answer is, that's not really what it is designed for.
There are indeed reference points for these dB values. The equivalent voltage of +4dBu is 1.23 Vrms into an open load. This is the theoretical ideal output level you will measure at the balanced output when a VU meter placed across it reads 0dB on a signal from tape. So of course the preamp will put out a maximum signal even higher than that on a tape with, say, +3 VU peaks.
The -10dB figure is actually -10dBV rather than dBu. This works out to 11.8 dB lower than +4dBu, or .316 Vrms. This .316 Vrms is what you would measure at the single ended output when a VU meter connected across the +4dBu balanced output reads 0dB. I know, all these different forms of dB units are very confusing!
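The conversions above can be checked numerically. A quick sketch, using the standard reference voltages (0 dBu = 0.7746 Vrms, i.e. 1 mW into 600 ohms, and 0 dBV = 1 Vrms):

```python
import math

def dbu_to_vrms(dbu):
    """0 dBu is defined as 0.7746 Vrms (1 mW into 600 ohms)."""
    return 0.7746 * 10 ** (dbu / 20)

def dbv_to_vrms(dbv):
    """0 dBV is defined as 1 Vrms."""
    return 1.0 * 10 ** (dbv / 20)

balanced = dbu_to_vrms(4)        # ~1.23 Vrms at +4 dBu (balanced output)
single_ended = dbv_to_vrms(-10)  # ~0.316 Vrms at -10 dBV (single ended output)

# Difference between the two standards, ~11.8 dB
diff_db = 20 * math.log10(balanced / single_ended)
```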
Now you have two rms voltages to work from to determine whether they will drive your amp to maximum output. Next you need to find out what the input sensitivity of your amp is. If it's a solid state consumer type amp with single ended inputs the sensitivity is going to be such that you will get maximum output from the amp at around .316Vrms input level - or probably even lower signal levels than that since a lot of SS amps are crazy sensitive. If it's something like our Bottlehead tube amps the input sensitivity might be a bit less, like max output requiring about .7Vrms input - which would still be quite reasonable connected to a -10dBV source.
If you are using an amp with a balanced input, in theory it should be set up to easily reach maximum output from a +4dBu input signal and everything should match up nicely. In practice you may find that some amps that have both balanced and single ended inputs are super sensitive at the balanced input and would require a fair amount of attenuation of the input signal to keep from overdriving the amp. This can be accomplished with the trim pots on the repro amp, but it would be rather awkward if you play tapes of different levels, as the trimmers are designed to be adjusted through the front panel with a screwdriver so they don't get knocked around once set.
The repro amp circuit is direct coupled between the first two stages ahead of the single ended output, so it is not possible to insert an attenuator ahead of the single ended output stage. That leaves the possibility of an attenuator after the SE output but ahead of the amp. Because of the relatively high output impedance of the SE stage, such an attenuator really should be installed at the line amp (or power amp) end of the interconnecting cable, not at the SE output of the preamp. That interconnect is best kept short, and the impedance of the attenuator should be at least 15K ohms.
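To illustrate the kind of attenuator being described, here is a hypothetical two-resistor divider calculation. The 10 dB target and 15K total impedance are my own example values chosen to satisfy the constraint above, not part of the kit, and the amp's input impedance is ignored for simplicity:

```python
import math

def divider(atten_db, z_total):
    """Series R1 + shunt R2 voltage divider.

    atten_db: desired attenuation in dB
    z_total:  R1 + R2, the impedance the divider presents to the source
              (the amp's input impedance in parallel with R2 is ignored here)
    """
    ratio = 10 ** (-atten_db / 20)  # Vout/Vin
    r2 = z_total * ratio
    r1 = z_total - r2
    return r1, r2

# 10 dB pad presenting 15K ohms to the SE output
r1, r2 = divider(10, 15000)
```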
PJ:
Just a quick note here on signal levels, a most confusing subject and a pet peeve of mine.
Conventional wisdom is that magnetic tape has a headroom of about 14dB, which is a factor of five times the voltage. So if a recording actually pushes the limits of the tape, you can expect peaks of five times the nominal, "zero-VU" level. That means about 1.6 volts RMS peaks from the consumer output (RCA jack), and about 6 volts RMS from the balanced XLR output.
(Note that digital media are usually specified relative to a full-scale signal, i.e. with zero dB of headroom. The nominal CD player output of 2.0 volts RMS full-scale is very similar to the peak 1.6 volts RMS mentioned above. I find this difference in references the most frustrating and crazy-making aspect!)
The above assumes the tape is in fact calibrated for this traditional signal level (usually expressed in nanowebers per meter) that corresponds to 14dB headroom, which varies with tape formulation, equalization, and frequency. It also assumes the full 14dB headroom was used in the way the tape was mastered. Perhaps Paul S. will comment on these aspects for the Tape Project tapes?
Bottom line - there is enough level to drive most power amps, but you still need a level control external to the head amp, either a line stage or a power amp with built-in level control.
By the way, I have been working on a white paper on this subject, for Bottlehead. I'd be glad to email a copy of the current draft to anyone interested.