#1 2015-06-11 20:35:32

New member
Registered: 2012-10-05
Posts: 3

Biasing a '70 SVT


Crossover distortion after biasing to factory specs.

In my shop is a 1970 SVT in for a re-tube and checkup. It's an old 6146B head that was converted to 6550s eons ago (presumably successfully...the previous owner was a touring pro). The tubes in it were a mishmash of Russian, Chinese and one old Tung Sol. I'm installing a matched sextet of Russian "TungSols".

The problem is that when I use the front panel sense jacks and adjust each bank for the specified 75mV TOTAL cathode voltage, I get TONS of crossover distortion and only 100W at clip. I understand that 75mV was also the figure used in the early-70s 6550-based SVTs, and those also used a shared 1 ohm sense resistor.

FWIW, the tubes are fairly well matched, as confirmed by measuring the voltage drop across the 10 ohm plate resistors.
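In case it helps anyone doing the same matching check, here's a rough sketch of the arithmetic for those 10 ohm plate resistors. The resistor value is from the amp; the sample millivolt readings are made up for illustration:

```python
# Convert drops across the 10 ohm plate resistors to plate current
# and estimate how well the bank is matched.
# NOTE: the example readings below are hypothetical, not from the amp.

PLATE_R_OHMS = 10.0

def plate_ma(drop_mv):
    """mV across a 10 ohm resistor -> plate current in mA."""
    return drop_mv / PLATE_R_OHMS

def match_spread_pct(drops_mv):
    """Spread between hottest and coldest tube, as % of the average."""
    currents = [plate_ma(d) for d in drops_mv]
    avg = sum(currents) / len(currents)
    return 100.0 * (max(currents) - min(currents)) / avg

# Hypothetical bank of three tubes:
readings = [248.0, 252.0, 250.0]          # mV across each 10 ohm resistor
print([plate_ma(d) for d in readings])    # per-tube plate current in mA
print(match_spread_pct(readings))         # spread as a percentage
```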

Is 75mV REALLY a realistic value?  That translates to a mere 25mA idle current per tube.  Seems pretty low for a 6550 (shouldn't it be more like 35-40mA at 680V?)
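For anyone following the math, here's how I get from the 75mV spec to 25mA/tube, plus the idle dissipation that implies. The 680V plate voltage is approximate, and the 42W max plate dissipation is the commonly published 6550 rating, so treat these numbers as ballpark:

```python
# Bias arithmetic for a shared 1 ohm cathode sense resistor.
# ASSUMPTIONS (ballpark, not measurements): 680 V on the plates,
# 42 W max plate dissipation for a 6550, three tubes per bank.

SENSE_R_OHMS = 1.0    # shared cathode sense resistor per bank
TUBES_PER_BANK = 3    # sextet = two banks of three
PLATE_V = 680.0       # approximate plate voltage
PA_MAX_W = 42.0       # published 6550 max plate dissipation

def per_tube_idle_ma(sense_mv):
    """Total bank current through the 1 ohm resistor, split per tube."""
    bank_ma = sense_mv / SENSE_R_OHMS   # mV across 1 ohm = mA
    return bank_ma / TUBES_PER_BANK

def dissipation_pct(idle_ma):
    """Idle plate dissipation as a percentage of the tube's maximum."""
    watts = (idle_ma / 1000.0) * PLATE_V
    return 100.0 * watts / PA_MAX_W

print(per_tube_idle_ma(75.0))     # 25.0 mA/tube at the 75 mV spec
print(dissipation_pct(25.0))      # roughly 40% of max dissipation
```

At 25mA that's about 17W per tube, around 40% of max dissipation, which is a pretty cold bias by the usual rules of thumb.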

BTW, if I up it to 34mA/tube, I still see plenty of crossover distortion.  I don't think it's necessarily my tubes; I saw the crossover distortion with the previous tired set as well.

Caps are old but I'm not seeing ripple at clipping, and when I crank it up to clipping, plate voltage isn't really sagging more than 5%, so filter leakage probably isn't the issue (although those caps are probably gonna be replaced anyway).  The 12BH7 drivers are testing well on my TV7.

Do I just ignore the 75mV spec on the sense jacks and go for minimal crossover?  I certainly don't want to fry any iron!

Thanks in advance.
