But we can like the MOSFET's comparative ease of biasing and the negative temperature coefficient of the best MOSFETs, i.e. the lateral MOSFETs once made by Hitachi and now made by Magnatek and others. Here is where MOSFETs show a clear advantage over bipolar transistors, as biasing difficulties and smoking output stages limit one's enthusiasm for the transistor.

When correctly biased, a transistor amplifier will exhibit a lower distortion figure than a comparable MOSFET amplifier. But herein lies the rub: virtually all transistor output stages run in a lean Class-B, which makes them extremely sensitive to incorrect biasing; too little or too much bias current leads to increased distortion. So while it is easier to produce a low-distortion transistor amplifier, it is more difficult to maintain its low distortion. The MOSFET, on the other hand, is less sensitive to shifts in the bias voltages, as it has a much lower transconductance. Furthermore, MOSFET output stages always require a much higher quiescent idle current, which increases the percentage of Class-A operation in the Class-AB mix, and that is all to the good, as the first watt of power is sonically the most important watt.

Add to these benefits the advantage of virtually no thermal runaway, and the MOSFET appears to be the clear winner. But if the output stage operates in a pure Class-A mode, then the transistor may prove the better choice, as a DC servo loop can both maintain the correct bias voltages and counteract thermal runaway. Or if sufficient emitter resistance is added to lower the transistor's effective transconductance to the level of a MOSFET's, the transistor will certainly be more linear. (I would love to build an amplifier that used all three devices in parallel in the output stage: tube, MOSFET, and bipolar transistor.)
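To put rough numbers on that transconductance argument (a back-of-the-envelope sketch: the 1-A idle current, the room-temperature thermal voltage of 26 mV, and the 1-ohm emitter resistor are illustrative assumptions, and the lateral MOSFET's transconductance is taken as roughly 1 siemens):

\[ g_m \approx \frac{I_C}{V_T} = \frac{1\,\mathrm{A}}{26\,\mathrm{mV}} \approx 38\,\mathrm{S} \quad \text{(bipolar transistor at 1 A)} \]

\[ g_{m,\mathrm{eff}} = \frac{g_m}{1 + g_m R_E} = \frac{38\,\mathrm{S}}{1 + (38\,\mathrm{S})(1\,\Omega)} \approx 0.97\,\mathrm{S} \]

In other words, about one ohm of emitter resistance degenerates the transistor down to lateral-MOSFET transconductance; the price is roughly a volt dropped across the resistor at that idle current, which is why practical output stages usually settle for a fraction of an ohm and only partial degeneration.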