I thought about using MOSFETs instead (despite the negative opinions some people have toward FETs in audio applications).
I saw an expert state that MOSFETs are fine in audio amplifiers as long as the power supply is tightly regulated. It may have been on Rod Elliott's website. (Take a look at the MOSFET amplifiers, a screen or two down the list.)
**broken link removed**
I then thought of replacing the diodes with resistors. Doing this seems to be a better solution, but I'm not sure whether there are drawbacks to biasing a class AB stage with resistors instead of diodes that I'm not seeing?
In essence, you add some amount of DC bias to your AC signal. The idea is to apply just enough DC bias current to eliminate crossover distortion.
To make things simple I used a bipolar power supply.
Your input signal needs to overcome the ~0.65 V base-emitter threshold of each transistor, so its peak-to-peak swing needs to be about 1.3 V greater than your desired output swing.
With careful adjustment of the pots you should be able to obtain satisfactory operation: the transistors stay on long enough to minimize crossover distortion, without too much wasted power. At that point the bases differ by about 1.3 V throughout the cycle, and the diodes appear to be unnecessary.
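The dead-zone reasoning above can be sketched numerically. This is a deliberately idealized model (not a circuit simulation): each transistor is assumed to conduct only once the input exceeds a fixed ~0.65 V base-emitter threshold, and `bias` stands in for the total DC offset between the two bases (e.g. ~1.3 V from two diode drops or the resistor divider). The function name and numbers are illustrative assumptions, not values from the original circuit.

```python
import math

VBE = 0.65  # assumed base-emitter threshold, volts (illustrative)

def follower_out(vin, bias=0.0):
    """Output of an idealized push-pull emitter follower.

    `bias` is the total DC voltage applied between the two bases;
    it shrinks the crossover dead zone from +/-VBE toward zero.
    """
    dead = max(VBE - bias / 2.0, 0.0)  # remaining dead zone per half-cycle
    if vin > dead:
        return vin - dead   # NPN side conducting
    if vin < -dead:
        return vin + dead   # PNP side conducting
    return 0.0              # crossover region: neither transistor conducts

# Unbiased class B: a 0.5 V input sits inside the dead zone.
print(follower_out(0.5))            # -> 0.0 (crossover distortion)
# With ~1.3 V between the bases the dead zone disappears.
print(follower_out(0.5, bias=1.3))  # -> 0.5
```

In this toy model, pushing `bias` past 1.3 V would make `dead` clamp at zero rather than model the over-biased (excess quiescent current) case, which matches the "wasted power" warning only qualitatively.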
Variations might include:
* a single polarity supply
* putting the NPN transistor at the lower side