Why do audio op amps require such high rail voltages?

jolin
02-21-2017, 10:12 PM
This might be a silly question, but I haven't been able to find it addressed directly anywhere on the internet yet. I also have a handful of related questions in-line, which I hope don't stray too far off-topic.
In pro gear, line-level audio signals are only about 3.5 V peak-to-peak, so why do audio circuits routinely require or recommend rail voltages of +/-12 V or higher?
Is this purely a headroom thing, or do op amp non-linearities also depend on the supply voltage?
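To put rough numbers on that (my own back-of-the-envelope arithmetic, assuming a +4 dBu nominal pro line level and 0 dBu = 0.775 Vrms):

import math

# Assumed pro line level: +4 dBu nominal (0 dBu = 0.775 Vrms)
v_rms = 0.775 * 10 ** (4 / 20)          # ~1.23 Vrms
v_pp = 2 * math.sqrt(2) * v_rms         # ~3.47 V peak-to-peak

# On +/-12 V rails, with an op amp that swings to ~90% of the rail,
# the available output window is roughly:
v_pp_avail = 2 * 0.9 * 12               # ~21.6 V peak-to-peak

headroom_db = 20 * math.log10(v_pp_avail / v_pp)
print(f"nominal line level: {v_pp:.2f} Vpp")
print(f"available swing:    {v_pp_avail:.1f} Vpp")
print(f"headroom: ~{headroom_db:.0f} dB")   # roughly 16 dB

So +/-12 V rails would leave roughly 15-16 dB of headroom above nominal line level, if my arithmetic is right, which is part of what I'm asking about.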
Or is it to support cheaper components? Looking at the TL072 datasheet, the maximum output voltage can drop to about 2/3 of the rail when the load resistance gets low (2 kOhm), but is typically around 90% of the rail with a 10 kOhm load. Then again, you could also use a higher-end op amp that's rail-to-rail, couldn't you?
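For concreteness, on +/-12 V rails the TL072 figures I'm reading would work out to roughly this (my numbers, not the datasheet's exact specs):

# Rough TL072 output swing vs. load on +/-12 V rails, using the fractions above
v_rail = 12.0
print(f"into 2 kOhm  (~2/3 of rail): ~{(2 / 3) * v_rail:.1f} V peak")  # ~8.0 V
print(f"into 10 kOhm (~90% of rail): ~{0.9 * v_rail:.1f} V peak")      # ~10.8 V

Either way that's far more swing than the ~1.75 V peak a line-level signal actually needs, which is why I wonder whether headroom is the whole story.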
The main thing prompting this question is the datasheet for the Cirrus CS4272 (http://www.kynix.com/uploadfiles/pdf0125/CS4272-CZZ.pdf) and the schematics/data for its evaluation board. Even though the ADC operates from 0 V to 5 V, they still opt for a bipolar +/-18 V supply on the input buffer. In that particular example they use the NE5532D8, which has a worst-case output swing of 80% of the rail and supports rails as low as +/-3 V.
So why use +/-18 V supplies if the ADC only accepts 0-5 V audio (presumably biased around 2.5 V), when a +/-3 V supply would still easily accommodate the 3.5 V peak-to-peak range?
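Checking my own claim against the NE5532's worst-case 80%-of-rail swing (my arithmetic, assuming the buffer output is AC-coupled and therefore centered around 0 V):

# A 3.5 V peak-to-peak line-level signal needs +/-1.75 V of swing.
v_peak_needed = 3.5 / 2                 # 1.75 V peak

# Worst-case NE5532 swing of ~80% of the rail:
swing_3v = 0.8 * 3.0                    # ~2.4 V peak on +/-3 V rails
swing_18v = 0.8 * 18.0                  # ~14.4 V peak on +/-18 V rails

print(f"peak swing needed: {v_peak_needed:.2f} V")
print(f"available on +/-3 V rails:  ~{swing_3v:.1f} V")
print(f"available on +/-18 V rails: ~{swing_18v:.1f} V")

So even +/-3 V rails look like they'd cover a line-level signal with room to spare, unless I'm missing something.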
According to the datasheet, there is also no scaling (gain or attenuation) happening in this circuit:

XLR connectors supply the CS4272 analog inputs through unity gain, AC-coupled differential circuits. A 2 Vrms differential signal will drive the CS4272 inputs to full scale.

So any signal above line level would wind up getting clipped by the ADC anyway. Is it better for the clipping to happen in the ADC rather than in the op amp? Or is the higher rail needed for the output stage, even though that stage still only has to deliver a ~3.5 V peak-to-peak line-level signal?
In the context of driving a 5 V single-supply ADC, what are the reasons that an input stage running from higher, bipolar supplies is better than something like an LT1215 running from a single 5 V supply?
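Purely for comparison's sake, here is how I'd frame the margins in the two cases (hypothetical numbers: the 2 Vrms differential full-scale figure from the datasheet quote above, an assumed ~100 mV of output saturation for a 5 V rail-to-rail part like the LT1215, and the NE5532's worst-case 80% swing):

import math

# 2 Vrms differential full scale -> 1 Vrms per leg -> ~1.41 V peak per leg
v_peak_full_scale = math.sqrt(2) * 1.0

# Hypothetical 5 V single-supply rail-to-rail stage, biased at mid-supply (2.5 V),
# with an assumed ~100 mV of saturation at each rail:
swing_single = 2.5 - 0.1
margin_single_db = 20 * math.log10(swing_single / v_peak_full_scale)

# Bipolar +/-18 V stage at the worst-case 80%-of-rail swing:
swing_bipolar = 0.8 * 18.0
margin_bipolar_db = 20 * math.log10(swing_bipolar / v_peak_full_scale)

print(f"margin above ADC full scale, 5 V single supply: ~{margin_single_db:.1f} dB")  # ~4.6 dB
print(f"margin above ADC full scale, +/-18 V bipolar:   ~{margin_bipolar_db:.1f} dB") # ~20 dB

Of course, anything past the ADC's full scale clips in the converter anyway, which is exactly the part I'm unsure about.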
Thanks!

JoeE SP9
02-24-2017, 02:25 PM
Power supply input/driver voltages have absolutely nothing to do with signal voltages. It's the choice of the designer.