  1. #1
    Forum Regular (joined Sep 2016, 19 posts)

    Why do audio op amps require such high rail voltages?

    This might be a silly question, but I haven't been able to find it addressed directly anywhere on the internet yet. I also have a handful of related questions in-line, which I hope don't stray too far off-topic.
    In pro gear, line-level audio signals are roughly 3.5 V peak-to-peak, so why do audio circuits routinely require or recommend rail voltages of ±12 V or higher?
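    (For reference, here's how I'm getting that ~3.5 V figure, assuming the nominal +4 dBu pro line level; just a quick sanity check in Python:)
    Code:
    import math

    # Pro line level is nominally +4 dBu; 0 dBu is 0.7746 Vrms (1 mW into 600 ohms).
    v_rms = 0.7746 * 10 ** (4 / 20)             # about 1.228 Vrms
    v_pp = v_rms * 2 * math.sqrt(2)             # about 3.47 V peak-to-peak for a sine wave
    print(f"{v_rms:.3f} Vrms -> {v_pp:.2f} Vpp")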
    Is this purely a headroom thing? Or do non-linearities in op amps depend on the supply voltages?
    Or is it to support cheaper components? Looking at the TL072 datasheet, the maximum output voltage can be as low as 2/3 of the rail when the load resistance gets low (2 kΩ), but is typically about 90% of the rail with a 10 kΩ load. But you could also use a higher-end op amp that's rail-to-rail?
    The main thing that prompted this question is the datasheet for the Cirrus CS4272 and the schematics/data for its evaluation board. Even though the ADC operates from 0 V to 5 V, they still opt to use a bipolar ±18 V supply for the input buffer. In that particular example they're using the NE5532D8, which has a worst-case output swing of 80% of the rail and supports rails as low as ±3 V.
    So why would they use ±18 V supplies if the ADC only supports 0 to 5 V audio (presumably biased around 2.5 V), and a ±3 V supply would still easily accommodate the 3.5 V peak-to-peak range?
    According to the datasheet, there is also no scaling (gain or attenuation) happening in this circuit:
    XLR connectors supply the CS4272 analog inputs through unity gain, AC-coupled differential circuits. A 2 Vrms differential signal will drive the CS4272 inputs to full scale.
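    (Working out the numbers behind that quote, assuming the 2 Vrms differential signal splits equally across the two legs, and reusing the 80%-of-rail worst case and ±3 V minimum supply from the NE5532 datasheet figures above:)
    Code:
    import math

    # Datasheet quote: a 2 Vrms differential signal drives the CS4272 to full scale.
    v_peak_per_leg = (2.0 / 2) * math.sqrt(2)   # about 1.41 V peak per leg, assuming a balanced split

    # Worst-case NE5532 swing of ~80% of the rail, on the minimum +/-3 V supply
    usable_swing = 0.80 * 3.0                   # about 2.4 V peak available
    print(f"needed per leg: +/-{v_peak_per_leg:.2f} V, available on +/-3 V rails: +/-{usable_swing:.2f} V")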

    So any signal over line level would wind up getting clipped by the ADC anyway. Is it better to have the clipping happen in the ADC rather than in the op amp? Or is the higher rail needed for the output stage, even though that stage still only delivers a ~3.5 V peak-to-peak line-level output signal?
    In the context of driving a 5 V single-supply ADC, why is an input stage running from higher, bipolar supplies better than something like an LT1215 running from a single 5 V supply?
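    (To put rough numbers on the headroom angle: a back-of-envelope swing comparison, where the 80% figure is the NE5532 worst case from above and the single-supply row is just a placeholder guess, not anything from the LT1215 datasheet:)
    Code:
    import math

    line_peak = 3.5 / 2                          # ~1.75 V peak for a ~3.5 Vpp line-level signal

    options = [("+/-18 V bipolar", 0.80 * 18.0),
               ("+/-3 V bipolar", 0.80 * 3.0),
               ("5 V single supply", 0.80 * 2.5)]  # placeholder: ~80% of the 2.5 V from mid-rail
    for name, swing in options:
        headroom_db = 20 * math.log10(swing / line_peak)
        print(f"{name}: ~{headroom_db:.1f} dB above line level")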
    Thanks!

  2. #2
    JoeE SP9, Philadelphia, PA (joined Jun 2003, 2,710 posts)
    Power supply input/driver voltages have absolutely nothing to do with signal voltages. It's the choice of the designer.
