  1. #1
    AR Newbie Registered Member
    Join Date
    Jul 2004

    Measuring Amplifier Output

    I am writing for advice about measuring amplifier power output.

    I want to find out where I might have gone wrong when I measured the output power of an amplifier and got odd results compared with the specification.

    I will not mention the make or model of the amplifier, as I do not think it is relevant; I am measuring the output out of my own interest and curiosity, for a better understanding.

    The only output specification I have available (from a service manual) is written as follows.

    With 8 ohm loads, both channels driven from 20 Hz to 20 kHz: rated 120 W per channel minimum RMS power, with no more than 0.6% total harmonic distortion from 250 mW to rated power.

    The input sensitivity is not mentioned, only an input S/N of 96 dB for the CD, Tape, etc. inputs.

    Test Equipment In Use

    1. Hameg HM604 60 MHz oscilloscope
    2. Leader LAG-120B audio generator
    3. Leader LMV186A 2-channel AC millivoltmeter
    4. Audio test disc (with many test tones of varying levels)
    5. Wire-wound 8 ohm resistors × 2

    My Method of Measurement

    1. Input a 1 kHz stereo tone at 0 dB from a test CD.
    2. Turned the volume to minimum.
    3. Connected the amplifier outputs to the 8 ohm dummy resistors.
    4. Connected my oscilloscope across the resistors.
    5. Connected an RMS voltmeter across the resistors, on the 100 V range.
    6. Turned up the volume gradually until the sinusoidal waveform clipped, then backed it off to the point just before clipping.
    7. Took a note of the voltmeter reading: 50 V RMS (scale calibrated in RMS).

    I then used the formula for power, P = V squared / R: 2500 / 8 = 312.5 watts.
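    That calculation can be sketched in a few lines of Python as a quick sanity check, using the 50 V RMS reading from step 7:

```python
# Power into a purely resistive load from an RMS voltage reading: P = V^2 / R.
def power_from_vrms(v_rms: float, r_ohms: float) -> float:
    return v_rms ** 2 / r_ohms

print(power_from_vrms(50.0, 8.0))  # 2500 / 8 = 312.5 W
```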

    This result is well over the 120 W RMS per channel quoted in the specification.
    Where do you think I have gone wrong?

    The 8 ohm wire-wound resistors are very old, large, high-wattage types. Would this be a problem, since a wire-wound resistor is not a true resistance? Maybe the impedance would be a problem?

    I also used the audio generator with its 1 V / 2 V 1 kHz outputs, but still got odd readings, far off the mark needed to give a 120 watt minimum.

    I would appreciate any help that could be provided here.

    Dermot Hayes

  2. #2
    Forum Regular
    Join Date
    Feb 2002
    First of all, the rating is the minimum you should obtain using the FTC method.
    1. Run the amplifier with both channels driven simultaneously at 1/3 power for 20 minutes.
    2. The rating is for a specific bandwidth and distortion with both channels driven. Many amplifiers will put out far more power at midrange frequencies than at the frequency extremes, especially at low frequencies, because of the voltage drop due to the internal impedance of the power supply, which is a function of frequency.
    3. You understand that when the resistor gets hot, its impedance can change. Get another meter and measure both the current and the voltage, and multiply them; use a precision shunt for an ammeter, or a clamp-on ammeter. This will tell you the actual volt-amps being delivered. For greater precision, use a dual-trace scope and look at the voltage and current waveforms (from the ammeter output and across the voltmeter). If they are not exactly in phase, multiply the volt-amps by the cosine of the angle between them, which should be a decimal near but below one. This phase difference, if it exists, is caused by the fact that the load is not purely resistive and may have an inductive component. Don't forget to be sure that you are looking at the RMS, not the peak, voltage and current readings. Some amplifiers are simply rated very conservatively by the manufacturer. Let me know how you make out. Good luck.
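    The volt-amps-times-power-factor correction described in point 3 can be sketched as follows (the 50 V / 6.25 A figures and the 30-degree angle are illustrative values, not measurements from this thread):

```python
import math

# True power delivered to a partly reactive load:
# P = Vrms * Irms * cos(phi), where phi is the phase angle
# between the voltage and current waveforms.
def true_power(v_rms: float, i_rms: float, phase_deg: float = 0.0) -> float:
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

print(true_power(50.0, 6.25))        # in phase: 312.5 W
print(true_power(50.0, 6.25, 30.0))  # 30 degrees apart: ~270.6 W
```

    With the waveforms exactly in phase the power factor is 1 and volt-amps equal watts; any phase difference reduces the true power below the volt-amp product.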

  3. #3
    AR Newbie Registered Member
    Join Date
    Jul 2004

    Power Amplifier Measurement...Further Questions...

    Hi there Skeptic,

    Thanks for your reply. Can you offer a few answers to the following questions? Please ignore any that are just me thinking out loud!

    1. FTC method: I've racked my brain, but what does this mean?

    2. 1/3 power: do you mean adjust the volume to about 1/3 of its full range, or are you being more precise than that?

    3. When I measured the output power on the scope, I was looking at the peak voltage. To get the RMS value, do I multiply by 0.707?

    4. I connected a Fluke digital meter in series with one load.
    I expected to get I = P/V,
    i.e. 120/50 = 2.4 amps or thereabouts, but my meter did not register anything!
    What am I overlooking here?

    5. What type of resistor should I be using for this measurement (material and power rating)?

    6. Thanks for your tip on how to identify the reactive component of the load resistors by comparing the phase difference. I think it's back to basics for me!

    7. Do you think the measurement method I described in my first posting was inaccurate?

    I know it's a lot of questions, but I am just trying to get a perspective on what's happening.



  4. #4
    Forum Regular
    Join Date
    Feb 2002
    Dermot, the FTC method was introduced in the mid-1970s to create a uniform method of measurement. 1/3 power means just that, and has nothing to do with the volume control setting; in your case, the power level should be 40 W per channel RMS on the first attempt. The 20-minute preconditioning requirement was intended to avoid short-term data that doesn't represent real-life use.

    Vrms = 0.707 × Vpeak for a sine wave; that is correct.

    The Fluke current meter may not respond to a 1 kHz sine wave; these meters are often used at 60 Hz. Check the meter manual and the settings. Did you remember to connect the probes to the proper terminals for current? (It's easy to overlook the obvious.) Check current readings in other circuits to be sure the meter can perform its current-measurement function properly, and check any internal fuses; sometimes they pop when the meter is not connected prior to turning on the circuit. I suggest you try a precision ammeter shunt, if you have one, with a voltmeter or your scope for measuring current, or check whether a clamp-on meter will work. Retry your measurements at low frequencies.

    The higher the power rating of the dummy load resistors, the less the temperature rise and therefore the less change in resistance. In general, when resistors get hot, their resistance increases; that may partly explain your higher-than-expected reading. If 312.5 watts were actually being developed, an 8 ohm resistor would draw 6.25 amps. Were the current reduced to about 3 amps because the resistance had approximately doubled to 16 ohms, you would only be dissipating about 150 watts. This is why the current measurement is so important. Your measurement idea was sound, but you didn't take into account that the resistance might change, so the equation P = V squared / resistance becomes invalid if you assume the resistance is still 8 ohms.

    Also keep in mind that power bandwidth is part of any meaningful specification.
    If the power bandwidth of a 120 watt amplifier is specified as down 3 dB at 20 Hz, the nominal maximum midband power may be as high as 240 watts; if it is specified as down 1 dB, it may be expected to be only about 150 watts.
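    A few of the numbers in the two paragraphs above (the 0.707 peak-to-RMS factor, the hot-resistor effect, and the dB-to-power-ratio conversion) can be checked with a short script:

```python
import math

V_RMS = 50.0  # the voltmeter reading from the original post

# For a sine wave, Vrms = Vpeak / sqrt(2), i.e. Vrms ~= 0.707 * Vpeak.
print(round(V_RMS * math.sqrt(2), 1))  # peak voltage ~70.7 V

# If the dummy load heats up and its resistance roughly doubles, the
# same 50 V RMS delivers half the current and half the power.
for r_ohms in (8.0, 16.0):
    i = V_RMS / r_ohms          # current drawn
    p = V_RMS * i               # dissipation as V * I
    print(f"{r_ohms:g} ohm: {i:.2f} A, {p:.1f} W")

# Power bandwidth: if the rated 120 W holds at a band edge that is
# "down N dB", the nominal midband power is 120 * 10**(N / 10).
for down_db in (3, 1):
    print(round(120 * 10 ** (down_db / 10)))  # ~239 W, then ~151 W
```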

  5. #5
    AR Newbie Registered Member
    Join Date
    Jul 2004
    Hi Skeptic,
    Thanks for the information,
    I'll let you know how I get on.

  6. #6
    AR Newbie Registered Member
    Join Date
    Jul 2004

    Amplifier Power Output Measurement.....

    Hi there Skeptic,

    I tried the measurement again, playing back a 1 kHz 0 dB signal.
    I don't have a precision ammeter shunt, so this time I connected one channel of the scope in series with the 8 ohm load to measure the current, and the other channel across the same load to measure the voltage. Both were in phase, so I assume there was no inductive reactance present. I was just using 0 dB as a standard reference.
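    The in-phase check described above can also be done numerically if the two waveforms are captured as samples. This is only a sketch, assuming one full cycle of each waveform sampled at the same instants:

```python
import math

# Estimate the phase angle between two equal-frequency sinusoids from
# their samples: for zero-mean sine waves, cos(phi) = <v, i> / (|v| |i|).
def phase_deg(v, i):
    dot = sum(a * b for a, b in zip(v, i))
    mag = math.sqrt(sum(a * a for a in v) * sum(b * b for b in i))
    return math.degrees(math.acos(dot / mag))

# One full cycle, 1000 samples; the current lags the voltage by 30 degrees.
n = 1000
v = [math.sin(2 * math.pi * k / n) for k in range(n)]
i_wave = [math.sin(2 * math.pi * k / n - math.radians(30)) for k in range(n)]
print(round(phase_deg(v, i_wave)))  # 30
```

    A result near zero degrees supports the assumption that the load is behaving resistively at the test frequency.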

    You told me to run the amplifier for 20 minutes at 1/3 power, i.e. 40 watts.
    I didn't know how to do this, as I am having a problem measuring the output power: I don't know if I am connecting the test equipment correctly, or if I am using the correct method.

    Although the specification states 120 W per channel, it doesn't quote what the input signal level should be to achieve this measurement.

    I meant to ask: another amplifier stated its output power in watts DIN.
    What is the relationship between watts RMS and watts DIN?

    I have been looking on the internet for more information on the FTC method, but could not find any. I wanted some guidelines to check that I am connecting everything correctly.

    Any further help would be appreciated.


