DC Power Supplies Output Resistance

In summary, the effective dc output resistance of a "good" dc power supply is very low, and this is especially true of well-designed regulated supplies. The high resistance measured on a regulated supply with an ohmmeter comes from probing the unpowered regulator circuitry, not from a genuine series resistance. Output ripple depends on the load current and can be reduced by adding filter capacitance or an external regulator. Finally, the internal resistance of a dc power supply cannot be read off its output voltage/current rating; it has to be found from the voltage drop between no-load and loaded operation.
  • #1
roam
I have a question regarding the effective dc output resistance of dc power supplies. For a "good" power supply, should this internal resistance be large or small?

Well, I used to think that a good power supply should have a small internal resistance, so it would maintain a constant terminal voltage until exhausted before dropping to 0. However, I did a few experiments and found that regulated power supplies (which give a steadier output with less ripple) tend to have a higher output resistance than unregulated power supplies. So does this mean dc power supplies with a larger resistance are more desirable?

I'm a bit confused. I'd greatly appreciate it if anyone could confirm this, and also explain briefly why a higher internal resistance reduces the output ripple. :confused:
 
  • #2
"Good" power supplies have a very low output impedance.

This applies especially to well-designed regulated supplies.

You can imagine it as an internal resistor in series with the output. Since the output varies very little with load, then this resistor must be very small.

The ripple on a power supply output depends on the ripple entering the regulator from the power supply rectifier and filter.
This ripple has a greater amplitude at higher output current, and the regulator starts to pass it through to the output when the troughs of the ripple fall below the minimum input voltage the regulator requires.

For example, suppose the regulator is giving 5 volts out.
It will probably need an input of at least 7 volts.
So if there is ripple that dips down to 6.5 volts, the regulator will not give 5 volts out at that time and you will get ripple on the output.
The output may drop to 4.5 volts on negative-going peaks of the input ripple.
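
A minimal sketch of that dropout behaviour, using only the 5 V / 7 V / 6.5 V figures from the example above (the 2 V headroom is implied by those numbers; the idealised regulator model is an assumption):

```python
# Sketch: when does input ripple start to appear on a linear regulator's output?
V_OUT_SET = 5.0                              # regulated output voltage (V)
DROPOUT = 2.0                                # assumed minimum input-output headroom (V)
V_IN_MIN_REQUIRED = V_OUT_SET + DROPOUT      # 7 V, as in the example

def output_voltage(v_in: float) -> float:
    """Idealised regulator: holds V_OUT_SET while there is enough headroom,
    otherwise just tracks the input minus the dropout."""
    if v_in >= V_IN_MIN_REQUIRED:
        return V_OUT_SET
    return max(v_in - DROPOUT, 0.0)

# A ripple trough of 6.5 V lets the output sag to about 4.5 V, as described.
for v_in in (8.0, 7.0, 6.5):
    print(f"input {v_in:4.1f} V -> output {output_voltage(v_in):4.2f} V")
```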
 
  • #3
roam said:
I have a question regarding the effective dc output resistance of dc power supplies. For a "good" power supply, should this internal resistance be large or small?

Well, I used to think that a good power supply should have a small internal resistance, so it would maintain a constant terminal voltage until exhausted before dropping to 0. However, I did a few experiments and found that regulated power supplies (which give a steadier output with less ripple) tend to have a higher output resistance than unregulated power supplies. So does this mean dc power supplies with a larger resistance are more desirable?

I'm a bit confused. I'd greatly appreciate it if anyone could confirm this, and also explain briefly why a higher internal resistance reduces the output ripple. :confused:
Regulated power supplies use a voltage regulator (Shocking! ) in order to keep the voltage stable. A simple regulator is an LM7805, a 5V linear regulator.

Regulators are not simple passive devices that can be understood just by measuring them with an ohmmeter. The main control element in a linear regulator is a power transistor, a non-ohmic device. Without power running to it, it's quite likely that the transistor would look like an open to an ohmmeter. The device needs power to work properly.

There is also a circuit that compares the output voltage to a reference, and uses negative feedback to stabilize the output voltage. The feedback circuit is probably a few kilo-ohms, which is probably what you measured.

Regulated power supplies are better not because they have high output impedance, but because they have complex circuitry that keeps the output voltage stable over a large range of output currents and input voltages. That complex circuitry just looks like a large resistance if you unplug it and hook the outputs to an ohmmeter.
 
  • #4
vk6kro said:
"Good" power supplies have a very low output impedance.

This applies especially to well-designed regulated supplies.

You can imagine it as an internal resistor in series with the output. Since the output varies very little with load, then this resistor must be very small.

The ripple on a power supply output depends on the ripple entering the regulator from the power supply rectifier and filter.
This ripple has a greater amplitude at higher output current, and the regulator starts to pass it through to the output when the troughs of the ripple fall below the minimum input voltage the regulator requires.


I did an experiment with an unregulated power supply, varying the resistive load connected across the supply output. With the heavy load (1k ohm) I got a smoother output, and with the lighter load (220 ohm) I got ripple with a larger amplitude (looking at the oscilloscope). So my question now is: the higher this resistance, the steadier the output will be?
 
  • #5
A 220 ohm resistor would draw more current than a 1000 ohm resistor so the 220 ohm resistor would be considered a heavier load than a 1000 ohm resistor.

With an unregulated power supply, the capacitor is only charged in short bursts and then it discharges into the load for the rest of the time.

The more current the load takes, the more the capacitor discharges and the more the voltage drops between charging pulses.

So, you get more ripple.

To fix it, you can add extra capacitance across the output and you can also add an external regulator if you like.
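
A back-of-envelope version of this, using the standard full-wave ripple approximation ΔV ≈ I_load / (f_ripple · C). The 24 V output, 100 Hz ripple frequency (full-wave rectified 50 Hz mains) and 1000 µF filter capacitor are assumed purely for illustration, not values from the thread:

```python
# Ripple estimate for an unregulated supply: between charging pulses the
# filter capacitor alone supplies the load, so delta_V ~= I_load / (f * C).
V_SUPPLY = 24.0        # nominal output (V), assumed
F_RIPPLE = 100.0       # full-wave rectified 50 Hz mains -> 100 Hz, assumed
C_FILTER = 1000e-6     # filter capacitance (F), assumed

def ripple_pp(load_ohms: float) -> float:
    """Approximate peak-to-peak ripple for a resistive load."""
    i_load = V_SUPPLY / load_ohms
    return i_load / (F_RIPPLE * C_FILTER)

# The 220 ohm load draws more current than the 1 k load -> more ripple,
# matching the oscilloscope observation above.
for r in (1000.0, 220.0):
    print(f"{r:6.0f} ohm load: ~{ripple_pp(r) * 1000:.0f} mV p-p ripple")
```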
 
  • #6
Thank you very much, that makes perfect sense now. :smile:
 
  • #7
Can I assume the internal resistance of a DC power supply would just be its output voltage divided by its rated current?

So if I have a 24 VDC supply rated at 1.5 amps, my internal resistance would be 16 ohms. If it were less, then the supply could actually deliver more than 1.5 amps, right?

Thanks,
Nick
 
  • #8
roam,
I have a question regarding the effective dc output resistance of dc power supplies. For a "good" power supply, should this internal resistance be large or small?

Is it a constant voltage supply, or a constant current supply?


nlaham,

Can I assume the internal resistance of a DC power supply would just be its output voltage divided by its rated current?

Your assumption would be wrong.

So if I have a 24 VDC supply rated at 1.5 amps, my internal resistance would be 16 ohms. If it were less, then the supply could actually deliver more than 1.5 amps, right?

Your fuse should blow if the current exceeds 1.5 amps. A good voltage supply should regulate its internal impedance so that it shows its set output voltage from no load up to the max current load it can support.

Ratch
 
  • #9
nlaham said:
Can I assume the internal resistance of a DC power supply would just be its output voltage divided by its rated current?

So if I have a 24 VDC supply rated at 1.5 amps, my internal resistance would be 16 ohms. If it were less, then the supply could actually deliver more than 1.5 amps, right?

Thanks,
Nick

No, that would give you the minimum resistance of the load. You can have any resistor as a load as long as it does not have less resistance than 16 ohms.

Internal resistance is different.

You would measure the output voltage with no resistor then put in a resistor as a load and measure the voltage again.
Suppose a power supply had a no-load voltage of 27 volts and a full load voltage of 24 volts at 1.5 amps.
On full load, there is 3 volts being dropped across the internal resistance of the power supply and 1.5 amps flowing in it, so the power supply's internal resistance must be (27 volts - 24 volts) / 1.5 amps or 2 ohms.
The internal resistance is like a series resistance inside the power supply.

Once you know the internal resistance of the power supply, you can estimate the output voltage at other loads. For example, if the load was 1 amp, then the output would be 25 volts.
This is because there would be 1 amp flowing through a 2 ohm resistor producing a 2 volt internal drop (V = I * R = 1 amp * 2 ohms = 2 volts) which subtracts from the internal 27 volt supply to give 25 volts.
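
A short sketch of this two-point method, using the 27 V / 24 V / 1.5 A figures worked out above (the helper function names are just for illustration):

```python
# Two-point method: measure the output at no load and at a known load current,
# then R_internal = (V_no_load - V_loaded) / I_load.

def internal_resistance(v_no_load: float, v_loaded: float, i_load: float) -> float:
    return (v_no_load - v_loaded) / i_load

def output_at(v_no_load: float, r_internal: float, i_load: float) -> float:
    """Predicted terminal voltage at some other load current."""
    return v_no_load - i_load * r_internal

r_int = internal_resistance(27.0, 24.0, 1.5)                    # 2 ohms
print(f"internal resistance: {r_int:.1f} ohm")
print(f"predicted output at 1 A: {output_at(27.0, r_int, 1.0):.1f} V")   # 25 V
```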
 
  • #10
vk6kro said:
...You would measure the output voltage with no resistor then put in a resistor as a load and measure the voltage again.
Suppose a power supply had a no-load voltage of 27 volts and a full load voltage of 24 volts at 1.5 amps.

Hi,
I am learning something here myself :)
so when you state ... "full load voltage of 24 volts at 1.5 amps", the 1.5 A is just referring to what is written on the PSU?
Just trying to get this clear ... you could put a range of resistors across the output as the load ... they are all going to produce different current flows. So how do I know what the "FULL LOAD" resistance is?


On full load, there is 3 volts being dropped across the internal resistance of the power supply and 1.5 amps flowing in it, so the power supply's internal resistance must be (27 volts - 24 volts) / 1.5 amps or 2 ohms.
The internal resistance is like a series resistance inside the power supply.

OK I get the voltage drop across the internal resistance ... still the question of this specific 1.5A, because that's going to affect the value of the internal resistance

Once you know the internal resistance of the power supply, you can estimate the output voltage at other loads. For example, if the load was 1 amp, then the output would be 25 volts.
This is because there would be 1 amp flowing through a 2 ohm resistor producing a 2 volt internal drop (V = I * R = 1 amp * 2 ohms = 2 volts) which subtracts from the internal 27 volt supply to give 25 volts.

I'm just trying to consider the scenario of a PSU I build. Now I don't know the internal resistance, and I don't know the specific DC current capability of the PSU (though working from the ratings of the transformer, I should be able to get a close figure).

Other than that, all I really know is that the PSU has an open-circuit voltage of say 17 VDC and a load voltage of 12 VDC with a 1k resistor load

Dave
 
  • #11
Hi Dave,

Yes, the current rating can refer to different things.

The 1.5 amps may be the maximum current the supply could deliver without overheating or, on poor quality power supplies, it can mean that 1.5 amps is the current that produces the correct output voltage.

Some power supplies have such poor regulation that they produce a much higher voltage off load and the only way to get a certain lower voltage is to draw a very specific current from them.

It is the current that is written on the power supply or in the catalog when you bought it.

If it is the current the supply can deliver without overheating, then any smaller current is OK provided you accept that the voltage may be a bit higher than at full load.

Generally we take the internal resistance to be constant (but don't really believe it) so that we can calculate an internal voltage drop depending on the current.

Other than that, all I really know is that the PSU has an open-circuit voltage of say 17 VDC and a load voltage of 12 VDC with a 1k resistor load

So, the power supply would drop 5 volts with a 12 mA load (12 volts / 1000 ohms = 12 mA), which means it has about 417 ohms of internal resistance:
(17 volts - 12 volts) / 0.012 amps = 5 volts / 0.012 amps ≈ 417 ohms.

That would be a pretty bad power supply, but keep in mind that this figure does not tell you anything about ripple if this is a mains-powered power supply. The simple internal-resistance picture mostly applies to batteries.
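
A quick check of that arithmetic with the same two-point formula (only the numbers quoted above are used; the load current follows from Ohm's law):

```python
# Dave's figures: 17 V open circuit, 12 V across a 1 k load.
v_open, v_loaded, r_load = 17.0, 12.0, 1000.0
i_load = v_loaded / r_load                      # 12 mA through the load
r_internal = (v_open - v_loaded) / i_load       # ~417 ohm
print(f"load current: {i_load * 1000:.0f} mA, internal resistance: {r_internal:.0f} ohm")
```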
 
  • #12
The OP might find it useful to print himself a copy of National Semiconductor's application note AN556, "Introduction to Power Supplies". It gives a good introduction to the concepts and terminology. Digest it over an evening...
It's a pdf at national dot com, and a google search took me directly there. I'd post a link but am using my daughter's Mac and can't figure out how to make it cut and paste...

old jim
 
  • #13
Thanks vk6kro

appreciate that ... OK on ~417 ohms internal being a bad PSU ... as you realize I just plucked some values; it was interesting to see how it worked out.

Thanks jim

I've downloaded a copy of that myself; I will have a good read and will probably learn a few things ;)

here's the link
Application Note 556 Introduction to Power Supplies


cheers
Dave
 
  • #14
vk6kro said:
No, that would give you the minimum resistance of the load. You can have any resistor as a load as long as it does not have less resistance than 16 ohms.

Thanks vk6kro, that makes pretty good sense. So what if I use a resistor of less than 16 ohms? Then won't the load draw too much current from my supply, over the max rating?

Ratch said this:

Ratch said:
Your fuse should blow if the current exceeds 1.5 amps. A good voltage supply should regulate its internal impedance so that it shows its set output voltage from no load up to the max current load it can support.

So in theory does that mean if I use a small resistor of less than 16 ohms, the voltage supply should regulate its internal resistance to something higher to regulate the current to its max rating?

I am starting to confuse myself.
 
  • #15
nlaham,

So in theory does that mean if I use a small resistor of less than 16 ohms, the voltage supply should regulate its internal resistance to something higher to regulate the current to its max rating?

I am starting to confuse myself.

If the voltage supply has a feature called "current limiting", which many voltage supplies do, then no problem. Otherwise, the fuse will blow when the current exceeds the max limit. Current limiting is a separate feature, and not part of voltage regulation.

Ratch
 
  • #16
I am looking around and seeing that most single-output power supplies have a slight adjustability in the rated voltage. I can turn a little screw on the circuit board that will let me go up or down a couple of volts. Will this change the rated current output, assuming the supply can handle the same amount of power at different voltages?

So if I turn up the voltage slightly, the max current rating should decrease a little bit right?

What if I turn down the voltage: would the current rating increase, or would it just cap out at the max rating? If the supply doesn't limit its current (though, like you said, it most probably would), I could damage the power supply if the supply's components aren't rated to handle larger currents. Is this correct?

Thanks,
Nick
 
  • #17
nlaham,

Will this change the rated current output, assuming the supply can handle the same amount of power at different voltages?

Unless you analyze the circuit, you have no way of knowing. It is best to regard the max current rating as one value for all voltages.

So if I turn up the voltage slightly, the max current rating should decrease a little bit right?

Same answer as above.

What if I turn down the voltage: would the current rating increase, or would it just cap out at the max rating? If the supply doesn't limit its current (though, like you said, it most probably would), ...

Same answer as above.

I could damage the power supply if the supply's components aren't rated to handle larger currents. Is this correct?

The fuse would probably blow first.

Ratch
 
  • #18
If you turn up the output voltage then, on the basis of power dissipation inside the regulator, you would expect to be able to supply more current. Think in terms of putting a fixed series resistor in place of the regulator in order to drop some volts: if the resistor were restricted in the amount of power it could dissipate then, because P = IV, for a smaller voltage drop (V) you could actually increase I.
But that is an over-simplification - just an indication of the direction in which things could go. The worst situation would be if you wanted a very low voltage from your regulator without reducing the current load - think of the possible internal dissipation problems when you drop almost the whole supply voltage in the regulator.
 
  • #19
sophiecentaur,

I am not sure what you mean. If my supply is rated at say 35 watts, wouldn't raising the voltage give me less current in order to conserve that power rating?
 
  • #20
It would be nice if all power supplies were that predictable.

I have seen power supplies (mostly the ones that plug directly into the wall, sometimes called "wall warts") that run too hot whether they have a load on them or not.

Power supplies are only as good as the way they were designed and this can range from excellent to awful.
They can, and do, self-destruct if the output is short-circuited, or they may blow a fuse, or they may just stop delivering current until the short circuit or overload is removed. This only happens if there is circuitry there to make it happen.

Modern power supplies, especially the ones you will find in school laboratories, should be regulated and protected from all possible abuse.
The worst thing that can happen to a power supply is to have another power source connected to its output with the polarity reversed. Good power supplies will even cope with that.

The exact figures given on the power supply box are mostly just a guide. If you put a 10 ohm resistor on as a load, instead of 16 ohms, probably nothing dramatic will happen immediately, but if you leave it there for a while, the supply may get too hot, or a fuse may blow, or the supply may stop working until it is fixed.

It all depends on how it was designed and cost is often a factor in design. So, if you don't know it is safe, don't do it.
 
  • #21
nlaham said:
sophiecentaur,

I am not sure what you mean. If my supply is rated at say 35 watts, wouldn't raising the voltage give me less current in order to conserve that power rating?

If your power supply were a transformer then you could be correct. The available output power (at least in Volt Amps) would be more or less constant so decreasing V would allow an increase in I.
But a conventional Regulated PSU is not like that. It usually starts with a transformer / rectifier which has a fixed output voltage and a certain rated power output. (I am ignoring the more sophisticated switch mode supplies that do not all operate like this.)
That means that the initial low-voltage supply is a bit 'like a battery': it will supply a fixed voltage with a certain maximum available amount of current. If you want to provide a regulated output, which will be at a lower voltage than this source voltage, you need to have a resistor in series. This is often achieved with a power transistor, which can behave like a variable resistor that maintains the same output voltage over a range of load resistances. This internal resistor has to dissipate power equal to the current times the voltage dropped across it. The lower the output volts, the more volts need to be dropped inside the regulator, which means more power dissipated. So you may even need to reduce the current into the load to keep this dissipation to a reasonable value.
Just imagine that the series resistor was zero Ohms (giving you maximum available volts). There would be no internal dissipation at all!
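
A rough numerical illustration of this dissipation argument: the pass element drops (V_in - V_out) at the load current, so P = (V_in - V_out) · I. The 20 V unregulated input and the current values below are assumed purely for the sketch:

```python
# Heat in a linear regulator's series pass element: P = (V_in - V_out) * I.
V_IN = 20.0   # unregulated input after rectifier/filter (V), assumed

def pass_element_dissipation(v_out: float, i_load: float) -> float:
    return (V_IN - v_out) * i_load

# Lower output voltage at the same current means more power burned in the
# regulator, so the usable current may have to be reduced.
for v_out, i_load in ((15.0, 1.0), (5.0, 1.0), (5.0, 0.5)):
    p = pass_element_dissipation(v_out, i_load)
    print(f"V_out={v_out:4.1f} V, I={i_load:3.1f} A -> {p:4.1f} W in the regulator")
```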

If this still makes no sense then read the above again and again until it does. You may be bringing some 'baggage' with you into the question that is preventing you from 'getting it'. It's a common problem when trying to understand some aspect of Electricity :smile:.
 
  • #22
sophiecentaur said:
If this still makes no sense then read the above again and again until it does. You may be bringing some 'baggage' with you into the question that is preventing you from 'getting it'. It's a common problem when trying to understand some aspect of Electricity :smile:.

That makes sense, thanks for the well written explanation. I did read it through a couple of times. My only question out of it, is this:

If a regulated PSU is designed to keep the voltage output constant, why would the voltage vary at the output in the first place? I understand the voltage will drop across the load, but there is no way for the PSU to detect that right? If I am drawing several amps or close to none, shouldn't my voltage in the power supply not change anyway, assuming I don't draw more current than the PSU can handle and cause the internal voltage to drop?

Thanks for your patience, I am pretty new to all the electrical stuff. But I do like it.
 
  • #23
nlaham,

If a regulated PSU is designed to keep the voltage output constant, why would the voltage vary at the output in the first place?

It won't with voltage regulation. But if the load changes, then the regulator circuit in the PS has to adjust the effective impedance of the PS to keep the voltage constant.

I understand the voltage will drop across the load, but there is no way for the PSU to detect that right?

Wrong. The regulator circuit can sense the output voltage.

If I am drawing several amps or close to none, shouldn't my voltage in the power supply not change anyway, assuming I don't draw more current than the PSU can handle and cause the internal voltage to drop?

As I said before, the reason the voltage does not change is because the voltage regulator in the PS keeps it constant.

Ratch
 
  • #24
Ratch said:
nlaham,
It won't with voltage regulation. But if the load changes, then the regulator circuit in the PS has to adjust the effective impedance of the PS to keep the voltage constant.

As I said before, the reason the voltage does not change is because the voltage regulator in the PS keeps it constant.

Ahh okay, I had something else in my head. So this makes sense to me now, I think. This would be the reason why a simple battery's voltage will drop when a load is put on it. So my 3.7 V phone battery, fully charged, reads 4.1-4.2 V with no load, but when I run a current through it, it drops half a volt or so. Whereas a regulated power supply would maintain its voltage at any load it is rated for. Am I saying that right?

Thanks. I feel pretty stupid asking these simple questions.
 
  • #25
nlaham,

Yes, regulation makes all the difference.

Ratch
 
  • #26
Ratch said:
nlaham,

Yes, regulation makes all the difference.

Ratch

:approve:
 
  • #27
nlaham said:
That makes sense, thanks for the well written explanation. I did read it through a couple of times. My only question out of it, is this:

If a regulated PSU is designed to keep the voltage output constant, why would the voltage vary at the output in the first place? I understand the voltage will drop across the load, but there is no way for the PSU to detect that right? If I am drawing several amps or close to none, shouldn't my voltage in the power supply not change anyway, assuming I don't draw more current than the PSU can handle and cause the internal voltage to drop?

Thanks for your patience, I am pretty new to all the electrical stuff. But I do like it.

The regulator is 'intelligent'; it does know about the volts across the load. It senses small changes in the output voltage (comparing it with an internal reference voltage) and adjusts its 'resistance' - using feedback - to keep the output voltage (almost exactly) at the required value. That's the principle of all regulators. They greatly reduce the voltage fluctuations due to rectification and to varying load current. Without a regulator you can expect even several volts of change; with a regulator, you can reduce this to a few mV. All due to the wonders of negative feedback.
 
  • #28
sophiecentaur said:
The regulator is 'intelligent'; it does know about the volts across the load. It senses small changes in the output voltage (comparing it with an internal reference voltage) and adjusts its 'resistance' - using feedback - to keep the output voltage (almost exactly) at the required value. That's the principle of all regulators. They greatly reduce the voltage fluctuations due to rectification and to varying load current. Without a regulator you can expect even several volts of change; with a regulator, you can reduce this to a few mV. All due to the wonders of negative feedback.

So I wonder how it uses that feedback; I should probably go read about it rather than keep asking questions.

Does it read the current and, knowing an internal resistance reference, calculate the voltage drop, and thus lower its impedance to bring the voltage back up? Anyway, I'm probably wrong, but very cool stuff.
 
  • #29
This link shows a basic voltage regulator circuit. A zener diode is used as a reference voltage and that voltage is compared with a scaled down version of the output volts. The difference between the volts is amplified and fed to the chunky series transistor (the 'variable resistor') to provide just enough current to maintain the output voltage to the desired level. The OP Amp provides loads of gain so the voltage stability is very good even for this simple circuit. It's clever in that the regulator (which is, essentially, an amplifier) gets its power from the supply that it is regulating.
The clever thing about feedback is that it doesn't 'care' what's going on in detail; it just adjusts things until the right result is produced. With all feedback loops, you can't follow the signal round and round in your mind; you just have to imagine what would happen if you perturb the system a bit (e.g. by lowering the supply volts). On that circuit, if you do that then the signal on pin 2 (the inverting input) goes down, causing the output of the op amp to go up, which turns on the transistor a bit harder - bringing the final output volts back up to what you started with. Once the difference in volts between pins 2 and 3 is back to zero, the current from the series transistor stops going up and the output volts are restored. Brilliant, don't you think?
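
A toy illustration of that "nudge until the error is zero" idea (not a circuit simulation): the pass transistor is modelled as a variable series resistance, and a proportional feedback loop adjusts it until the sensed output matches the reference. All component values and the loop gain are made up for the sketch:

```python
# Toy series-pass regulator: output node is a divider between the pass
# "resistance" and the load; feedback trims the pass resistance until the
# output matches the reference.
V_IN = 12.0      # unregulated input (V), assumed
V_REF = 5.0      # target output / reference voltage (V), assumed
GAIN = 2.0       # proportional loop gain, chosen for stable settling

def regulate(r_load: float, r_pass: float, steps: int = 40) -> float:
    """Iterate the feedback loop and return the settled output voltage."""
    for _ in range(steps):
        v_out = V_IN * r_load / (r_load + r_pass)   # simple divider model
        error = V_REF - v_out                       # sensed by the op amp
        r_pass = max(r_pass - GAIN * error, 0.01)   # drive the pass element
    return v_out

# The output stays near 5 V even though the load changes by a factor of two,
# which is why the effective output resistance looks so low.
print(f"10 ohm load: {regulate(10.0, 20.0):.2f} V")
print(f" 5 ohm load: {regulate(5.0, 20.0):.2f} V")
```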
 
  • #30
sophiecentaur said:
This link shows a basic voltage regulator circuit. A zener diode is used as a reference voltage and that voltage is compared with a scaled down version of the output volts. The difference between the volts is amplified and fed to the chunky series transistor (the 'variable resistor') to provide just enough current to maintain the output voltage to the desired level. The OP Amp provides loads of gain so the voltage stability is very good even for this simple circuit. It's clever in that the regulator (which is, essentially, an amplifier) gets its power from the supply that it is regulating.
The clever thing about feedback is that it doesn't 'care' what's going on in detail; it just adjusts things until the right result is produced. With all feedback loops, you can't follow the signal round and round in your mind; you just have to imagine what would happen if you perturb the system a bit (e.g. by lowering the supply volts). On that circuit, if you do that then the signal on pin 2 (the inverting input) goes down, causing the output of the op amp to go up, which turns on the transistor a bit harder - bringing the final output volts back up to what you started with. Once the difference in volts between pins 2 and 3 is back to zero, the current from the series transistor stops going up and the output volts are restored. Brilliant, don't you think?

Welp this is where you lost me. Probably going to stop asking questions now lol. Thanks for all the info. Maybe I'll come back and read that after I get some more experience with power supplies.
 

Related to DC Power Supplies Output Resistance

What is the output resistance of a DC power supply?

The output resistance of a DC power supply refers to the resistance that the power supply presents to the load connected to its output. It is a measure of how well the power supply can maintain a stable output voltage when the load changes.

Why is output resistance important in a DC power supply?

Output resistance is important because it affects the overall performance of the power supply. A lower output resistance means that the power supply can provide a more stable output voltage, even when the load changes. This is especially important in sensitive electronic devices that require a constant voltage supply.

How is output resistance measured in a DC power supply?

Output resistance is typically measured by applying a small change in load to the power supply and measuring the change in output voltage. The output resistance is then calculated by dividing the change in voltage by the change in load current. It can also be obtained from the slope of the output voltage versus load current curve.

What factors affect the output resistance of a DC power supply?

The output resistance of a DC power supply can be affected by several factors, including the internal resistance of the power supply components, the design of the power supply circuit, and the type of load connected to the output. In general, a well-designed power supply will have a lower output resistance.

How can the output resistance of a DC power supply be reduced?

The output resistance of a DC power supply can be reduced by using high-quality components, optimizing the design of the power supply circuit, and using feedback control to regulate the output voltage. Additionally, using a buffer amplifier or adding a voltage regulator can also help reduce the output resistance of a power supply.
