Why is it important to consider current and voltage when thinking about power?

In summary: Electrical power is current times voltage. Voltage is energy per unit charge, current is charge per unit time, so power is the rate at which energy is transferred. For a given power, practical circumstances decide the split between voltage and current: transmission lines run at high voltage and low current to cut cable size and resistive losses, while household receptacles use a lower, safer voltage. The max power theorem tells us that in the connection between a source and a load, maximum power is transferred when the load resistance matches the source's internal resistance.
  • #1
fisico30
Hello Forum,

electrical power is current times voltage.

Current is charge per unit time crossing a hypothetical surface: the more people cross a door per unit time, the higher the current.

Voltage is energy per unit charge: the more energy each person crossing the door carries, the more the power.

So to have large power, we need lots of people crossing the door, each with high energy (kinetic energy, in the analogy), correct?
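In symbols, the door analogy checks out dimensionally:

$$P = \frac{\text{energy}}{\text{time}} = \underbrace{\frac{\text{charge}}{\text{time}}}_{I} \times \underbrace{\frac{\text{energy}}{\text{charge}}}_{V}$$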

In some cases we have high voltage, low current, hence low power: that means few charges are moving across a surface but those few charges have high energy...

The max power theorem tells us that in the connection between a source and a load, the max power is transferred in a certain optimal situation, when there is a large number of charges (not the largest), each with large energy (but not the largest)...correct?

All devices need power: some need high current and low voltage or vice versa...why?
thanks,
fisico30
 
  • #2
Your car needs super low voltage (12 volts, so you don't get shocked) and relatively high current to work.

A blender needs low voltage and low amps.

The AC unit at your house needs slightly higher...but still low voltage, with higher amps (it needs a decent amount of power to run).

Factory motors need low voltage (480 V)...you could say medium, but I wouldn't...and low to medium current.

Power lines need super high voltage and super low current to save customers (you) money. For a given amount of power, lower current always means lower resistive losses in the line.
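As a rough sketch of that claim (all numbers invented for illustration), here is the I²R line loss for the same delivered power at two transmission voltages:

```python
# Assumed figures: deliver 10 MW through a line with 5 ohms of total resistance.
P_delivered = 10e6   # watts to deliver
R_line = 5.0         # line resistance in ohms (assumed)

for V in (12e3, 120e3):             # 12 kV vs 120 kV transmission
    I = P_delivered / V             # current required for the same power
    P_loss = I**2 * R_line          # resistive loss in the line
    print(f"{V/1e3:6.0f} kV -> I = {I:7.1f} A, line loss = {P_loss/1e6:5.2f} MW")
```

Ten times the voltage cuts the current by ten and the I²R loss by a hundred.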

Quick answer enough or you want more?
 
  • #3
The relationship is very much on a par with torque, rotary speed, and power. You can choose many combinations of the two variables to achieve your wanted power. It's the practical circumstances that dictate the best choice of values.
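For instance (numbers invented for illustration), the same 100 W can be had from many splits of either pair:

$$P = VI: \quad 100\ \text{W} = 10\ \text{V} \times 10\ \text{A} = 1\ \text{V} \times 100\ \text{A}$$
$$P = \tau\omega: \quad 100\ \text{W} = 10\ \text{N·m} \times 10\ \text{rad/s} = 1\ \text{N·m} \times 100\ \text{rad/s}$$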
 
  • #4
All devices need power: some need high current and low voltage or vice versa...why?

It isn't just a choice of one or the other. Generally, the voltage will be decided first. This may be your mains supply voltage or a 12 V battery. Something fixed that you want to use.

Then a device is designed to give the required result using that voltage as a power source.

Generally, you don't decide how much power you want to use. You might select a pump that will pump a certain amount of water per minute. Then, you look at the label or data sheet and see how much current it is going to use and how much power.

If it is going to trip your circuit breakers or flatten a 12 volt battery in a few minutes, this is where you start to earn your salary.

You may be able to compromise and pump the water a bit slower and use a less powerful pump.
 
  • #5
...It's the practical circumstances that dictate the best choice of values.

For any given power, available voltage sets the current that will be required.

When the size of the wire required to carry that current gets cumbersome, one goes to the next higher voltage to reduce the wire size.

Such are the tradeoffs.
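A quick sketch of that tradeoff (the 10 kW load is an assumed figure):

```python
# Current a wire must carry for an assumed 10 kW load at common supply voltages.
P = 10_000.0  # watts (assumed)

for V in (120, 240, 480):
    print(f"{V:3d} V -> {P / V:5.1f} A")
# 83.3 A at 120 V vs 20.8 A at 480 V: the higher voltage permits far smaller wire.
```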

The max power theorem tells us that in the connection between a source and a load, the max power is transferred in a certain optimal situation, when there is a large number of charges (not the largest), each with large energy (but not the largest)...correct?

Now that's one you need to figure out by working some examples on paper, for there's a mild subtlety to that theorem, at least in the way it was worded when I learned it, ca. 1961.

So take, say, a one volt source, and place in series with it several values of external and internal resistance. Maybe 0.1, 1.0, and 10 ohms; that'd be only nine combinations.
Tabulate how much power is dissipated internally and externally in each case. With those numbers the arithmetic is trivial, and you'll quickly see the rule.
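A minimal sketch of that exercise, if you'd rather let a script do the tabulating (same values as above):

```python
# A 1 V source with every pairing of internal and external resistance
# drawn from {0.1, 1.0, 10} ohms -- nine combinations in all.
V = 1.0
values = (0.1, 1.0, 10.0)

print(f"{'R_int':>6} {'R_ext':>6} {'I (A)':>9} {'P_int (W)':>10} {'P_ext (W)':>10}")
for r_int in values:
    for r_ext in values:
        i = V / (r_int + r_ext)                  # series-circuit current
        print(f"{r_int:>6} {r_ext:>6} {i:9.4f} "
              f"{i**2 * r_int:10.4f} {i**2 * r_ext:10.4f}")
# For each fixed R_int, the external power peaks when R_ext equals R_int:
# the max power transfer theorem.
```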
 
  • #6
jim hardy said:
For any given power, available voltage sets the current that will be required.

That voltage would have been selected by someone in the first place, according to the circumstances they foresaw for the application. A good example would be the choice of 12 V or 24 V for a vehicle's electrics, based on the likely size of engine and the starter motor that would be required. The poor devil who finds himself with a 24 V system then has to grin and bear the fact that all his replacement electrical items will cost him an arm and a leg.
 
  • #7
The power lines are another fine example.

Way back when, they literally needed to run millions of miles of electric cable.

This has a cost in material and labor. Using the smallest cable possible would be the obvious choice. Small cable...smallish current. To deliver the massive amount of power necessary...a super high voltage must be used, which is the case in all power lines.

The receptacles in your house are another good example. They wanted to have good power with a voltage that wouldn't kill you. In the USA they decided on 120 volts...with a 20 amp breaker in general. Seems to work just fine.

In factories...for long distances across super large factories...they will sometimes use 11,500 volt cables to deliver power from one end to the other...then step it down in a transformer to 480 volts for distribution in that area. High voltage is great for avoiding voltage drops.

As said above...application is everything.
 
  • #8
psparky said:
In the USA they decided on 120 volts...with a 20 amp breaker in general. Seems to work just fine.

Actually, I'm not so sure that they did do "fine", because they need their so-called split-phase system to give them 240 V for high-powered domestic equipment, so as to avoid needing immensely thick cables. Too late to change now, but the 120 V standard was clearly chosen with only lighting and light equipment in mind.
But it has proved to be a rich vein of discussion on PF!
 
  • #9
sophiecentaur said:
Actually, I'm not so sure that they did do "fine", because they need their so-called split-phase system to give them 240 V for high-powered domestic equipment, so as to avoid needing immensely thick cables. Too late to change now, but the 120 V standard was clearly chosen with only lighting and light equipment in mind.
But it has proved to be a rich vein of discussion on PF!

I agree with what you are saying, but it actually does work just fine. And it even runs strong circular saws and air compressors at 120 volts. If you need more power...you just beam up Scotty...or you run a 240 volt line.

That reminds me of the old-school fuse panel I have in my house, off of a main breaker panel. I've considered replacing the fuse panel with a breaker panel...but I'm often asked if I'm having any problems with it. The answer is no...and the reply of the electrician I am speaking with is usually..."then why change it?". I mean, I've never even popped a fuse. Works perfectly without exception. If it ain't broke...don't fix it.
 
  • #10
Copper is getting more and more expensive, though...
 
  • #11
Simple related question:

Say device A needs 10 W of power at a voltage of 5 V. The current must then be 2 A.

Suppose a source is connected to device A, and the source can provide the correct voltage, 5 V.
I would assume that once that is accomplished, the device will draw the current it needs, i.e. 2 A, correct?

That is all good as long as the source is able to supply 2 A of current, correct? If the max current output of the source is, say, 1 A, then the device will not work properly...

I guess there must be current continuity: what goes into device A must be what comes out of the source. Figuratively speaking, device A is pulling charge through the conductors to cause a current of a certain size, but the inner workings of the source don't allow such a current...

I naively assumed that once a device was provided a certain voltage, the current that enters the device itself would only depend on the inner impedance of the device and nothing else...

thanks,
fisico30
 
  • #12
Yes, you have most of that OK.

Sometimes the supply can actually deliver the current the load demands, but it would eventually heat up too much and might be destroyed.

More usually, the supply has some internal resistance that stops it delivering more current at the correct voltage.
What follows is a bit of simple maths. Try to work through it one line at a time and you will get an idea of what actually happens.

Suppose you had a 6 volt supply with 0.5 ohms of internal resistance in series with it.
Now, you connect a 2.5 ohm load on it.
The combined resistance is 0.5 + 2.5 ohms, or 3 ohms, so the current is (6 volts / 3 ohms) or 2 amps.

The load will have a voltage of I * R, or 2 amps * 2.5 ohms, i.e. 5 volts, across it.
The other volt is lost inside the battery across the internal resistance.
This is the example you gave.

Can you see what would happen if you connected a 1 ohm resistor?
It would make a total of 1.5 ohms with the internal resistance (1 ohm + 0.5 ohms) so the current would be (6 volts / 1.5 ohms) or 4 amps but it would only have 4 volts across the 1 ohm resistor.

In fact this supply could give 12 amps into a short circuit, limited only by the internal resistance, but it can only give 2 amps at 5 volts and only if the load is 2.5 ohms.
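The same arithmetic in a few lines of Python, reproducing the numbers above:

```python
# A 6 V source with 0.5 ohm internal resistance, per the worked example.
V_source = 6.0
R_internal = 0.5

for R_load in (2.5, 1.0, 0.0):               # the three cases above (0 = short circuit)
    I = V_source / (R_internal + R_load)     # series current
    V_load = I * R_load                      # voltage actually across the load
    print(f"R_load = {R_load:3.1f} ohm -> I = {I:4.1f} A, V_load = {V_load:3.1f} V")
# Prints 2 A at 5 V, 4 A at 4 V, and 12 A at 0 V.
```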

This sort of behaviour of power supplies and batteries was the reason that regulated supplies were developed. These give a steady voltage out, almost regardless of load (up to a maximum current, of course).
 
  • #13
Hello vk6kro,

Thanks a lot. What I am trying to do is charge a small electronic device with a solar cell.
The solar cell has a certain output voltage. I don't know the impedance of the electronic device. Based on its impedance and the impedance of the solar cell, a certain output current will be generated...

The goal is to be able to transfer all the power generated by the cell to the device.

The device probably operates at a DC voltage that is different from the DC voltage offered by the solar cell. Do I need a DC-to-DC converter to match the two voltages? I would think so. If the device voltage is less than the solar cell voltage, I can use a diode to prevent power from flowing from the device back to the cell...but my objective is really to charge the electronic device...

Once the voltages are the same, the problem remains that the impedances of the solar cell and the electronic device are different...What should I do about that?

In summary, what do I need to buy/build to transfer the most power from the cell to the device?

thanks
fisico30
 
  • #14
stef6987 said:
I think because in some cases, such as power lines, they want to reduce power loss, but that's one of many examples

Yes...and even more important...

Higher voltage means lower current...which means reduced cable size.

Multiplied by millions of miles of overhead lines...this means millions and millions of dollars saved in materials and labor for installing smaller diameter overhead lines.

Money...money...money...
 

Related to Why is it important to consider current and voltage when thinking about power?

1. What is the equation for power (P) in terms of current (I) and voltage (V)?

The equation for power is P=IV, where P represents power in watts (W), I represents current in amperes (A), and V represents voltage in volts (V). This equation is a fundamental relationship in electricity and is used to calculate the amount of power consumed by an electrical device.

2. How is power related to current and voltage?

Power is directly proportional to both current and voltage. This means that as either current or voltage increases, power will also increase. In other words, if the current or voltage doubles, the power will also double. This relationship can be seen in the equation P=IV, where the variables are multiplied together.

3. Can power be calculated using other equations besides P=IV?

Yes, power can also be calculated using the equations P=I²R and P=V²/R, where R represents resistance in ohms (Ω). These equations are derived from Ohm's law (V=IR) and are used to calculate power in circuits with resistors.
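A one-off numerical check that the three formulas agree (example values assumed):

```python
# 12 V across a 6 ohm resistor (assumed values).
V, R = 12.0, 6.0
I = V / R                            # Ohm's law: I = 2 A
print(I * V, I**2 * R, V**2 / R)     # 24.0 24.0 24.0 -- same power in watts
```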

4. How is power measured?

Power is measured in watts (W), which is a unit of energy per unit time. This means that power is a measure of how much energy is used or transferred per unit of time. For example, a 100 watt light bulb uses 100 joules of energy per second.

5. What is the significance of power in everyday life?

Power is a crucial concept in understanding and using electricity in our daily lives. It allows us to measure the amount of energy being used by electrical devices and helps us understand how much electricity we are consuming. It is also important in engineering and designing electrical systems, as it determines the capacity and efficiency of these systems.
