Seniority of Wattage vs. Amperage

  • #1
mearvk
Was wondering if I kept the wattage for say a 100 watt bulb constant but varied the voltage would the light be able to continue functioning normally?

Also, assuming this is true does this hold for more complicated circuitry?

Thanks.
 
  • #2
Antiphon
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.
 
  • #3
Antiphon said:
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.

The bulb sees 120 volts. We know it would draw about 0.83 amps, right? 120 x 0.83 ≈ 100

The bulb sees 240 volts. Would it then draw half as many amps?

The bulb sees 60 volts. Would it then draw twice as many amps?

More generally, does varying the voltage affect the wattage of a light bulb?
 
Last edited:
  • #4
mearvk said:
The bulb sees 120 volts. We know it would draw about 0.83 amps, right? 120 x 0.83 ≈ 100

The bulb sees 240 volts. Would it then draw half as many amps?

The bulb sees 60 volts. Would it then draw twice as many amps?

More generally, does varying the voltage affect the wattage of a light bulb?

Are you familiar with Ohm's Law? If you increase the voltage across a resistor, what does that do to the current through the resistor?

I = V/R

P = V^2/R = I^2 * R
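
As a quick numeric illustration of those formulas (a Python sketch of my own, treating the bulb as a fixed resistor, which a real filament only approximates):

```python
# A 100 W / 120 V bulb modelled as a fixed resistor.
RATED_POWER = 100.0    # watts
RATED_VOLTAGE = 120.0  # volts

# From P = V^2 / R, the bulb's operating resistance:
R = RATED_VOLTAGE**2 / RATED_POWER  # 144 ohms

for v in (60.0, 120.0, 240.0):
    i = v / R      # Ohm's law: I = V / R
    p = v**2 / R   # dissipated power rises with the square of V
    print(f"{v:5.0f} V -> {i:.3f} A, {p:6.1f} W")

# Prints:
#    60 V -> 0.417 A,   25.0 W
#   120 V -> 0.833 A,  100.0 W
#   240 V -> 1.667 A,  400.0 W
```

So halving the voltage halves the current (and quarters the power); it does not double the current as the constant-wattage assumption above would suggest.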
 
  • #5
Antiphon said:
You don't have that freedom. The bulb looks at the voltage and decides what current it will draw. The power is then VxI.

Never anthropomorphise lightbulbs. They hate it when you do that. :-p
 
  • #6
So if I wanted to keep 100 watts going to a bulb with 144 ohms of resistance and I doubled the voltage to 240 I'd need to add 432 ohms (576-144) worth of resistance to the circuit?

So in theory I could run a 120v 100 watt light bulb on a 240v line if this were added: http://goo.gl/vy2Ig

Thanks for fielding my questions guys.
 
Last edited by a moderator:
  • #7
mearvk said:
So if I wanted to keep 100 watts going to a bulb with 144 ohms of resistance and I doubled the voltage to 240 I'd need to add 432 ohms (576-144) worth of resistance to the circuit?

So in theory I could run a 120v 100 watt light bulb on a 240v line if this were added: http://goo.gl/vy2Ig

Don't, Don't and Don't again.

Apart from the fact that what you propose will not work, this is a serious safety issue. 240 volt mains is seriously more dangerous than 120 volt mains.

If you must run a 120 volt bulb from 240, then run two in series. This will work safely.
Do not connect them in parallel; all you will achieve is two blown bulbs.

Do you understand what series means?
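
To make that concrete, here is a minimal numeric check (a Python sketch of my own, assuming each bulb behaves as a fixed 144 ohm resistor, which a real filament only approximates):

```python
# Two identical 120 V / 100 W bulbs in series across 240 V,
# modelling each bulb as a fixed resistor (an approximation:
# real filament resistance rises with temperature).

SUPPLY = 240.0   # volts
R_BULB = 144.0   # ohms, from R = V^2 / P = 120^2 / 100

# Series: resistances add, and the same current flows through both.
i = SUPPLY / (R_BULB + R_BULB)   # 0.833 A
v_each = i * R_BULB              # 120 V across each bulb
p_each = i**2 * R_BULB           # 100 W in each bulb
print(round(v_each, 1), round(p_each, 1))

# The 432 ohm series resistor proposed above does NOT keep the bulb
# at 100 W, because it also reduces the current through the bulb:
i2 = SUPPLY / (R_BULB + 432.0)   # 0.417 A
print(round(i2**2 * R_BULB, 1))  # 25.0 W in the bulb, not 100 W
```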
 
Last edited by a moderator:
  • #8
Well, is my math at least right?

If so, why would it be dangerous aside from the extra voltage?

Could you explain simply the diff between series and parallel?
 
  • #9
Instead of trying to tell the experts here (and there are quite a few more expert than I am) how to do something, how about just explaining your goal, i.e. what you want to achieve, and asking for help?
 
  • #10
mearvk said:
Well is my math at least right?

If so, why would it be dangerous aside from the extra voltage?

Could you explain simply the diff between series and parallel?

You don't understand the difference between series and parallel circuits, and you want to start off working with AC mains circuits? That's not a good thing to do. Please learn the basics of electricity and electronics first, and then find a good local mentor who can help you safely learn about working with AC mains circuits. The shock and fire hazards are very real when working with those kinds of voltages and that much available power.

Here is your starter on series and parallel circuits:

http://en.wikipedia.org/wiki/Series_and_parallel_circuits
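
For a minimal illustration of the two rules (a Python sketch of my own; the article above is the place to actually learn this):

```python
# Equivalent resistance of series and parallel combinations.

def r_series(*rs):
    # Series: one path, so the same current flows through every
    # element and resistances simply add.
    return sum(rs)

def r_parallel(*rs):
    # Parallel: every element sees the same voltage,
    # so conductances (1/R) add.
    return 1.0 / sum(1.0 / r for r in rs)

print(r_series(144.0, 144.0))    # 288.0 ohms
print(r_parallel(144.0, 144.0))  # 72.0 ohms
```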

 
  • #11
Berkeman you seem to have a short circuit between 'theoretical' and 'actual'. I can ask questions all day about 240v circuits without being silly enough to try to power a 100w light bulb off of it.
 
  • #12
mearvk said:
Berkeman you seem to have a short circuit between 'theoretical' and 'actual'. I can ask questions all day about 240v circuits without being silly enough to try to power a 100w light bulb off of it.

Two problems...

First, you have not been clear that you do not intend to try any of this. We are genuinely concerned for your safety and those around you.

Second, when you post stuff like that in the forums, other newbies can see it and think that they can do it safely. Not a good idea.

We take safety seriously here at the PF. This thread is closed.
 

Related to Seniority of Wattage vs. Amperage

1. How does wattage differ from amperage?

Wattage and amperage are two different measures of electricity. Wattage (in watts) measures the rate at which electrical energy is transferred, while amperage (in amperes) measures the amount of current flowing through a circuit at a given time.

2. Which is more important, wattage or amperage?

Both wattage and amperage are important in understanding the flow of electricity. However, wattage is generally the more informative figure because it takes both voltage and amperage into account, giving a direct measure of power consumption.

3. How are wattage and amperage related?

Wattage and amperage are related through the power law, which states that wattage equals voltage multiplied by amperage (P = V × I). Combined with Ohm's law (V = I × R), this means wattage can be affected by changes in either voltage or amperage.
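
As a worked example of those relations (a hypothetical Python snippet using the 100 W / 120 V bulb from the thread above):

```python
# P = V * I and Ohm's law V = I * R, for a 100 W / 120 V bulb.
V = 120.0      # volts
I = 100.0 / V  # amps, from P = V * I  -> about 0.833 A
R = V / I      # ohms, from Ohm's law -> 144 ohms

print(round(I, 3), R, V * I)  # 0.833 144.0 100.0
```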

4. Does higher wattage mean more power?

Yes, higher wattage means more power. However, the total energy a device consumes is determined not by its wattage alone but also by how long it runs, and the device's efficiency determines how much of that power does useful work.

5. Can wattage and amperage affect the performance of electronic devices?

Yes, wattage and amperage can greatly affect the performance of electronic devices. Supplying a device with more voltage or current than it is rated for can cause damage, while a supply that cannot deliver the wattage or amperage a device requires can result in poor performance or even failure to function.
