Jarfi
So I got this wrong on my physics test. The question goes: the supply voltage is 230 V, and the light bulbs are rated for that voltage. How much does the voltage need to drop for a 60 W bulb to start glowing like a 40 W bulb?
I first tried to find the resistance of each bulb, but couldn't since no current is given. I tried to find a current, but realized that both the current and the resistance could vary and still give the same voltage (V = RI). So there is no fixed current or resistance to work with; for all I know it could be 1000 A and 0.23 Ω. I ended up doing the only thing I could think of:
(40 W / 60 W) × 230 V ≈ 153 V, so 230 − 153 = 77 V drop.
My test was flawless apart from this, and all the teacher wrote on my answer was "no" -_-
Can anybody explain light bulbs and how this works to me, or I'll start losing sleep over this ;)
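For reference, here is a minimal sketch of the usual textbook approach, assuming the bulb behaves as a fixed resistor whose value is set by its rating (an idealization: a real filament's resistance rises with temperature, but this is the intended model). The variable names are just illustrative:

```python
# Worked numbers for the question, treating the bulb as a fixed resistor
# whose resistance is fixed by its rating: P = V^2 / R at the rated voltage.
from math import sqrt

V_RATED = 230.0   # rated/supply voltage (V)
P_RATED = 60.0    # bulb's rated power at 230 V (W)
P_TARGET = 40.0   # power at which it "glows like" a 40 W bulb (W)

# Resistance implied by the rating: R = V^2 / P
R = V_RATED ** 2 / P_RATED                 # ~881.7 ohms

# Voltage at which that same resistance dissipates 40 W:
# P = V^2 / R  =>  V = sqrt(P * R) = V_RATED * sqrt(P_TARGET / P_RATED)
V_NEEDED = sqrt(P_TARGET * R)              # ~187.8 V

print(f"R = {R:.1f} ohm")
print(f"V needed = {V_NEEDED:.1f} V")
print(f"Drop = {V_RATED - V_NEEDED:.1f} V")  # ~42.2 V, not 77 V
```

The key point is that power scales with the square of the voltage (P = V²/R), so the scaling factor on the voltage is √(40/60) ≈ 0.816, not 40/60; that is where the linear attempt above goes wrong.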