Understanding Lightbulbs: Solving the Confusing Voltage Question in Physics

In summary, the thread discusses how much the mains voltage must drop for a 60 watt lightbulb to start glowing like a 40 watt bulb, with no current given. The poster first tries to find each bulb's resistance but can't, since both current and resistance can vary at the same voltage. They fall back on a simple ratio of powers to get a 77 V drop, but it is pointed out that this ignores how power actually scales with voltage (and, strictly, variations in efficiency). Holding the rated current fixed, they then compute a 76 V drop, which is also wrong because the current changes with the voltage. Switching to an equation with only power, voltage, and resistance (P = V²/R), and after fixing an arithmetic slip (dividing where they should have multiplied), they find the new voltage to be about 188 V, i.e. a drop of roughly 42 V.
  • #1
Jarfi
So I got this wrong on my physics test. The question goes: the original voltage is 230 V, and the lightbulbs are rated assuming that voltage. How much does the voltage need to drop for a 60 watt bulb to start glowing like a 40 watt bulb?

I first tried to find the resistance of each bulb but couldn't, since no current is given. I tried to find a current but realized that both the current and the resistance could vary and give the same voltage, V=RI. So there is no fixed current or resistance to work with; for all I know this could be 1000 amps and 0.230 ohms... I ended up doing the only thing I could think of, which was:

40 W / 60 W × 230 V ≈ 153 V... 230 − 153 = 77 V drop.

My test was flawless apart from this, and all the teacher wrote on my answer was "no" -_-

Can anybody explain lightbulbs and how this works to me... or I'll start losing sleep over this ;)
 
  • #2
You should know the power relationships; use the one that relates power, resistance, and voltage to find the resistance of each bulb, then go from there.
 
  • #3
Theoretically, you should take into account the fact that efficiency (light power output/electrical power input) varies with electrical power. And that current = V/R but R varies with voltage (current) also. So the total picture is pretty complicated.

I'd say a few assumptions need to be made. Depending on how rigorous your course is, the simplest assumption is constant filament resistance and constant efficiency.

You just took the ratio of voltages = ratio of watts. But if I drop the voltage in half, do I really drop the power in half? I don't think so. So I would look at those ratios again.
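To see rude man's point concretely, here is a quick numeric check (not from the thread; the resistance value is purely illustrative) using the constant-resistance model P = V²/R. Because power goes as the *square* of voltage, halving the voltage quarters the power rather than halving it:

```python
# Constant-resistance model: P = V^2 / R.
# Halving the voltage quarters the power; it does not halve it.
def power(volts, ohms):
    return volts**2 / ohms

R = 881.7  # illustrative filament resistance in ohms
ratio = power(115, R) / power(230, R)
print(ratio)  # 0.25
```

This is why a simple ratio of watts to voltages gives the wrong answer.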
 
  • #4
rude man said:
Theoretically, you should take into account the fact that efficiency (light power output/electrical power input) varies with electrical power. And that current = V/R but R varies with voltage (current) also. So the total picture is pretty complicated.

I'd say a few assumptions need to be made. Depending on how rigorous your course is, the simplest assumption is constant filament resistance and constant efficiency.

You just took the ratio of voltages = ratio of watts. But if I drop the voltage in half, do I really drop the power in half? I don't think so. So I would look at those ratios again.

There is no accounting for efficiency or anything like that. We are given Ohm's law, V=RI, so I can simply use that to find the current: 60 W / 230 V ≈ 0.26 A... then R = 230 V / 0.26 A ≈ 885 ohms. Then I know the power has dropped to 40 W, so I find the corresponding voltage: 40 W = 0.26 A × V → V ≈ 154 V, that is, the voltage drop is 230 − 154 = 76 V.

Then I think, oh of course, the current must have dropped along with the voltage. So let's use an equation with only watts, voltage, and resistance: I = V/R, so P = V·V/R, that is, P = V²/R. Then 40 W = V²/885 Ω → V² = 0.0452, so V = 0.21? What? Totally wrong, and different results.

I mean, this totally depends on current and resistance, and neither is given; you could have high current and low resistance, or the opposite, and get the same results. What am I doing wrong?

Never mind, I found it out... lol. 40 W = V²/885 Ω → V² = 35400 (I ACCIDENTALLY DIVIDED instead of multiplying), so V ≈ 188 V. Case closed.
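As a sanity check, the two-step calculation above can be sketched in Python (my addition, not from the thread). Using the unrounded current gives 882 Ω rather than the 885 Ω obtained from the rounded 0.26 A, but the final voltage comes out at 188 V either way:

```python
import math

# Reproduce the corrected calculation: resistance from the 60 W
# rating at 230 V, then the voltage at which that same resistance
# dissipates only 40 W.
V_rated, P60, P40 = 230.0, 60.0, 40.0

I = P60 / V_rated           # current at rated voltage, ~0.26 A
R = V_rated / I             # filament resistance, ~882 ohms
V_new = math.sqrt(P40 * R)  # voltage for 40 W through the same R, ~188 V
drop = V_rated - V_new      # ~42 V
print(round(R), round(V_new), round(drop))  # 882 188 42
```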
 
  • #5


I can understand your frustration with this question and your desire to fully understand how lightbulbs work. First, voltage: voltage is the potential difference between two points in an electrical circuit. Here the supply is 230 V, and both bulbs are rated for that voltage.

The power of a lightbulb, measured in watts, is the energy it converts per unit time. At its rated voltage, a 60 watt bulb draws more power than a 40 watt bulb. The resistance of a lightbulb is a measure of how strongly it opposes the flow of current, and it is set by the filament's material and geometry.

Now to the question. Since P = V²/R, the resistance at rated conditions is R = V²/P. So at the same 230 V, the 60 watt bulb actually has the *lower* resistance; that is precisely why it draws more power. Treating that resistance as constant:

R = V²/P = 230²/60 ≈ 882 Ω (for the 60 watt bulb)

To glow like a 40 watt bulb, the same filament must dissipate only 40 W:

40 = V²/882
V² = 40 × 882 ≈ 35300
V ≈ 188 V

So the voltage must drop by about 230 − 188 ≈ 42 V. Equivalently, since R cancels in the ratio of the two power equations, V = 230 × √(40/60) ≈ 188 V. (In a real bulb, the filament resistance falls as it cools and the luminous efficiency changes as well, so this is only the intended textbook answer.)
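Under the constant-resistance assumption used in this thread, the resistance cancels entirely, so the answer follows from the power ratio alone. A minimal Python sketch (the function name is mine, purely illustrative):

```python
import math

# With constant resistance, P = V^2/R implies
# V' = V * sqrt(P'/P): the resistance cancels,
# so no resistance value is needed at all.
def voltage_for_power(v_rated, p_rated, p_target):
    return v_rated * math.sqrt(p_target / p_rated)

V_new = voltage_for_power(230.0, 60.0, 40.0)
print(round(V_new, 1), round(230.0 - V_new, 1))  # 187.8 42.2
```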
 

Related to Understanding Lightbulbs: Solving the Confusing Voltage Question in Physics

1. What is voltage and how is it measured?

Voltage is a measure of the difference in electrical potential between two points in an electrical circuit. It is commonly measured in volts (V) using a voltmeter.

2. What causes voltage to change?

Voltage can change due to various factors such as changes in resistance, changes in current, or changes in the power supply.

3. How does voltage affect electrical devices?

Voltage is necessary for electrical devices to operate. A lower voltage may result in the device not functioning properly, while a higher voltage can damage the device.

4. Can voltage be controlled?

Yes, voltage can be controlled through the use of devices such as voltage regulators or transformers.

5. What is the relationship between voltage and current?

For a fixed resistance, voltage and current are directly proportional: raising the voltage raises the current in the same proportion, and vice versa. This relationship is described by Ohm's Law: V=IR, where V is voltage, I is current, and R is resistance.
