Ohms and wattage of resistor

In summary, the resistor on this circuit fried, so its resistance and wattage can't be read from it. Based on the estimated voltage drop and current, a reasonable guess is a 1/2-watt, 10-Ω resistor.
  • #1
vincemash
I am trying to determine the ohms and wattage of R2 on the attached schematic.

The resistor fried, so I can't tell anything about it.

This circuit is a recharger for a cordless Black and Decker drill.

Thanks
 

Attachments

  • schematic.jpg
  • #2
I would guess that it's a 1/2-watt, 10-Ω resistor. That is based on the estimated voltage drop of 2 V and current ~200 mA. You can also make a rough estimate of the voltage across R2 this way: ILED ~ 20 mA, VLED ~ 1 V (if red) ⇒ VR2 ~ 1 V + (20 mA)(100 Ω) = 3 V. Those numbers are just rough estimates for the on-state of the LED. Then, a 10-Ω will give you a rough current output a bit above 100 mA, which in turn gives you a wattage around 1/2 watt. I might go with a 1-watt resistor, though. I don't think it's supposed to double as a fuse.
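
A quick back-of-the-envelope check of those estimates, written out as a small Python sketch (the 2 V drop, 200 mA charge current, and LED figures are assumptions, not measured values):

Code:
    # Rough estimate of R2 (all figures are assumptions, not measurements)
    v_r2 = 2.0     # assumed voltage drop across R2, V
    i_r2 = 0.2     # assumed charging current through R2, A

    r2 = v_r2 / i_r2        # = 10 ohms
    p2 = v_r2 * i_r2        # = 0.4 W, so a 1/2 W (or safer, 1 W) part

    # Cross-check of the voltage across R2 from the LED branch:
    v_led, i_led, r_led = 1.0, 0.020, 100      # assumed LED drop, LED current, series resistor
    v_r2_led_on = v_led + i_led * r_led        # ~3 V when the LED is on

    print(f"R2 ~ {r2:.0f} ohms, dissipating ~ {p2:.1f} W; LED-branch estimate ~ {v_r2_led_on:.0f} V")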
 
  • #3
The LED forward voltage drop is usually more like 2V for a red LED, isn't it?
 
  • #4
berkeman said:
The LED forward voltage drop is usually more like 2V for a red LED, isn't it?

Yes, around 1.9 V is the typical nominal forward voltage for red LEDs. Also, 20 mA is the typical Imax, so the usual conservative design value is closer to 10 mA.
 
  • #5
If you were designing this from the start, you would have to accept that someone could attach a completely flat battery to it.

The power supply can deliver about 14.8 volts at 210 mA so to limit the current into a flat battery to 210 mA, you would have to put 70 ohms in series with it. 14.8 volts / 70 ohms = 210 mA.

Then, if you wanted the LED to be fully lit at this charge rate, it would need a resistor of about 650 ohms in series with it: (14.8 V - 1.8 V) / 0.02 A = 650 ohms. 680 ohms would be OK.

The LED branch now carries 20 mA that the 70-ohm resistor doesn't have to supply, so that resistor could be increased to about 77 ohms: 14.8 V / 77 ohms = 192 mA. This resistor would dissipate 2.84 watts, so it should be a 5-watt resistor, and 82 ohms is the nearest preferred value.

So, now it is a safe device, but it will only deliver about 34 mA to a 12 volt battery and 10 mA into a battery that had reached 14 volts. Still, it is a simple charger and an automatic reduction in charging current is a good outcome.
The LED would be almost completely dimmed when the battery was fully charged, with about 1 mA flowing in it.

To summarise, I would put an 82-ohm, 5-watt resistor in series with the battery, and a 680-ohm, 0.5-watt resistor in series with an LED across the 82-ohm resistor.
This would give a short-circuit current of about 200 mA into a very flat or faulty battery.
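
Collecting that arithmetic in one place as a Python sketch (the 14.8 V supply, 1.8 V LED drop, and 20 mA LED current are the assumed figures from this post):

Code:
    # Series-resistor charger sizing (assumed figures from the post above)
    v_supply = 14.8     # open-circuit supply voltage, V
    i_limit = 0.210     # maximum charge current into a flat battery, A
    v_led = 1.8         # assumed LED forward drop, V
    i_led = 0.020       # desired LED current, A

    r_led_series = (v_supply - v_led) / i_led     # = 650 ohms -> use 680 ohms
    r_charge = v_supply / (i_limit - i_led)       # ~78 ohms  -> use 82 ohms (nearest preferred value)
    p_charge = v_supply ** 2 / r_charge           # ~2.8 W worst case -> use a 5 W part

    # Charge current once the battery voltage rises:
    i_at_12v = (v_supply - 12.0) / 82             # ~34 mA
    i_at_14v = (v_supply - 14.0) / 82             # ~10 mA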
 
Last edited:
  • #6
The LED requires about 20 mA to operate. Therefore, with the 100-ohm resistor in series, the voltage across that branch for the LED to light (when the battery is charging) is about 3.7 V. This means that the voltage across your burned-out resistor, while it is supplying current to the battery, should be at most 3.7 V. The voltage across this resistor will change as the battery goes from discharged to charged, so with a fully discharged battery the maximum voltage across it can only be 3.7 V. If we want a charging current of approximately 200 mA into a flat battery, then R = V/I, therefore R = 18.5 ohms. An 18-ohm (1 W) resistor is the closest.
This doesn't protect the system from overcurrent, but the battery will have resistance even when it is flat. Also, T1 may go into saturation when the current demand is too high, limiting the current.
It is not ideal, but trial and error will also help. Make sure you attach a discharged battery once you have fitted the resistor to see if it burns out again.
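
As a rough sketch of that sizing (the 1.7 V LED drop, 20 mA LED current, and 200 mA target are the assumptions used above):

Code:
    # Sizing R2 from the LED branch (assumed figures)
    v_led = 1.7                        # assumed LED forward drop, V
    i_led = 0.020                      # LED operating current, A
    v_branch = v_led + i_led * 100     # ~3.7 V across the LED + 100 ohm branch

    i_charge = 0.200                   # desired current into a flat battery, A
    r2 = v_branch / i_charge           # = 18.5 ohms -> 18 ohms is the nearest standard value
    p2 = v_branch * i_charge           # ~0.74 W -> hence the 1 W suggestion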

http://www.calibrepower.com
 
Last edited by a moderator:
  • #7
Make sure you attach a discharged battery once you have fitted the resistor to see if it burns out again.

This drill will have NiCd batteries in it.
These discharge spontaneously at about 25 % per month. So, in hobby use, they are very likely to run flat before you need to use them.
So, this charger will have flat batteries to deal with quite often.

Also, NiCd batteries that have failed often fail in a completely short circuited way. Zero volts and zero ohms.

NiCd batteries are also very sensitive to overcharging. Give them 100 mA for 2 days and they will probably be destroyed. This charger has no timer or overcharging detector, so it is almost guaranteed to destroy its batteries. Anybody can just put the batteries on charge and forget about them until it is too late.

A few resistors burning up is trivial because they are cheap, but a 13 volt bank of NiCd batteries in a special holder would probably cost more than the drill was worth.
I try to avoid blowing up resistors when a bit of simple design can make the device immune to it.

If it gets a flat battery with 18 ohms in series with it, it will send about 820 mA through the 18 ohm resistor (14.8 / 18 = 822 mA) which will then dissipate about 12 watts. (14.8 * 0.822 = 12.16 watts).

This is probably what they had before and probably why it blew up. So, why just do it again?

You can't depend on the transformer saturating. This transformer is rated at 10 watts.
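
A quick check of that flat-battery case (a sketch, using the 14.8 V supply figure assumed earlier in the thread):

Code:
    # What an 18 ohm resistor does into a fully flat (near 0 V) battery
    v_supply = 14.8
    r2 = 18.0
    i_flat = v_supply / r2          # ~0.82 A
    p_flat = v_supply * i_flat      # ~12 W, far beyond a 1 W rating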
 
  • #8
vk6kro said:
NiCd batteries are also very sensitive to overcharging. Give them 100 mA for 2 days and they will probably be destroyed. This charger has no timer or overcharging detector, so it is almost guaranteed to destroy its batteries. Anybody can just put the batteries on charge and forget about them until it is too late.

So, how many volts (per cell) is reasonably safe for NiCds? I ask because I have a couple of NiCd cordless tools, and it's possible to overcharge both. I have a simple modification in mind, using Zener diodes, to limit the charger voltage. I was thinking 1.4 V per cell, but am interested in what others have to say about that.
 
  • #9
Greetings Redbelly!

The accepted value of current for a NiCd cell is about 10 % of its amp-hour rating figure.
I haven't seen any justification for this, but it seems harmless enough. Generally, this puts you in the 100 to 200 mA current range.
The batteries should NEVER feel hot. If they do, you have probably already done some damage.

It is most important that you control the current and let the batteries decide their own terminal voltage. You can control the current best if you use a current regulator, but if you have to use a resistor, you should have a supply voltage at least 50% higher than the fully charged voltage of the battery.

And put a timer on it. If you don't, you will certainly forget and overcharge it.

Or, you could design it to taper off like I did with this post example. It becomes a trickle charger as the voltage gets higher.

The voltage will start from zero and rise to about 1.4 volts per cell. This will settle to about 1.3 volts after a short time off charge.

I have an almost new cordless drill that I have to run via a short cable from a 12 volt battery because, 5 years ago, its charger managed to cook the batteries in it.

I use NiMH batteries in digital cameras and I destroyed a few before I realized that the camera would stop working before the battery was fully discharged. So, if I gave the battery a full charge according to its amp-hour rating, I was overcharging it.

So, I have made a new charger using a Picaxe chip which detects when the battery is fully charged and then removes the charging current. The trick is to turn off the power every minute, wait one second, and then measure the voltage on the battery. If I get 20 successive identical or lower readings, I consider the battery fully charged.

The Picaxe 14 chip has a very stable 10 bit A to D converter which allows me to get away with doing it this way.
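
The termination scheme described above, sketched in Python rather than Picaxe BASIC (read_battery_volts and set_charger are hypothetical placeholders for the hardware interface; the one-minute/one-second timing and the 20-reading window are taken from the description):

Code:
    import time

    def charge_until_full(read_battery_volts, set_charger, window=20):
        """Stop charging once 'window' successive settled readings fail to rise."""
        no_rise_count = 0
        last_reading = 0.0
        while no_rise_count < window:
            set_charger(True)              # charge for one minute
            time.sleep(60)
            set_charger(False)             # remove the current...
            time.sleep(1)                  # ...and let the voltage settle for a second
            v = read_battery_volts()       # open-circuit reading via the A-to-D converter
            if v <= last_reading:
                no_rise_count += 1         # identical or lower reading
            else:
                no_rise_count = 0          # still rising: start the count again
            last_reading = v
        set_charger(False)                 # considered fully charged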

This doesn't mean the battery is any good, though. Poor batteries reach this state quite quickly, but discharge equally quickly.
 
  • #10
I had made a few assumptions:
1. that this was the actual charger supplied by Black and Decker;
2. that the output voltage was 14.8 V.
First of all, the output of the transformer is 15.8 V. Multiply this by 1.414 for full-wave rectification and you get about 22 V DC.
Therefore the battery it is charging is probably an 18 V nominal battery, which means it has 15 cells. At full voltage, the charger can only charge the batteries at about 1.46 V per cell (22 V / 15 cells), so it can never overcharge the batteries unless there is a problem with the transformer.
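
Checking that arithmetic (a sketch; the 15.8 V transformer output and 15-cell pack are the assumptions stated above):

Code:
    # Peak DC from the transformer and the resulting per-cell ceiling (assumed figures)
    v_transformer_rms = 15.8
    v_dc_peak = v_transformer_rms * 1.414      # ~22 V after full-wave rectification
    cells = 15                                 # an 18 V nominal NiCd pack
    v_per_cell_max = v_dc_peak / cells         # ~1.5 V per cell at the supply's limit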
Also, the cables and many other parts will have resistance and voltage drop.
I do agree that a fully discharged battery did fry the resistor, but I am sure trial and error will help you out. I still think 18 ohms is a good starting point.
Let us know how you get on, please.

http://www.calibrepower.com
 
Last edited by a moderator:
  • #11
Sorry for resurrecting an old thread.

I have a similar problem to the first poster.

The power supply for my charger broke beyond repair, as did the R2 resistor, so I did some research (which turned out to be wrong) and bought a power supply that is too powerful: 15 V, 1.5 A. I did repair the battery pack by replacing the cells, so it is effectively new. Basically, I need to drop the current a lot to fit in the 100-200 mA range. Does anybody have an idea how to do it? I've calculated that I need an 11-ohm, 20 W resistor at R2. Is there any other way?
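
As a rough sketch of the plain series-resistor approach with the new supply (15 V and the 100-200 mA target are the figures given above; the worst-case flat-pack assumption is mine):

Code:
    # Series resistor to limit a 15 V supply to roughly 200 mA, worst case
    v_supply = 15.0
    v_batt_worst = 0.0                           # assume a completely flat or shorted pack
    i_target = 0.200                             # desired maximum charge current, A

    r = (v_supply - v_batt_worst) / i_target     # = 75 ohms limits even a dead short to 200 mA
    p = (v_supply - v_batt_worst) * i_target     # = 3 W worst case -> a 5 W part gives margin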
 

Related to Ohms and wattage of resistor

1. What is the relationship between ohms and wattage in a resistor?

The relationship between ohms and wattage in a resistor follows from Ohm's Law, V = IR: the voltage across a resistor is directly proportional to both the current through it and its resistance. The power (wattage) dissipated then follows as P = VI = I²R = V²/R, so for a fixed current a larger resistance dissipates more power, while for a fixed voltage it dissipates less.

2. How do I calculate the wattage of a resistor?

The wattage of a resistor can be calculated using the formula P=VI, where P is the power (wattage), V is the voltage, and I is the current. Alternatively, you can use the formula P=I^2R, where R is the resistance in ohms.
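
For example, using the rough figures from earlier in the thread (2 V across the resistor at 200 mA, illustrative values only), both forms give the same answer:

Code:
    v, i = 2.0, 0.2        # illustrative voltage across the resistor and current through it
    r = v / i              # 10 ohms, from Ohm's law
    p_vi = v * i           # 0.4 W
    p_i2r = i**2 * r       # 0.4 W, the same result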

3. What is the standard unit of measurement for resistance?

The standard unit of measurement for resistance is the ohm (symbol: Ω). This unit is named after the German physicist Georg Ohm, who first discovered the relationship between voltage, current, and resistance.

4. Can a resistor have a wattage rating higher than its stated value?

A resistor's wattage rating (its power or power-dissipation rating) is the maximum power it can safely handle without overheating. The power it actually dissipates in a circuit can be anything up to that rating; exceeding the rating risks damaging the resistor, which is why a part rated comfortably above the expected dissipation is usually chosen.

5. Why do resistors have different wattage ratings?

Resistors have different wattage ratings because they are designed to handle different amounts of power. Higher wattage resistors can handle more power without getting damaged, while lower wattage resistors are more suitable for circuits with lower power requirements. The wattage rating is also an important factor to consider when selecting a resistor for a specific application.
