Efficiency of using a resistor to limit current

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
Yes, a resistor could theoretically be used to limit the current powering an LED. I'm going to talk about the efficiency of this approach.

------[////]------(O)------

The power lost in the resistor is going to be proportional to the voltage drop across it. Well, what is that going to be? The voltage drop across a resistor is the product of its resistance (in ohms) and the current flowing through it. How many ohms of resistance we need depends on the voltage the resistor has to drop, and that is the supply voltage to the circuit minus the operating forward voltage taken up by the LED.

In other words, the LED behaves somewhat like the resistor in this respect. If the rated forward voltage of the LED is 3 V and the circuit is supplied with 6 V, then the resistance required to limit the current to the level the LED needs means the LED and the resistor end up dropping equal voltage and consuming equal power. Half the power in the circuit is lost as heat in the resistor.
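Here is a rough sketch of that arithmetic in Python (the 6 V supply, 3 V forward voltage and 350 mA target used here are just assumed example numbers):

```python
# Sketch: sizing a series resistor and seeing where the power goes.
# Assumed example values: 6 V supply, 3 V LED forward voltage, 350 mA target.
V_SUPPLY = 6.0      # volts supplied to the circuit
V_LED = 3.0         # forward voltage taken up by the LED
I_TARGET = 0.350    # desired LED current in amps

# The resistor has to drop whatever voltage the LED does not.
v_resistor = V_SUPPLY - V_LED
r_needed = v_resistor / I_TARGET           # Ohm's law: R = V / I

p_led = V_LED * I_TARGET                   # power actually used by the LED
p_resistor = v_resistor * I_TARGET         # power burned off as heat
efficiency = p_led / (p_led + p_resistor)

print(f"Resistor: {r_needed:.2f} ohms")
print(f"LED power: {p_led:.2f} W, resistor power: {p_resistor:.2f} W")
print(f"Efficiency: {efficiency:.0%}")     # 50% with a 6 V supply and a 3 V LED
```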

So theoretically, for maximum efficiency one would want to reduce the voltage being supplied to the circuit as much as possible. Suppose the LED is 3.2v and we want to supply the circuit with 3.4 volts. What's the problem with that?

The problem is that the smaller the resistance, the less the circuit is able to deal with voltage fluctuations. With a resistance that small, a tiny increase in supply voltage leads to a very big increase in the current flowing through the circuit. In fact, that is the whole reason we are using a resistor in the first place. Theoretically, if we had an absolutely stable power supply that delivered exactly the voltage the LED was rated for, we wouldn't need a resistor; the LED's own resistance would be enough to limit the current. But that's not how things work in practice. Virtually any power supply coming from a transformer is going to have voltage spikes that go significantly above its nominal output voltage. LEDs don't get burned out by average power levels; even a tiny fraction of a second of excessive current can cause a burnout.
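A quick sketch of how touchy that gets (the 3.2 V forward voltage, 700 mA target and 0.2 V spike are assumed numbers, and the LED's forward voltage is treated as fixed, which is a simplification since a real LED's Vf creeps up a little with current):

```python
# Sketch: how sensitive the current is to a small supply spike when the
# series resistor has very little voltage headroom to work with.
# Assumed example values; the LED forward voltage is treated as constant,
# which is a simplification (a real LED's Vf rises somewhat with current).
V_LED = 3.2        # LED forward voltage (volts)
I_TARGET = 0.700   # intended current (amps)

for v_supply in (3.4, 6.0):
    r = (v_supply - V_LED) / I_TARGET     # resistor sized for the target current
    v_spiked = v_supply + 0.2             # a modest 0.2 V spike
    i_spiked = (v_spiked - V_LED) / r
    print(f"{v_supply} V supply -> {r:.2f} ohm resistor; "
          f"a 0.2 V spike pushes the current to {i_spiked * 1000:.0f} mA "
          f"({i_spiked / I_TARGET:.1f}x the target)")
```

With the 3.4 V supply the same 0.2 V spike roughly doubles the current; with the 6 V supply it barely moves it.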

So there is a bit of a trade-off: the larger the share of the power dropped in the resistor relative to the LED, the more voltage fluctuation the circuit can handle without risk of burning out the LED.

Normally a simple resistor is not used for these applications, except for very low-power little 5 mm indicator LEDs (where the efficiency level is not that critical since the power consumption is so small). This is just a perspective on the approach.
 

JoakimFlorence

Newly Enlightened
Joined
Jun 4, 2016
Messages
137
Now of course we could just use a higher-power LED rated for more current (and underdrive it), but in many cases this is more expensive.
(Simply connecting two separate lower-power LEDs in parallel isn't necessarily such a simple solution either, as anyone familiar with the complexities of running two diodes in parallel in a circuit will know.)
Still, even if we are underdriving the LED at half its rated current, the resistance is still going to have to be substantial. The relationship of voltage to current through an LED isn't the same simple linear relationship you get with a resistor. The more of the voltage drop the resistor takes up relative to the LED, the more linear the overall relationship between voltage and current in the circuit becomes. LEDs can be difficult because they have a narrow range of operating voltages: go below it and the LED will hardly conduct any current; start going above it and there is very little to stop the current from running away.
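A toy model of that narrow window, using the ideal diode equation with made-up parameters (a real datasheet curve, which also includes some series resistance, would soften this a bit, but the shape is the point):

```python
import math

# Toy model of an LED's current vs. forward voltage using the ideal diode
# equation I = Is * (exp(V / (n*Vt)) - 1).  The parameters are made-up
# illustrative numbers, pinned so the modeled LED passes 100 mA at 3.0 V.
N_VT = 2.0 * 0.026                        # ideality factor * thermal voltage, assumed
I_SAT = 0.100 / math.exp(3.0 / N_VT)      # saturation current implied by that anchor

for v in (2.6, 2.8, 3.0, 3.2, 3.4):
    i = I_SAT * (math.exp(v / N_VT) - 1)
    print(f"{v:.1f} V -> {i * 1000:14.3f} mA")
```

Below the knee the current is negligible; a couple of tenths of a volt above it the modeled current runs away, which is exactly why something has to limit it.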

Now, in the old days you could just haphazardly wire a 5 mm indicator LED in series with a resistor and haphazardly connect it to a 9 V battery. In most cases it probably didn't matter too much exactly what the resistance was; you could still see it light up. Of course, the LED was often being underdriven, way below its rated maximum power, which likely did not matter in that case since it was just an indicator light. And more power was being lost in the resistor than used in the actual LED.
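For that 9 V indicator case, a quick sketch with an assumed 2 V red indicator LED and a common 470 ohm resistor:

```python
# Sketch of the classic "9 V battery + resistor + 5 mm indicator LED" setup.
# Assumed numbers: a 2 V red indicator LED and a common 470 ohm resistor.
V_BAT = 9.0
V_LED = 2.0
R = 470.0

i = (V_BAT - V_LED) / R                    # roughly 15 mA, easily visible
p_led = V_LED * i
p_resistor = (V_BAT - V_LED) * i

print(f"Current: {i * 1000:.1f} mA")
print(f"LED: {p_led * 1000:.0f} mW, resistor: {p_resistor * 1000:.0f} mW")
# The resistor burns about 3.5x the power the LED uses, but at roughly
# 0.1 W total nobody cares, and a wide range of resistor values still
# gives a visible, safe current.
```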

Once the voltage drop (and thus power consumption) taken up by the LED, or string of LEDs wired in series, becomes large compared to the voltage drop across the resistor, you can't just pick a resistor so haphazardly without doing some calculations beforehand. That's because when the resistor is only dropping a small fraction of the supply voltage, any variation in the supply or in the LED's forward voltage is large compared to the resistor's drop, so it produces a proportionally large swing in current. The LED is then much more likely either not to light up visibly at all, or to burn out.
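A sketch of that comparison (the 20 mA target and the assumed +/- 0.1 V unit-to-unit spread in forward voltage are illustrative numbers):

```python
# Sketch: how an assumed +/- 0.1 V spread in LED forward voltage affects the
# current when the resistor drops most of the voltage vs. hardly any of it.
CASES = [
    ("9.0 V supply, 2.0 V LED", 9.0, 2.0),
    ("3.4 V supply, 3.2 V LED", 3.4, 3.2),
]
I_TARGET = 0.020   # 20 mA target current, assumed

for label, v_supply, v_led in CASES:
    r = (v_supply - v_led) / I_TARGET          # resistor sized for the nominal Vf
    i_low = (v_supply - (v_led + 0.1)) / r     # LED with a slightly higher Vf
    i_high = (v_supply - (v_led - 0.1)) / r    # LED with a slightly lower Vf
    print(f"{label}: {r:.0f} ohm resistor, "
          f"current ranges {i_low * 1000:.1f} - {i_high * 1000:.1f} mA")
```

With 7 V across the resistor the current barely changes; with only 0.2 V across it, the same 0.1 V spread swings the current from half the target to 1.5 times the target.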

Now you can see why it's much easier to power an LED with a simple resistor when the efficiency of the circuit is low, that is, when the resistor is dropping a large share of the voltage.
 

rayman

Flashlight Enthusiast
Joined
May 6, 2008
Messages
1,219
Location
Germany
I like your explanations, well done on making the matter understandable :twothumbs
 

HarryN

Flashlight Enthusiast
Joined
Jan 22, 2004
Messages
3,976
Location
Pleasanton (Bay Area), CA, USA
In a perfect world, constant current drivers are optimal for LED use. As a practical matter, I have built plenty of 1, 3, and 5 watt LED flashlights using resistors, as well as some desk lights from a DC power supply.

The extra steps required to make such a setup work are:
- The design needs to be slightly conservative vs running things at max
- A willingness to look at the "curves" of the power source and LEDs, and use the interaction of these curves in the calculations, not just a single Vf and Vbat number in the design (a rough sketch of this follows the list)
- A willingness to design your lights so that the LED is replaceable. LEDs used to be expensive (I remember being thrilled to pay $60 for a premium-bin 150 lumens), but today similar capability is $1-2. They are cheap / disposable, so playing with them as a hobby is inexpensive.
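A rough sketch of that "use the curves, not single numbers" idea: find the operating point where the battery curve, the series resistor, and the LED curve all agree. Both device models below are made-up linearized stand-ins for real datasheet curves, just to show the calculation.

```python
# Sketch: solve for the operating point of battery + series resistor + LED.
# Both device models below are assumed, linearized stand-ins for real curves.

def v_battery(i):
    """Assumed battery: 3.2 V open-circuit with 0.3 ohm internal resistance."""
    return 3.2 - 0.3 * i

def v_led(i):
    """Assumed LED: 2.9 V knee plus 0.25 ohm dynamic resistance."""
    return 2.9 + 0.25 * i

R_SERIES = 0.1   # ohms, the current-limiting resistor being evaluated

# Solve v_battery(i) = v_led(i) + i * R_SERIES for i by bisection.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    if v_battery(mid) - v_led(mid) - mid * R_SERIES > 0:
        lo = mid   # the battery can still push more current at this point
    else:
        hi = mid
i_op = (lo + hi) / 2
print(f"Operating point: {i_op * 1000:.0f} mA "
      f"at {v_led(i_op):.2f} V across the LED")
```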

A lot of LEDs are overdriven beyond their manufacturer's spec - a few seconds don't matter if you are purchasing from a tier 1 brand (Osram / Lumileds / Cree) and the LED has sufficient heat removal in the design.

You are right though - the margins do get tighter as the difference between Vbat and Vf gets narrower. Fortunately, modern batteries such as CR123s and AA lithium cells have pretty flat discharge curves, so it isn't complicated. In fact, I have heard very reasonable arguments that a resistor-based LED light is likely to be more reliable, since the part count is very low and no software is required.
 