# Using a PC power supply for power LEDs



## wildstar87 (May 26, 2011)

So I'm in an electronics program now and playing around with stuff. I got interested in converting an ATX power supply into a benchtop power supply to tinker with, and then I thought, hey, I could probably use it to tinker with LEDs too.

So I have a question about this. Since the power supply puts out +5V or +12V, obviously to run one LED I would have to step down the voltage... or would I? Ever since I started playing with high power LEDs, everyone talks about current, and a few people even say voltage doesn't really matter. Since I know a little about electronics (Ohm's law and the like), I know that reducing current is going to reduce voltage, or vice versa, anyway.

Can I just hook up a rheostat between the +5V or +12V wire and the LED and use it to vary the current, with the voltage lining up on its own? I know rheostats are supposed to adjust current, so in my mind this seems like it should work, but I don't know enough to say for sure. I'd love to get some more educated folks to chime in on this.
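For reference, the Ohm's-law math behind the rheostat/resistor idea looks like this. The LED values here (a ~3.2V white LED at 350mA) are illustrative assumptions, not from any particular part:

```python
# Rough series-resistor sizing for one LED on an ATX rail, from Ohm's law.
# Assumed example LED: ~3.2 V forward drop at 350 mA (typical white power LED).
def series_resistor(v_supply, v_led=3.2, i_led=0.350):
    """Return (resistance in ohms, watts the resistor must dissipate)."""
    v_drop = v_supply - v_led     # voltage the resistor has to drop
    r = v_drop / i_led            # R = V / I
    p = v_drop * i_led            # P = V * I, burned off as heat
    return r, p

for rail in (5.0, 12.0):
    r, p = series_resistor(rail)
    print(f"{rail}V rail: {r:.1f} ohm resistor, dissipating {p:.2f} W")
```

Note how much power the resistor itself has to handle on the 12V rail; that is the practical problem the replies below get into.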

Thanks!


----------



## mds82 (May 26, 2011)

You will still need an LED driver. The easiest thing I've found is to use an AC-DC converter, something like 12V at 1+ amps, and then a driver such as a BuckPuck. Fairly easy to do it that way.


----------



## Oznog (May 26, 2011)

Resistors (including variable resistors) do work, but as you get into power LEDs, in the 1W range and up, they become wildly impractical.

A 1W, 3.2V white LED run from the 12V rail would need a resistor to burn off roughly 3W here. That's an expensive resistor, and the system is only about 25% efficient at delivering power to the LED. And that's just a 1W LED. You probably won't find a 3W rheostat/potentiometer; common ones are rated well under 1W, and high power rheostats are rare and expensive.
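The numbers behind that claim are easy to check (12V rail and a 1W, 3.2V LED, as in the post):

```python
# A 1 W LED at 3.2 V draws I = P / V; the series resistor drops the
# remaining 8.8 V at that same current and burns it off as heat.
v_supply = 12.0
v_led = 3.2
i = 1.0 / v_led                        # ~0.31 A through the string
p_resistor = (v_supply - v_led) * i    # ~2.75 W wasted in the resistor
efficiency = v_led / v_supply          # ~27% of input power reaches the LED
print(f"I = {i * 1000:.1f} mA, resistor burns {p_resistor:.2f} W, "
      f"efficiency {efficiency:.0%}")
```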

That's why buck drivers became essential.


----------



## LilKevin715 (May 26, 2011)

Besides controlling the voltage (there is a 3.3V orange wire too), you might have to take into account the amount of voltage ripple coming out of the PSU. I'm not sure whether it would affect the LED and/or driver. I do know that the max ripple allowed by the ATX spec is 50mV for the 3.3V and 5V lines, while the 12V line allows a max of 120mV. This varies with the quality of the SMPS.
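Putting those spec limits in perspective as a fraction of each rail (limit figures as quoted in the post):

```python
# ATX spec worst-case ripple per rail, expressed relative to the rail voltage.
ripple_limits = {3.3: 0.050, 5.0: 0.050, 12.0: 0.120}  # volts peak-to-peak

for rail, ripple in ripple_limits.items():
    print(f"{rail}V rail: {ripple * 1000:.0f} mV ripple "
          f"= {ripple / rail:.2%} of the rail")
```

Worst case is the 3.3V rail at about 1.5%; a decent supply will be well under these limits.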


----------



## 2xTrinity (May 27, 2011)

Oznog said:


> Resistors (including variable resistors) do work, but as you get into power LEDs, in the 1W range and up, they become wildly impractical.
> 
> A 1W, 3.2V white LED run from the 12V rail would need a resistor to burn off roughly 3W here. That's an expensive resistor, and the system is only about 25% efficient at delivering power to the LED. And that's just a 1W LED. You probably won't find a 3W rheostat/potentiometer; common ones are rated well under 1W, and high power rheostats are rare and expensive.
> 
> That's why buck drivers became essential.


 
In situations where the power supply voltage is only slightly above the expected voltage dropped across the LEDs (say you have 3 LEDs in series, each dropping 3.3V, for a total of about 10V), you could use a linear regulator to control the current. Efficiency will be about 10V/12V, or in general (voltage dropped in the LEDs / power supply voltage). This works basically like your rheostat idea, but electronically controlled and stable. With passive resistors, the brightness of the LEDs may drift as temperature goes up, etc.
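That efficiency formula in code, using the series-string example from this post:

```python
# Linear regulator efficiency: the pass element drops (and wastes) all
# voltage above what the LED string needs, so efficiency is just
# (total LED voltage) / (supply voltage).
def linear_efficiency(v_supply, v_led_each, n_leds):
    return (v_led_each * n_leds) / v_supply

# Three 3.3 V LEDs in series on the 12 V rail, as described above:
print(f"{linear_efficiency(12.0, 3.3, 3):.1%}")
```

With a single 3.2V LED on 12V you'd be back near 27%, which is why stacking LEDs in series close to the supply voltage is the key to making a linear regulator worthwhile.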

Do some searches on CPF; there are some designs floating around, such as this one: http://www.candlepowerforums.com/vb...-Simple-Power-MOSFET-Linear-Current-Regulator.


The trick is to place a resistor in series with the LED, called a sense resistor: a 1/2-ohm, 1-watt resistor, for example. From Ohm's law, when 1 amp of current flows through the LED, 0.5V is dropped across the sense resistor.

Connect the voltage across the sense resistor, Vsense, to the inverting (-) input of an op-amp. If your LED only requires ~50mA, you can drive it directly from the output of the op-amp. Otherwise, use the op-amp to drive the gate of a MOSFET in series with the LEDs. The MOSFET will behave like a big variable resistor that limits the current.

Now, if you apply a voltage between 0 and 0.5V to the op-amp's noninverting (+) input, it will cause a current between 0 and 1A to flow through the LED. This is probably unclear without a circuit diagram (I don't have the means to post one now; let me know if you're interested, though).
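The control relationship described above is simple enough to sketch numerically. The op-amp drives the MOSFET until Vsense equals the setpoint voltage, so the regulated current is just setpoint / Rsense (using the 0.5-ohm sense resistor from the post):

```python
# Feedback current sink: the op-amp forces V_sense == V_setpoint,
# so the LED current is fixed at I = V_setpoint / R_sense.
R_SENSE = 0.5  # ohms, as in the post (1 W part)

def regulated_current(v_setpoint):
    """Current (A) the loop will hold for a given control voltage."""
    return v_setpoint / R_SENSE

def setpoint_for(current_amps):
    """Control voltage (V) needed to target a given LED current."""
    return current_amps * R_SENSE

print(regulated_current(0.5))   # 0.5 V setpoint -> 1.0 A
print(setpoint_for(0.35))       # 350 mA target -> 0.175 V
```

A potentiometer across a small reference voltage can supply that 0-0.5V setpoint, giving a smoothly dimmable constant-current source.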


----------



## doctaq (May 28, 2011)

If you make it into a benchtop supply with variable voltage, you could use it at low voltage to test connections or individual LEDs, up to 3 LEDs in series.
I use my variable benchtop supply to test LEDs and setups. I wouldn't use it long term to power them, as it is loud and overly expensive.

LED voltage is important, but people talk about current because LEDs do not act like resistors. For instance, at 3V an LED might draw 0.1A, but at 3.5V it might draw 0.7A; it's not a linear relationship. People also talk about current because different LEDs of the same type can sit at different voltage/current points, and the relationship changes with temperature, age, and history. High power LEDs should be driven at a particular current (constant current) rather than constant voltage.


----------

