# LEDs more or less electronically "noisy" than incan/CFL



## wildstar87 (Apr 15, 2008)

I was having a discussion with a friend the other night after seeing the thread on using LEDs for house light replacements. I'm a bit sensitive to noise on the electrical line since I use X-10 automation in my house, and it can be problematic at times. CFLs seem to make this worse, as does Low Voltage Halogen.

I theorized that LEDs would be less noisy, but my friend said because they are semiconductors that they would be REALLY noisy. Somehow this didn't make sense to me, but I dunno, I'm not an EE so I have no real clue.

Anyone with some actual knowledge on this, want to chime in? I imagine it would have something to do with the drivers as well obviously, but I was also wondering if there was something inherent in an LED that would make it so. :thinking:


----------



## TorchBoy (Apr 15, 2008)

My first thought is that the LED should be almost silent noise-wise. Its driver, on the other hand, might not be.


----------



## snarfer (Apr 15, 2008)

TorchBoy is entirely correct. The electronic noise really doesn't have much to do with the LEDs, CFLs, low voltage halogens, or whatever is actually emitting light, it has to do with the switch mode power supplies that power these devices. Most likely everything in your house with a switching supply will cause interference. That could include: electronic fluorescent ballasts, computer power supplies, battery chargers, audio/video equipment, etc... 

If you can figure out which are the worst offenders, you could power them with isolation transformers. Other than that, good luck.


----------



## 2xTrinity (Apr 15, 2008)

wildstar87 said:


> I was having a discussion with a friend the other night after seeing the thread on using LEDs for house light replacements. I'm a bit sensitive to noise on the electrical line since I use X-10 automation in my house, and it can be problematic at times. CFLs seem to make this worse, as does Low Voltage Halogen.


CFLs are likely to be the worst offender. Modern fluorescent tubes themselves are driven by high-frequency AC, around 25 kHz or so, not directly off of 60Hz like "old school" fluorescents, for several reasons:

1) Higher efficiency
2) No visible flickering
3) No audible humming (25 kHz is above the human audible range)
4) Smaller components, such as inductors and capacitors, are needed at higher frequencies

They simply rectify all incoming AC to DC, then use big transistors to "chop up" the DC into a high-frequency AC square wave. This is prone to creating a lot of noise at 25 kHz (or whatever the driving frequency of the lamp is) and all harmonics of that.
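As a rough sketch of where those harmonics sit (assuming an ideal 50% duty square wave, which a real ballast only approximates), the textbook Fourier series puts all the energy at odd multiples of the 25 kHz drive, falling off only as 1/n:

```python
import math

def square_wave_harmonics(f0, n_max):
    """Fourier amplitudes of an ideal 50% duty square wave:
    only odd harmonics are present, with amplitude 4/(pi*n)."""
    return {n * f0: 4 / (math.pi * n) for n in range(1, n_max + 1, 2)}

# harmonics of a 25 kHz ballast drive
for freq, amp in square_wave_harmonics(25_000, 9).items():
    print(f"{freq / 1000:6.0f} kHz  relative amplitude {amp:.3f}")
```

So even a "25 kHz" ballast radiates meaningful energy at 75 kHz, 125 kHz, and beyond, which is why it can stomp on powerline signaling frequencies well above its nominal switching rate.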

This is one of the reasons I dislike retrofit CFLs -- permanently installed ballasts have better filtering and are usually not so bad for noise, but the cheap ballasts used in $1 CFLs are pretty poor quality, so they skimp on all of that.



> I theorized that LEDs would be less noisy, but my friend said because they are semiconductors that they would be REALLY noisy. Somehow this didn't make sense to me, but I dunno, I'm not an EE so I have no real clue.


The fact that they are semiconductors is irrelevant -- they run on a constant current DC, so there is nothing about the LEDs themselves that is prone to creating noise. What causes noise is lots of abrupt switching, especially if the switching occurs at a frequency that is used by another device on the same circuit for sending signals.

My guess is that most LED lights that already exist use poor-quality power supplies to step down the voltage from 120 VAC to ~12 VDC (or whatever voltage the string of LEDs requires). For that matter, that would be the only reason a low voltage halogen should make noise -- the halogen itself is a simple resistor, most likely running on a steady 12 VAC, so in that case it must be the transformer driving it causing problems.



> Anyone with some actual knowledge on this, want to chime in? I imagine it would have something to do with the drivers as well obviously, but I was also wondering if there was something inherent in an LED that would make it so. :thinking:


Short answer: no.
If you wanted to, you could run an LED called the Acriche from SSC -- it runs directly off of 120 VAC and would not create any noise, but because it doesn't have a regulated power supply, it will flicker like crazy at 120 Hz.


----------



## HKJ (Apr 15, 2008)

wildstar87 said:


> I was having a discussion with a friend the other night after seeing the thread on using LEDs for house light replacements. I'm a bit sensitive to noise on the electrical line since I use X-10 automation in my house, and it can be problematic at times. CFLs seem to make this worse, as does Low Voltage Halogen.
> 
> I theorized that LEDs would be less noisy, but my friend said because they are semiconductors that they would be REALLY noisy. Somehow this didn't make sense to me, but I dunno, I'm not an EE so I have no real clue.
> 
> Anyone with some actual knowledge on this, want to chime in? I imagine it would have something to do with the drivers as well obviously, but I was also wondering if there was something inherent in an LED that would make it so. :thinking:



Low voltage halogen does NOT make noise, but if you are using an electronic ballast, it will make noise; replace it with an old-style transformer and you will have no noise.

LEDs do not have to make noise, but the driver for LEDs can make noise. Usually you will only have a rectifier, which makes a small amount of noise, but you could have a switch mode driver and it might make a lot of noise!

I.e. just saying it is LEDs does not say anything about the noise level; you have to look at the design of the driver!

A driver like this:

[schematic, image lost: a simple mains dropper -- series capacitor into a bridge rectifier and resistor feeding the LED]

only has a very small amount of noise.


----------



## 2xTrinity (Apr 15, 2008)

HKJ said:


> A driver like this:
> 
> 
> 
> ...



A driver like that, which is unfortunately the norm for most LED products for sale today, has several very significant problems:

*1) Poor efficiency* -- Your resistor is consuming 4 times as much power as your LED. 
*2) 120 Hz flicker* -- Worse than magnetic fluorescents -- No capacitor on the output side means the diode is receiving 120Hz pulsating DC, rather than steady DC.
*3) Low power factor* -- The capacitor in series on the AC side limits power consumption in your LED by shifting the AC voltage to be out of phase with the AC current. Power consumed = (Current) (Voltage) (Power Factor). 

In the above circuit, power factor is about 0.05, which means compared to a simple resistor dissipating ~100mW (same as your LED/bridge combo), your wires will need to carry 20 times as much current. If the power factor for your whole house is too low, the power company will charge you extra as it costs THEM money to handle all that extra current on their wires too.
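That 20x figure follows directly from rearranging P = V x I x PF; a quick back-of-envelope check using the ~100 mW load and 0.05 power factor quoted above:

```python
def line_current(power_w, volts_rms, power_factor):
    """RMS line current needed to deliver a given real power:
    P = V * I * PF  =>  I = P / (V * PF)"""
    return power_w / (volts_rms * power_factor)

# ~100 mW load on a 120 VAC line
i_resistive = line_current(0.1, 120, 1.00)  # ideal resistor, PF = 1
i_dropper   = line_current(0.1, 120, 0.05)  # capacitive dropper, PF ~ 0.05

print(f"{i_dropper / i_resistive:.0f}x the current")  # -> 20x the current
```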


Instead of a circuit like that, I'd recommend a traditional "wall wart", which uses a magnetic transformer to step the voltage down (rather than a big capacitor), then runs it through a filtered bridge rectifier to produce a constant DC output. Magnetic wall warts will produce very little noise, but they are also not regulated, so a series resistor or linear regulator like this will be needed between the wall wart and the LED.
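Sizing that series resistor is just Ohm's law across the resistor; a minimal sketch, where the 12 V supply, the ~3.4 V forward drop, and the 20 mA target are hypothetical example values:

```python
def series_resistor(v_supply, v_forward, i_led):
    """Ohm's law across the dropping resistor: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_led

# e.g. a 12 V wall wart feeding a white LED (~3.4 V drop) at 20 mA
r = series_resistor(12.0, 3.4, 0.020)
print(f"{r:.0f} ohms")  # -> 430 ohms
```

In practice you'd round to the nearest standard resistor value and check the dissipation in the resistor as well.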

I've also used the "lightweight" wall-warts used to charge cell-phones to drive LEDs. Those use switching power supplies, and are generally better regulated and more efficient, but they're also more likely to put noise on your household circuit.


----------



## VidPro (Apr 15, 2008)

and just to add, the filtration level is important too, whatever the heck that is :thinking:.
we had the worst time with X-10 type through-the-power-line RF transmission with all the surge protectors' filtering stuff.
so maybe it's not always the noise, but something they use in a circuit to reduce the noise, that kills your RF.
i dunno, but even when we had all the x-10 type of stuff connected directly, free from surge protection, once the surge protection items were plugged in, it would lose the connection to the modules :-(
and lots of this stuff that makes noise has some sort of filter junk to pass FCC or CE inspection, so it doesn't cause interference in general, and back into the power line.

so i am saying it might not always be about the item making noise, but what they might have done to filter the noise they do make, when they are on the line, filtering the X-10's RF too.


----------



## 2xTrinity (Apr 16, 2008)

VidPro said:


> and just to add, the filtration level is important too, whatever the heck that is :thinking:.
> we had the worst time with X-10 type through-the-power-line RF transmission with all the surge protectors' filtering stuff.
> so maybe it's not always the noise, but something they use in a circuit to reduce the noise, that kills your RF.


I hadn't thought of this, but that's a good point. Power conditioners, such as those used to remove noise from the line for things like home theater equipment, use a series of filters to remove everything except the basic 60Hz sinusoid, including the signal from X10 systems. Likewise, if a device is prone to make noise at a frequency similar to the one used by X10, not only would generating that noise be a problem, implementing circuitry to suppress noise at that frequency could also suppress the X10 signal.
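To illustrate the tradeoff: X10 signals ride on a 120 kHz carrier, so a first-order low-pass filter that scrubs noise in that band also scrubs the X10 signal while leaving 60 Hz mains untouched. A sketch with purely hypothetical R and C values:

```python
import math

def rc_lowpass_gain(f_hz, r_ohm, c_farad):
    """Magnitude response of a first-order RC low-pass filter:
    |H(f)| = 1 / sqrt(1 + (2*pi*f*R*C)^2)"""
    return 1 / math.sqrt(1 + (2 * math.pi * f_hz * r_ohm * c_farad) ** 2)

R, C = 100, 0.1e-6  # hypothetical: 100 ohm series R, 0.1 uF shunt C
print(rc_lowpass_gain(60, R, C))       # ~1.0: mains passes untouched
print(rc_lowpass_gain(120_000, R, C))  # well under 0.2: X10 carrier killed
```

Any filter steep enough to clean up switching noise near 120 kHz will attenuate X10 commands the same way, which matches VidPro's experience with surge protectors.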

I remember we had a similar issue one time when a relative of mine was trying to switch to dimmable CFLs on an X10-based dimmer. There had to be at least one traditional incandescent in the circuit. Because incandescents are literally just a "hot wire", the signals over the power lines can actually transmit completely "through" the incandescent lamps, which act as a feedback pathway. CFLs, however, which have a driver circuit, obviously won't "pass through" any signals.


----------



## HKJ (Apr 16, 2008)

2xTrinity said:


> A driver like that, which is unfortunately the norm for most LED products for sale today, has several very significant problems:
> 
> *1) Poor efficiency* -- Your resistor is consuming 4 times as much power as your LED.
> *2) 120 Hz flicker* -- Worse than magnetic fluorescents -- No capacitor on the output side means the diode is receiving 120Hz pulsating DC, rather than steady DC.
> *3) Low power factor* -- The capacitor in series on the AC side limits power consumption in your LED by shifting the AC voltage to be out of phase with the AC current. Power consumed = (Current) (Voltage) (Power Factor).



1) This circuit has a very good efficiency for driving a single (low power) LED or a couple of LEDs in series. It would be hard to reach the same efficiency with a transformer, because the transformer also has losses. 
If you want to drive many watts of LEDs, then you would need a better circuit.

2) Correct, but few people have a problem with that.

3) The low power factor does not matter for a 0.1 watt circuit; I believe that in Europe you have to be above 60 watts before you have to look at the power factor.




2xTrinity said:


> Instead of a circuit like that, I'd recommend a traditional "wall wart", which uses a magnetic transformer to step the voltage down (rather than a big capacitor), then runs it through a filtered bridge rectifier to produce a constant DC output. Magnetic wall warts will produce very little noise, but they are also not regulated, so a series resistor or linear regulator like this will be needed between the wall wart and the LED.



You would get rid of the 120 Hz flicker, but it might lower the efficiency.


----------



## 2xTrinity (Apr 16, 2008)

HKJ said:


> 1) This circuit has a very good efficiency for driving a single (low power) LED or a couple of LEDs in series. It would be hard to reach the same efficiency with a transformer, because the transformer also has losses.


This is true enough. Transformers are only efficient at a fairly high fraction of their peak load. For efficiency, switch-mode supplies, such as the ones used for cell phone chargers, are by far the best.



> If you want to drive many watts of LEDs, then you would need a better circuit.


This is what I think the original poster is going after -- he was talking about LEDs in the same sentence as halogens and CFLs, essentially general light sources of at least several watts. There are larger LED devices on the market that use circuits just like the one you've shown. For example, LED Christmas light strings commonly consume ~0.5W in the LEDs and ~2W in resistors, with fairly low power factors. Basically the biggest efficiency advantage over incans at that point is from the lack of color filtering losses. 
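For what it's worth, the arithmetic on those figures puts the driver efficiency of such a string at just 20%:

```python
# figures quoted above for a typical LED Christmas light string
led_w, resistor_w = 0.5, 2.0
total_w = led_w + resistor_w

print(f"driver efficiency: {led_w / total_w:.0%}")  # -> driver efficiency: 20%
```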



> 2) Correct, but few people have a problem with that.


As an indicator, probably not, but for general lighting it would be quite problematic.



> 3) The low power factor does not matter for a 0.1 watt circuit; I believe that in Europe you have to be above 60 watts before you have to look at the power factor.


True, I was thinking more of applying a design like this to a larger circuit. If you were to scale this up to LEDs consuming a few watts, you'd need a considerable amount of current. 



> You would get rid of the 120 Hz flicker, but it might lower the efficiency.


LEDs tend to be more efficient at constant current than at a pulsating current with the same RMS value. The efficiency loss from adding the filters could very well be made up for by the increased efficiency in the LED itself.

The little switching power supplies used in cell phone chargers are certainly quite efficient -- they don't warm up noticeably, even when run at full load, and by pairing the right one with the right LED you can actually get away with direct drive for even better efficiency.


----------



## HKJ (Apr 16, 2008)

2xTrinity said:


> There are larger LED devices on the market that use circuits just like the one you've shown. For example, LED Christmas light strings commonly consume ~0.5W in the LEDs and ~2W in resistors, with fairly low power factors. Basically the biggest efficiency advantage over incans at that point is from the lack of color filtering losses.



It is not a good idea to compare to xmas lights; the incandescents used there run at a very, very bad efficiency. 
Again, as long as it is low power lights, it is a good circuit. But for a 10 watt light, it might be a good idea to find something else.

And in the first place, this thread was about noise from leds, and this circuit does not generate much noise.





2xTrinity said:


> The little switching power supplies used in cell phone chargers are certainly quite efficient -- they don't warm up noticeably, even when run at full load, and by pairing the right one with the right LED you can actually get away with direct drive for even better efficiency.



Yes, a small switch mode supply and only dropping 1 volt over a resistor would make a very good light. Then again, if you are designing a switch mode supply (as a manufacturer), why not make it with constant current output, making it possible to drive the LEDs directly.

But the problem with switch mode supplies can be noise and harmonic currents. Again, it depends on the design!


----------



## mdrejhon (Apr 22, 2008)

I think high-quality flicker-free dimmable LED bulbs, compatible with today's wall dimmers, need to come out soon.

I think this is doable - LEDs can be driven by a very efficient switch-mode power supply that has the ability to be powered off triac'd AC and various types of dimmed AC (including chopped AC, as well as PWM). Basically, convert the triac dimming into LED-compatible PWM dimming. So you have 0% through 100% LED dimming, with the ability to run off a standard wall dimmer, as well as standard X10 and Insteon units.

Through more sophisticated driver electronics, this is definitely possible to do, and still achieve very good PFC (at least as much as the driver's efficiency is concerned, independent of the wall dimmer -- triac dimmers don't have very good PFC, while other types of dimmers have much better PFC). 

Some wall dimmers need a low enough load resistance before the triac triggers on. So you'd need enough wattage to trigger the dimmer, or a dummy 'load surge' simulator that activates when the light is first turned on (these are used in some dimmable compact fluorescents to trigger the triac when the light is first switched on at dimmed levels). This is to make sure the triac turns on at low brightness levels, because the wattage of CFL and LED bulbs is often too low to activate the triac when you're trying to design a dimmable bulb.

Essentially, the input electricity would be absorbed at a good PFC into the switch-mode power supply (electronics and capacitors), while other electronics would measure how much dimming is required (based on how the input electricity is currently being modulated) and filter and re-modulate the input electricity to a high frequency PWM for the LED, enabling 0% through 100% incandescent-style dimming in an LED lightbulb, off a standard wall dimmer. The PWM should be driven at the tens of kilohertz level minimum, much like for a CFL, for best results.

Smart electronics converting standard dimmer-modified power (i.e. triac'd) into high-frequency power (in one form or another) for the light has been done before -- I think this is sort of how some of the better dimmable CFLs operate -- such as the European Varilight CFL, which can dim to 2% of its original brightness. But 2% brightness is too bright for many situations. Also, I think a small red LED in the white LED fixture (or the use of a warmer white LED in conjunction with a neutral LED -- or the use of RGBW LED arrays) could be useful to simulate the 'reddening' of an incandescent as it is dimmed. In theory, doing things this way could also provide for white lighting with adjustable color temperature -- which could be convenient.

Well designed switch-mode power supplies can have a PFC of 0.99 -- much like recent brand-name computer supplies, such as my OCZ 750W power supply, which injects virtually zero interference into the automation system, with no need for an X10 plug-in filter for my computer. Good switch mode power supplies cost more money though.

In the long term, these bad-PFC triac-based wall dimmer fixtures will need to be replaced by better, more efficient dimmers with good PFC, but future dimmable household LED lighting should be designed to be compatible with as many of these dimmers as possible -- it is doable, as mentioned above.


----------



## 2xTrinity (Apr 23, 2008)

> Essentially, the input electricity would be absorbed at a good PFC into the switch-mode power supply (electronics and capacitors), while other electronics would measure how much dimming is required (based on how the input electricity is currently being modulated) and filter and re-modulate the input electricity to a high frequency PWM for the LED, enabling 0% through 100% incandescent-style dimming in an LED lightbulb, off a standard wall dimmer. The PWM should be driven at the tens of kilohertz level minimum, much like for a CFL, for best results.


LEDs run better on constant current than on PWM, if possible. Basically, the circuit you described, plus an additional level of smoothing capacitors on the LED output, would be a pretty amazing product.

However, there are significant problems that will hinder LEDs from competing with CFLs. The biggest is heat, which is even a problem for recessed R30 CFLs. Some compensate by using a different mercury amalgam, but this directly leads to the "long warmup time" problem.

LEDs are much more sensitive to heat than fluorescents. The only solution I can think of is not a very pleasant one -- automatically cap how bright the bulb can run if it gets too hot. That will **** a lot of people off.



> a small red LED in the white LED fixture (or the use of a warmer white LED in conjuction with neutral LED -- or the use of RGBW LED arrays) could be useful to simulate the 'reddening' of incandescent as it is dimmed. In theory, doing things this way could also could also provide for white lighting that has adjustable color temperature -- which could be convenient.


That's a pretty nightmarish task for an already complicated circuit that's supposed to fit in the size of a light bulb and sell for a few bucks. White balance is quite difficult to pull off, especially as different color LEDs have different responses to temperature. You'd have to have either optical feedback, or an exhaustive table of your LEDs' efficiencies vs. temperature. 

I'm certainly not against color temp shifting, I actually think having a light that would run at ~5000k during the daytime, to match incoming sunlight in the windows, yet shift to ~3500k at night would be neat. But something that complicated IMO should be a feature in a dedicated fixture, NOT a retrofit.

As for color shifting incandescents on dimmers, most of the time I'd consider that a negative side effect. For example, if there are multiple "groups" of lights on dimmers, some running at full intensity, others at partial, I would much rather them all be at the same color temperature for the sake of uniformity. 



> In the long term, these bad-PFC triac-based wall dimmer fixtures will need to be replaced by better, more efficient dimmers with good PFC, but future dimmable household LED lighting should be designed to be compatible with as many of these dimmers as possible -- it is doable, as mentioned above.


What I am afraid of is that all the attention will go to shoehorning LEDs into the incan retrofit market, and there won't be very many good dedicated LED fixtures available at reasonable prices. This has long been true with fluorescents -- I personally would like to see purpose-built compact- and linear-fluorescent fixtures with dimming ballasts specifically marketed for home use, but virtually none exist.

With LEDs, rather than keeping my old fixtures running, I'd rather see energy go into making flush-mounted LEDs on the ceiling with highly efficient TIR optics at just the right angle to evenly light up, say, a desk, workbench, or table with NO wasted light, or ambient lighting provided by RAGB OLED "wallpaper" coating the entire ceiling.


----------



## mdrejhon (Apr 23, 2008)

2xTrinity said:


> LEDs run better on constant current than on PWM, if possible. Basically, the circuit you described, plus an additional level of smoothing capacitors on the LED output, would be a pretty amazing product.


Constant current is fine, as long as the smoothing capacitors don't interfere with the final brightness of the LED. There's a pretty narrow range of usable voltages, and the LED often changes color at different voltages. Basically, a very dim green LED sometimes becomes yellowish or orangeish. 

But you are right, constant illumination is better for many reasons. If the LED is adequate for driving at a constant current, that works fine. PWM, however, is a lot easier to control predictably: it provides much more linear LED dimming without change in LED tint, even if the efficiency is not as good from a lumens/W perspective. Then again, you don't really need 100% linearity, as long as you can stretch the LED's entire dimming range over nearly the entire range of the wall dimmer (i.e. the LED reaching 0% when the dimmer switch is pulled down to 20% or thereabouts -- much like what happens with incandescents, partially because the current is no longer enough to make the filament hot enough to glow. The leakage current that's not enough to light up a tungsten filament should still be enough to power the electronics of a dimmable LED, to still allow full "dim-down-to-0%" controllability).

BTW -- I don't really like PWM at low frequencies. People can indirectly detect PWM if it's run at only hundreds of hertz, or even the low single-digit kilohertz. Essentially, just roll your eyes around very fast in front of a flickering LED source (such as an old red LED alarm clock, or LED Xmas lights) -- or just pick up and shake a lit LED Xmas light string; you can easily detect the 120Hz by the 'shimmer effect' it causes (60Hz x 2, counting both AC crossings): dotted blurs rather than straight-line blurs, or stroboscopic wagon-wheel effects, like spinning a fast-flickering LED glowstick -- many of those flicker faster than 200Hz, yet one can detect the discontinuity of the illumination by its effects during motion. I even get annoyed at certain cheap PWM-driven car/bus taillights that are driven at low PWM frequencies such as only 200 Hz (some of these taillights use PWM for the dimmed non-braked mode, and disable PWM when braked) - sure, they don't flicker when I stare at them, but when I flit my eyes around fast, the dotted blur becomes visible in my vision field. Thus, proper PWM control of an LED light should be at least in the several tens of kilohertz, or hundreds of kilohertz, to eliminate any distracting effects. However, care must be taken not to produce interference with the PWM (for example, infrared remote controls transmit at 38 kHz; you don't want a PWM LED near that frequency, or at harmonics/multiples thereof...). Once you get into the tens and hundreds of kilohertz, PWM becomes essentially a nonissue for the purposes of human vision.
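The 38 kHz concern above amounts to checking whether any harmonic of the PWM frequency lands near the IR carrier; a quick sketch (the +/- 2 kHz band here is an arbitrary illustration, not a real IR receiver bandwidth):

```python
def harmonics_near(pwm_hz, target_hz, band_hz, n_max=10):
    """Return harmonics of the PWM frequency that land within
    +/- band_hz of a target frequency (e.g. an IR remote carrier)."""
    return [n * pwm_hz for n in range(1, n_max + 1)
            if abs(n * pwm_hz - target_hz) <= band_hz]

print(harmonics_near(19_000, 38_000, 2_000))  # 2nd harmonic hits 38 kHz exactly
print(harmonics_near(50_000, 38_000, 2_000))  # no harmonic near the IR band
```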

One problem I can see with high-frequency PWM, for example, is beat effects between two fixtures: one PWMing at 100,000 Hz and the other at 100,001 Hz due to manufacturing tolerances. These two frequencies would be only 1 Hz apart, and that's going to cause the room's lighting to flash at 1 Hz from the beat effect. One solution is to sync it to a big multiple of the AC (but that won't work with all types of dimmers), or randomize the PWM frequency every time you turn on the light... All kinds of tradeoffs that manufacturers of household LED fixtures will have to make decisions on... Constant current may ultimately be the way to go, as long as the problems caused by that are solvable....



> LEDs are much more sensitive to heat than fluorescents. The only solution I can think of is not a very pleasant one -- automatically cap how bright the bulb can run if it gets too hot. That will **** a lot of people off.


Should be possible to just slowly dim the LED over minutes as the lamp overheats, until the temperature is within acceptable parameters. In a well designed fixture, the dimming would be too slow to be noticed by most people. Many people don't notice when a 100 watt light is dimmed down to the equivalent of 60 watts of light output, if it's done in an imperceptibly slow manner, or if you turn off a room for 5 minutes (to forget how bright it was), slightly adjust the dimmer down, and then turn the room back on. It could be just a simple thermistor feedback mechanism. The bigger the heatsink, the slower the dimming becomes when the fixture is forced to be recessed or similar. It may still be annoying, but at least it's a compromise, especially if the first bulbs get many returns due to failures and manufacturers need to do something like this to prevent losses from warranty claims...
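A minimal sketch of such a thermistor-driven foldback loop (the 85 C limit, the step size, and the 20% floor are all hypothetical values; a real fixture would tune these against its heatsink):

```python
def thermal_foldback(temp_c, drive, t_limit_c=85.0, step=0.01):
    """One iteration of a slow thermal foldback: nudge the drive level
    down while over temperature, and back up toward 100% when cool.
    Called once per second or so, the change is imperceptibly slow."""
    if temp_c > t_limit_c:
        return max(0.2, drive - step)  # never dim below a 20% floor
    return min(1.0, drive + step)

# over temperature: drive creeps down slowly over 30 iterations
d = 1.0
for _ in range(30):
    d = thermal_foldback(90.0, d)
print(round(d, 2))  # -> 0.7
```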



> I'm certainly not against color temp shifting, I actually think having a light that would run at ~5000k during the daytime, to match incoming sunlight in the windows, yet shift to ~3500k at night would be neat. But something that complicated IMO should be a feature in a dedicated fixture, NOT a retrofit.


Perhaps you're right -- cheap fixtures should at least just aim to merely become easily dimmer compatible (full range dimming), with no worry about dimming linearity and color temperature shifting.

Just so many ideas and potential that LED can do -- they are potentially way more flexible than CFL in respect to color control.

I agree, I fear that the first mass-market LED bulbs that hit the big stores may be of dubious quality (in one way or another), especially as they will initially be overpriced to the "Average Joe"...


----------



## SemiMan (Apr 23, 2008)

mdrejhon said:


> Thus, proper PWM control of a LED light should be at least in the several tens of kilohertz, or hundreds of kilohertz, to eliminate any distracting effects.
> 
> One problem I can see with high-frequency PWM for example is beat effects between two fixtures. One PWMing at 100,000 Hz, and the other PWMing at 100,001Hz varying due to manufacturing tolerances. These two frequencies would be only 1 Hz apart. That's going to cause the room's lighting to flash at 1 Hz to the beat effect.



I highly doubt that anyone (actually, I am pretty darn certain that almost no-one) will be able to see any visible flickering beyond a few kilohertz, and even at 1 kHz you would be hard pressed to create the circumstances for it. 100 kHz PWMing would make little sense; it is simply overkill. You would generate needless electrical noise, you would add inefficiency from FET switching, and you would increase the complexity of your switch mode constant current regulator needlessly.

As for two PWM frequencies off by a Hz: you would notice absolutely nothing. The effects of the two lights are additive. There is no modulation effect that would create a 1 Hz beat frequency.
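This additivity argument can be checked numerically; a scaled-down simulation (1000 Hz and 1001 Hz stand in for the 100 kHz figures, and the sample rate and 10 ms "eye integration" window are arbitrary choices) shows that the averaged sum of the two trains stays flat rather than flashing at the 1 Hz difference frequency:

```python
def pwm(t, freq, duty=0.5, phase=0.0):
    """Ideal PWM waveform: 1.0 during the on-portion of each cycle."""
    return 1.0 if ((t * freq + phase) % 1.0) < duty else 0.0

fs = 1_000_000   # 1 MHz sampling
window = 10_000  # 10 ms averaging window, a stand-in for eye integration

# sum of two fixtures PWMing 1 Hz apart, sampled over half a second
samples = [pwm(i / fs, 1000) + pwm(i / fs, 1001) for i in range(500_000)]
means = [sum(samples[i:i + window]) / window
         for i in range(0, len(samples) - window, window)]

# the window-averaged brightness barely moves: the light is additive,
# so there is no 1 Hz beat, just a fixed average of 2 * duty
print(max(means) - min(means))
```

A beat would only appear with a *multiplicative* interaction (as in radio mixing); two independent light sources simply sum.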

Semiman


----------



## 2xTrinity (Apr 23, 2008)

mdrejhon said:


> Constant current is fine, as long as the smoothing capacitors don't interfere with the final brightness of LED. There's pretty narrow range of voltages and the LED often changes color at different voltages. Basically, a very dim green LED sometimes becomes yellowish or orangeish.


A few relevant graphs lifted from the "constant current vs PWM dimming" thread:

[graphs, images lost: LED efficiency vs. output level under constant-current and PWM dimming]

Basically, unless the light is going under 1%, CC is definitely more efficient. Incans become useless long before 1% output due to color shift, anyway, so that's probably a good design limit.

Color shift of each of the respective methods:

[graphs, images lost: LED color shift under constant-current vs. PWM dimming]

Unlike green LEDs, white LEDs have very minimal color shifting effects because the phosphor behaves just about the same regardless of the shift in the blue LED. Also, the eye doesn't really detect a +5nm wavelength shift in the blue range as a color change, while the same change in the yellow/green region would be noticeable.



> Then again, you don't really need 100% linearity, as long as you can somewhat kinda stretch the LED's entire dimming range over nearly the entire range of the wall dimmer... (i.e. LED becoming 0% when the dimmer switch is pulled down to 20% or thereabouts -- much like what happens with incandescents, partially because the current is no longer enough to make the filament hot enough to glow. The leakage current that's not hot enough to light up a tungsten filament, should still be enough to power the electronics of a dimmable LED, to still allow full "dim-down-to-0%" controllability)


There will have to be a "gamma" setting on the LED dimming to account for the fact that LEDs get more efficient when dimmed, while incans get less efficient. I've seen dimmable CFLs put out roughly 10% of full output when the dimmer is set to "0"...
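A sketch of such a "gamma" mapping (the 2.2 exponent and the 20%-of-travel cutoff mentioned earlier in the thread are illustrative values, not anything from a real product):

```python
def dimmer_to_drive(position, gamma=2.2, cutoff=0.2):
    """Map wall-dimmer position (0..1) to LED drive level (0..1).
    Below the cutoff the lamp is fully off, mimicking an incandescent
    going dark near the bottom of the travel; above it, a power-law
    ("gamma") curve compensates for the LED getting *more* efficient
    as it is dimmed."""
    if position <= cutoff:
        return 0.0
    x = (position - cutoff) / (1.0 - cutoff)  # renormalize to 0..1
    return x ** gamma

print(dimmer_to_drive(0.2))  # -> 0.0 (off at the bottom of the travel)
print(dimmer_to_drive(1.0))  # -> 1.0 (full output)
```

With gamma > 1 the curve holds the drive level well below the dimmer position through the middle of the travel, avoiding the "10% output at zero" behavior described above.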





> BTW -- I don't really like PWM at low frequencies. People can indirectly detect PWM if it's run at only hundreds of hertz, or even the low single-digit kilohertz.


Hundreds of hertz range I can detect directly if there is just about any motion involved. The only way I can see the 7800Hz flicker on my LF2 though is by setting duty cycle to 0.2%, and looking at something like a high speed fan - and even then I see a "pulsating" line (not even discrete "dots"). It's basically a non-issue. The 100Hz on my previous EDC was annoying in just about any circumstances, however.



> I even get annoyed at certain cheap PWM-driven car/bus taillights that are driven at low PWM frequencies such as only 200 Hz (some of these taillights use PWM for the dimmed non-braked mode, and disable PWM when braked) - sure they don't flicker when I stare at it, but when I flit my eyes around fast, the dotted blur becomes visible in my vision field.


Those are horrendous -- while red LEDs do color shift slightly more than blue (and especially more than white), I'm sure they'd still be within legally tolerable ranges if they used constant-current (CC) dimming. Filtered incans certainly vary from brand to brand. A barely perceptible color change would be far less objectionable to other drivers than flickering.



> I agree, I fear that the first mass-market LED bulbs that hits the big stores, may be of dubious quality (in one way or another), especially as they will initially be overpriced to the "Average Joe"...


This could lead to credibility issues if early adopters have bad experiences and then refuse to trust the good LED fixtures that come out later. Many people still loathe compact fluorescents, [supposedly] because of 120Hz flicker, due to bad experiences with magnetic-ballast fluorescents in the past.


----------



## mdrejhon (Apr 29, 2008)

2xTrinity said:


> A few relevant graphs lifted from the "constant current vs PWM dimming" thread:


Very interesting -- yes, I would be inclined to agree that constant current is all we need for dimming. Linear brightness control is more useful for, say, computer displays. Many backlights, including those in cellphones, use PWM for dimming, but that wouldn't necessarily be the best way, as you've indicated. Gamma would be a big help.



> Hundreds of hertz range I can detect directly if there is just about any motion involved. The only way I can see the 7800Hz flicker on my LF2 though is by setting duty cycle to 0.2%, and looking at something like a high speed fan - and even then I see a "pulsating" line (not even discrete "dots"). It's basically a non-issue. The 100Hz on my previous EDC was annoying in just about any circumstances, however.


Direct versus indirect is really just semantics. Personally, I define flicker detection via motion as an indirect form of detection, but from another perspective it may be direct. (I define direct detection as just staring at the light; you can't detect 100 Hz flicker easily without motion involved.)

If there is really fast motion of a tiny-enough, bright-enough point source (like a fast-moving pulsating 5mm LED at the edge of a spinning bicycle wheel), I can even detect flicker up into the low kilohertz range -- 2 kHz can be detected quite easily if the flickering LED is moving fast enough, i.e. at the edge of a fancy rim. An LED flickering at 2 kHz on a wheel spinning 10 revolutions per second would draw 200 dots around its outer circumference.

One may have seen those LED batons and spinning LED signs that display text in midair; the flicker on those can go up into the kilohertz range in order to generate 100 pixels of horizontal resolution at, say, 20 or 30 passes per second -- that's 20×100 or 30×100, for a pixel frequency of approximately 2 kHz or 3 kHz.
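The arithmetic in the two examples above is simple enough to check directly (all the numbers here are the ones quoted in the post, used purely as an illustration):

```python
# A 2 kHz PWM'd LED on a wheel spinning at 10 revolutions per second
# paints one "dot" per PWM cycle around the rim.
pwm_hz = 2000
wheel_rps = 10
dots_per_revolution = pwm_hz // wheel_rps
print(dots_per_revolution)  # 200 dots around the circumference

# Likewise, a POV display drawing 100 pixels per pass at 20-30 passes
# per second needs a pixel clock of passes * pixels:
for passes_per_second in (20, 30):
    print(passes_per_second * 100)  # 2000 or 3000 Hz, i.e. 2-3 kHz
```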


----------



## mdrejhon (Apr 29, 2008)

SemiMan said:


> I highly doubt (actually I am pretty darn certain) that almost no-one will be able to see any visible flickering beyond a few kilohertz and even at 1KHz you would be hard pressed to create the circumstances for it.


While there's a point of diminishing returns as the frequency goes up, there's still potential for nasty stroboscopic effects, such as with a kitchen fan, a fast-spinning bike wheel, and similar. I still recommend tens or hundreds of kilohertz; it can be done with minimal RFI.

As for additive light, you're right -- I should have remembered that (duh). It's not like two inaudible ultrasonic frequencies producing an audible beat frequency, nor is it anything like laser light diffraction.


----------



## snarfer (Apr 29, 2008)

Personally, what I'm most afraid of is stroboscopic effects that I can't see but that a camera can. Perhaps I'm being overly cautious, but I really can't see using LED light driven at less than 10kHz PWM professionally. You might end up with a flicker on the film that wasn't visible to the naked eye, just due to some weird stroboscopic interaction between the LED flicker rate and the shutter speed. It happens a lot with HID lighting, and it often ends up costing money in post or, worse yet, on a reshoot. Cameramen and lighting directors get fired for things like that.
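A toy calculation (all numbers assumed for illustration) shows why a camera can catch PWM flicker the eye can't: if the shutter time isn't an integer multiple of the PWM period, how many "on" intervals land inside the exposure depends on the phase of the shutter relative to the PWM, so captured brightness drifts from frame to frame.

```python
def captured_on_time(pwm_hz, duty, exposure_s, phase_s):
    """Total 'on' time (seconds) a shutter open for exposure_s sees from
    an ideal PWM source, with the shutter opening phase_s into a cycle."""
    period = 1.0 / pwm_hz
    on_time = duty * period
    total = 0.0
    t = -phase_s  # start of the current PWM cycle, relative to shutter open
    while t < exposure_s:
        # Overlap of the on-interval [t, t + on_time] with [0, exposure_s]
        total += max(0.0, min(t + on_time, exposure_s) - max(t, 0.0))
        t += period
    return total

# Assumed example: 100 Hz PWM at 50% duty, 1/250 s shutter. The exposure
# spans only 0.4 of a PWM cycle, so captured light swings wildly with phase
# -- from a fully dark frame to a fully lit one.
samples = [captured_on_time(100, 0.5, 1 / 250, p / 1000) for p in range(10)]
print(min(samples), max(samples))
```

At a PWM frequency far above the frame rate (or with pure constant-current drive), every exposure averages over many cycles and the phase dependence washes out, which is the practical argument for high-frequency or flicker-free drive on a film set.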


----------



## 2xTrinity (Apr 29, 2008)

snarfer said:


> Personally what I'm most afraid of is stroboscopic effects that I can't see, but that a camera can. Perhaps I'm being overly cautious, but I really can't see using LED light driven at less than 10kHz PWM professionally. Might end up with a flicker on the film that wasn't visible to the naked eye, just due to some weird stroboscopic interaction between LED flicker rate and shutter speed. It happens a lot with HID lighting, and often ends up costing money in post or, worse yet, for a reshoot. Cameramen and lighting directors get fired for things like that.


Though it's a bit counterintuitive, I think your photography would work better with a much LOWER PWM frequency than 10kHz -- like 0Hz, i.e. pure constant-current drive with no flicker at all.




> Direct versus indirect, is really just semantics. Personally I define flicker detection via motion, as an indirect form of detection, but in another perspective it may be direct. (I define direct detection via just staring the light; you can't detect 100 Hz flicker easily without motion involved).


There's a difference between detecting PWM in special cases of motion, such as specular reflections off fast-moving fans or wheels, and detecting it from ANY motion at all, including moving my hands, shining the light at falling water, turning my head while the light is on, etc., all of which are very noticeable at 10% duty cycle, 100Hz PWM.


----------

