Detector Lowers LED Driver Stress When Power-Line Dimming

Sept. 23, 2010

[Figure waveforms: DIM signal; circuit's input voltage; LED-current waveforms]

The typical low-voltage (24 V) lighting system uses a two-wire cable to connect the lamps to the off-line power supply, often located at some distance from the lamps. These systems can control lamp intensity by chopping the supply voltage. While this approach works fine for filament-based lamps, it can affect the reliability of LED lamps.

LED lamps require a dedicated circuit for controlling the LED current and, like most control circuits, the circuit requires a decoupling capacitor (C1) at the supply-voltage input (Fig. 1). This capacitor alternately charges and discharges with each transition of the supply voltage. But using ceramic capacitors in this way can produce annoying acoustic noise.

Electrolytic capacitors don’t have the acoustic problem, but the high inrush currents of the chopped supply can cause power dissipation in the capacitor’s equivalent series resistance (ESR). Because the ESR for electrolytics is higher than that of ceramic capacitors, this power dissipation can be great enough to reduce the capacitor’s lifetime, lowering circuit reliability.
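To put rough numbers on the ESR argument, the sketch below compares the dissipation a given ripple current causes in a ceramic versus an electrolytic capacitor. The current and ESR values are illustrative assumptions, not figures from the article:

```python
# Ballpark ESR loss comparison (all values assumed for illustration):
# P = I_rms^2 * ESR, so the same ripple current dissipates far more
# heat in the higher ESR of an electrolytic capacitor.
I_RIPPLE_RMS = 1.0    # assumed RMS ripple/inrush current (A)
ESR_CERAMIC = 0.005   # typical ceramic-capacitor ESR (ohms), assumed
ESR_ELECTRO = 0.5     # typical electrolytic-capacitor ESR (ohms), assumed

p_ceramic = I_RIPPLE_RMS**2 * ESR_CERAMIC
p_electro = I_RIPPLE_RMS**2 * ESR_ELECTRO
print(f"ceramic: {p_ceramic*1e3:.1f} mW, electrolytic: {p_electro*1e3:.0f} mW")
```

With these assumed values the electrolytic dissipates two orders of magnitude more power, which is what shortens its lifetime under chopped-supply inrush.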

You can avoid much of this charge/discharge cycle in the decoupling capacitor by simply switching the LED current off during the times the supply voltage is off. Most LED drivers (such as the MAX16832) have a dedicated input (DIM) that can switch the LED current on and off rapidly, but this approach requires an extra control signal. In the typical two-wire system, however, there is no way to distribute that signal from the power supply to the LED lamps.

The solution is to have the LED-driver circuit in the lamp detect the start of off-time for itself and turn off the LEDs before the decoupling capacitor is heavily discharged. This circuit must also detect the start of on-time so it can turn the LEDs back on.

Figure 1 shows the simplest implementation of this idea in red. (Ignore the blue lines for the moment.) The supply line connects to the DIM signal. A diode (D) inserted into the supply line isolates the DIM signal from the decoupling capacitor. When the supply voltage turns off (i.e., the off-time starts), the DIM signal goes to logic zero and disables the LED driver. Because the decoupling capacitor no longer has to supply current for the LEDs while the power is off, the capacitor retains its charge.

In practice, this approach has several drawbacks. First, the diode introduces power dissipation equal to VF × ILOAD. Second, the system capacitance between the supply and the diode determines the exact moment the driver will switch off. If this capacitance is significant, the DIM signal will not drop instantly but will take some time to reach logic zero.
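The diode's loss is easy to quantify. The values below are illustrative assumptions (a silicon diode and a modest LED string current), not figures from the article:

```python
# Rough diode-loss estimate for the simple (red) approach:
# P = VF x ILOAD. Both values are assumed for illustration.
VF = 0.7        # series-diode forward voltage (V), assumed silicon
I_LOAD = 0.35   # LED string current (A), assumed

P_DIODE = VF * I_LOAD   # continuous dissipation in the series diode (W)
print(f"Series-diode loss: {P_DIODE*1e3:.0f} mW")
```

Roughly a quarter watt of continuous dissipation for a 0.35-A string, which is why the diode is worth eliminating.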

While the DIM signal is dropping, the decoupling capacitor supplies current to the LED circuit. This time interval can allow the decoupling capacitor to lose a lot of charge. The problem can be overcome by adding a load resistor to ground just before the diode, which will pull the DIM signal to ground rapidly. But that resistor also introduces unwanted power dissipation during the on-time.
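The pull-down resistor is a direct trade-off: a smaller R discharges the line capacitance (and so drops the DIM signal) faster, but burns more power for the entire on-time. A minimal sketch of that trade-off, with the supply voltage and line capacitance assumed for illustration:

```python
# Pull-down resistor trade-off (values assumed, not from the article):
# on-time dissipation P = V^2 / R versus DIM fall time constant tau = R*C.
V_ON = 24.0       # supply voltage during on-time (V)
C_LINE = 100e-9   # assumed stray capacitance between supply and diode (F)

for r in (1e3, 10e3, 100e3):
    p_on = V_ON**2 / r   # steady dissipation while the supply is on (W)
    tau = r * C_LINE     # time constant of the DIM-signal fall (s)
    print(f"R = {r/1e3:5.0f} kohm: P = {p_on*1e3:6.1f} mW, tau = {tau*1e6:7.1f} us")
```

A 1-kohm resistor pulls DIM down in about 100 µs but dissipates more than half a watt whenever the supply is on, illustrating why a smarter detector is preferable.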


The circuit addition shown in blue is a better solution. Instead of inserting diode D and using the power line to drive the DIM signal, the circuit uses the combination of D2/C3 to form an envelope detector that slowly follows the input voltage.

The base-emitter voltage of T1 is positive during on-time, so T1 is off and its collector is at 0 V. T2, R3, and R4 form an inverter that converts this logic 0 to logic 1, turning on the LEDs via the DIM pin. The input voltage will drop rapidly at the start of off-time, but the envelope detector will respond more slowly. As a result, the base voltage on T1 drops faster than its emitter voltage.

T1 switches on when the base-emitter voltage reaches −0.7 V, causing the logic level at DIM to change from logic 1 to logic 0. This transition instantly switches off the LED driver, removing the load from the decoupling capacitor. The base voltage goes up again as on-time begins, switching T1 off and the LED driver back on. Inrush currents are much lower with this addition than with the unmodified circuit because the voltage fluctuations seen at the IN pin of the LED driver are reduced.
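The timing behavior described above can be sketched with a simple two-time-constant model: when the supply chops off, T1's base voltage decays with a fast time constant while its emitter, held up by the D2/C3 envelope detector, decays slowly, so the base-emitter voltage quickly reaches the turn-on threshold. All component values and time constants here are assumptions for illustration, not taken from the article:

```python
import math

# Hypothetical model of the envelope-detector off-time detection.
# All values are assumed for illustration only.
V_SUPPLY = 24.0     # on-time supply voltage (V)
TAU_BASE = 20e-6    # fast time constant seen at T1's base (s), assumed
TAU_ENV = 2e-3      # slow D2/C3 envelope-detector time constant (s), assumed
VBE_ON = -0.7       # base-emitter voltage that turns the PNP (T1) on (V)

def v_base(t):
    """Base voltage after the supply chops to 0 V: decays quickly."""
    return V_SUPPLY * math.exp(-t / TAU_BASE)

def v_emitter(t):
    """Emitter voltage held up by C3: decays slowly."""
    return V_SUPPLY * math.exp(-t / TAU_ENV)

def detect_off_time(dt=1e-6, t_max=1e-3):
    """Step time forward until V_BE = Vb - Ve reaches the threshold."""
    t = 0.0
    while t < t_max:
        if v_base(t) - v_emitter(t) <= VBE_ON:
            return t
        t += dt
    return None

t_on = detect_off_time()
print(f"T1 turns on about {t_on*1e6:.0f} us after the supply chops off")
```

With these assumed time constants the detector reacts within a few microseconds, long before the decoupling capacitor loses significant charge.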

The performance improvements are easily measurable. Figure 2a shows the effect of a chopped input voltage with no effort made to protect the decoupling capacitor. Inrush currents show peaks larger than 12 A, and the input voltage present on the decoupling capacitor exhibits large oscillations. During off-time, the input voltage seen at the LED driver drops by more than 10 V.

Introducing the detection circuit (Fig. 2b) greatly reduces these values. Input current peaks are roughly 2 A, an improvement of a factor of six. Further, the input voltage at the LED driver shows much less fluctuation, now on the order of 2 V, and it is low enough to allow the use of inexpensive ceramic decoupling capacitors without generating audible noise.

The DIM signal (Fig. 3, channel 1) does show two glitches due to oscillations on the input voltage. One occurs at the beginning of the on-period, and a smaller one occurs at the beginning of the off-period. These glitches are too short, however, to affect the LED current, as shown in Figure 4.

