The above URL contains the entire discussion. Below is a snippet.
We plan on using the LED linked below for an application. The chip comprises three individual diodes. Simple on/off control of the entire chip, no pulsing. Sometimes the LEDs can remain on for long periods of time.
I assume the absolute maximum ratings are for the chip as a whole - is that correct? Is the maximum continuous forward current of 30mA for the whole chip as well then? If I wire the diodes in series, 30mA results in a power dissipation well above the 100mW stated limit. The recommended forward current is 60mA - this must be for pulsing applications?
For reasons I won’t get into, the chip must be powered through a single resistor (rather than one per diode). The source voltage is 24VDC. Considering that, I conclude we should wire these in series, since a single resistor feeding the diodes in parallel would let the diode with the lowest forward voltage hog the current. In that case a forward current of about 15mA results in a total power dissipation of 100mW. Is that a reasonable conclusion?
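The arithmetic behind that conclusion can be sketched quickly. The only figures taken from the thread are the nominal 2.2V forward voltage, the 24VDC supply, and the ~15mA target; everything else is straightforward Ohm's-law bookkeeping.

```python
# Series string of 3 diodes on 24 VDC through one resistor.
# Values from the thread: Vf ≈ 2.2 V per diode, I_F ≈ 15 mA.
V_SUPPLY = 24.0      # supply voltage (V)
VF_PER_DIODE = 2.2   # nominal forward voltage per diode (V)
N_DIODES = 3
I_F = 0.015          # chosen forward current (A)

v_string = N_DIODES * VF_PER_DIODE        # total LED drop (V)
r_limit = (V_SUPPLY - v_string) / I_F     # series resistor (ohm)
p_leds = v_string * I_F                   # dissipation in the chip (W)
p_resistor = (V_SUPPLY - v_string) * I_F  # dissipation in the resistor (W)

print(f"resistor: {r_limit:.0f} ohm")
print(f"LED power: {p_leds * 1000:.0f} mW, resistor power: {p_resistor * 1000:.0f} mW")
```

This lands at about 1160 ohms, ~99mW in the chip (right at the limit the poster describes), and roughly 260mW burned in the resistor, which is the waste-heat point raised later in the thread.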
Max ratings are for the chip as a whole… all 3 LEDs lit cannot exceed 100mW. The 60mA rating is for the measured values (like dominant wavelength and intensity), but they expect you to run the chip at 30mA (which is really stupid, IMHO). For pulsing, they’ll allow up to 100mA, but that’s only at a 10% duty cycle. You could pulse at 60mA with an increased duty cycle, but you’ll want to stay under the max power limit.
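A rough average-power check shows how those pulsed figures relate to the 100mW limit. This assumes the three diodes in series and Vf pinned at the nominal 2.2V; in reality Vf rises at higher current, so these averages are slightly optimistic.

```python
# Rough average dissipation of a 3-diode series string under pulsed drive.
# Assumes Vf stays at the nominal 2.2 V per diode (underestimates slightly,
# since Vf actually rises with current).
VF_STRING = 3 * 2.2  # total forward drop of the string (V)

def avg_power(i_peak, duty):
    """Average string dissipation (W) for peak current i_peak at the given duty cycle."""
    return VF_STRING * i_peak * duty

print(f"100 mA @ 10%: {avg_power(0.100, 0.10) * 1000:.0f} mW average")
print(f" 60 mA @ 25%: {avg_power(0.060, 0.25) * 1000:.0f} mW average")
```

The 100mA/10% rating averages about 66mW, and 60mA works out to roughly a 25% duty cycle before the average hits the 100mW ceiling, consistent with the advice to raise the duty cycle only as far as the power limit allows.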
I would not run this package at 100mW, regardless. Even if you could guarantee an ambient temp of 25C, your lifetime will be severely shortened. You also can’t guarantee the nominal forward voltage of 2.2V, so you’re bound to have devices with a higher drop, which means you’ll be above the 100mW max.
Your “source voltage” is irrelevant, as LEDs are current-driven devices… I expect you’ll be burning the 17V+ somewhere else (e.g., a resistor) if you plan on connecting the string directly to the voltage source.
I agree fully with Mac that 30mA is all you should run thru each LED.
I’d run them in series.
It’s a shame you’re going to be dumping nearly 3X more energy as heat than what you’re driving the LEDs with. Much better would be a commercial LED driver chip that takes your 24Vdc and regulates a fixed current, protecting the LEDs.
LEDs are supposed to be driven with a current source, not a voltage source. The reason is that an LED is a non-linear device whose forward voltage has a NEGATIVE temperature coefficient (equivalently, at a fixed voltage its current has a positive one). If something happens, say the LED ages out, gets thermally insulated because a towel falls on it, comes loose from its mount, takes static damage, or ??, and it heats up a bit more than usual, its effective resistance drops. Hooked to a voltage source, it then immediately draws more current, which heats it up more, dropping its resistance further, causing it to draw more current… I suspect you can see where this chain of events is going?
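That feedback loop can be illustrated with a toy iteration. Every number here is hypothetical (a ~2mV/°C drop in Vf, a guessed thermal resistance, a small parasitic series resistance for the voltage-drive case); the point is only the direction of the effect, not the magnitudes.

```python
# Toy model of LED thermal runaway: voltage drive diverges, current drive settles.
# All constants are illustrative guesses, not datasheet values.
TEMPCO = -0.002   # dVf/dT (V/°C), typical order of magnitude
R_TH = 300.0      # junction-to-ambient thermal resistance (°C/W), guess
R_SERIES = 1.0    # small parasitic resistance for the voltage-drive case (ohm)

def vf(t_junction):
    """Forward voltage (V) at a given junction temperature (°C)."""
    return 2.2 + TEMPCO * (t_junction - 25.0)

# Voltage drive: current set by (V - Vf) / R_SERIES, temperature follows dissipation.
currents = []
t, v_drive = 25.0, 2.23
for _ in range(5):
    i = (v_drive - vf(t)) / R_SERIES
    t = 25.0 + R_TH * vf(t) * i
    currents.append(i)
    print(f"voltage drive: T = {t:6.1f} C, I = {i * 1000:6.1f} mA")

# Current drive: current is pinned, so the temperature converges.
temps = []
t, i_fixed = 25.0, 0.015
for _ in range(5):
    t = 25.0 + R_TH * vf(t) * i_fixed
    temps.append(t)
    print(f"current drive: T = {t:6.1f} C, I = {i_fixed * 1000:6.1f} mA")
```

With the voltage source, each pass through the loop raises the current and the temperature; with the pinned current, the temperature settles after a step or two, which is exactly why a current source (or a driver chip, as suggested above) is the safe choice.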
I was referring to a switching controller, something like this