LEDs conduct more current as they warm up, and differences between individual LEDs mean you cannot simply connect them in parallel. A constant-current DC supply would be fine for some of the LEDs but would overload others. To equalize the currents, a series resistor is normally used with each individual LED.
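As a quick sketch of what that resistor does and what it costs (the 5 V supply, 2 V forward voltage and 20 mA target current below are assumed example values, not measurements):

```python
# Rough per-LED resistor sizing and the power it burns; all values are
# assumed examples, not taken from a specific LED or supply.
V_SUPPLY = 5.0    # supply voltage (V)
V_F      = 2.0    # nominal LED forward voltage (V)
I_LED    = 0.020  # target LED current (A)

r_series   = (V_SUPPLY - V_F) / I_LED   # Ohm's law on the leftover voltage
p_resistor = (V_SUPPLY - V_F) * I_LED   # power dissipated in the resistor
p_led      = V_F * I_LED                # power delivered to the LED
efficiency = p_led / (p_led + p_resistor)

print(f"R = {r_series:.0f} ohm, resistor loss = {p_resistor*1000:.0f} mW, "
      f"efficiency = {efficiency:.0%}")
```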

Now, those resistors waste a bit of power. Are they really necessary? If you put enough LEDs in series, the individual differences should become negligible at some point, and a single constant-current supply would then suffice for several strings of series LEDs in parallel.

How many LEDs in series would this require? Another possibility would be to put a single resistor in series with each string of LEDs.
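To get a feel for how string length affects current sharing, here is a rough Monte-Carlo sketch using a linearised LED model; the forward-voltage spread, dynamic resistance and currents are assumed example values rather than datasheet figures:

```python
# Monte-Carlo sketch of current sharing between parallel LED strings fed by
# one constant-current supply. All component values are assumed examples.
import numpy as np

rng = np.random.default_rng(1)

SIGMA_VF = 0.02   # per-LED forward-voltage spread within one batch (V)
R_DYN    = 2.5    # per-LED dynamic resistance near the operating point (ohm)
I_NOM    = 0.020  # nominal current per string (A)
STRINGS  = 10     # parallel strings sharing one constant-current supply
TRIALS   = 5000

for n in (1, 3, 6, 12, 24):
    # random forward-voltage offset of each whole string (sum over its LEDs)
    dv = rng.normal(0.0, SIGMA_VF, size=(TRIALS, STRINGS, n)).sum(axis=2)
    # with the total current fixed, the common node voltage settles at the
    # mean string voltage, so each string's current error is its offset from
    # that mean divided by the string's total dynamic resistance
    i_err = (dv.mean(axis=1, keepdims=True) - dv) / (n * R_DYN)
    spread = i_err.std() / I_NOM
    print(f"{n:2d} LEDs per string: current spread ~ {spread:.1%} of nominal")
```

Under these assumptions the spread shrinks roughly as 1/sqrt(N), so longer strings share current better, though the mismatch never disappears entirely.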

I got some LED strips off AliExpress that run on 12 V, and each individual LED has its own resistor in series with it. I believe this is quite wasteful, and that it would be better to put several LEDs in series behind a current regulator instead. The LEDs will end up in an autonomous greenhouse where power efficiency is important.
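A rough comparison of why the per-LED resistor on a 12 V bus bothers me (the 3 V forward voltage and 20 mA current are assumed values for typical white LEDs, not measured from the strip):

```python
# Back-of-the-envelope efficiency of ballast-resistor layouts on a 12 V bus.
# Forward voltage and current are assumed example values.
V_SUPPLY = 12.0
V_F      = 3.0    # assumed forward voltage per LED (V)
I_LED    = 0.020  # assumed current per LED / string (A)

def string_efficiency(n_leds_in_series: int) -> float:
    """Fraction of supply power reaching the LEDs when the remaining
    voltage is dropped across a ballast resistor."""
    v_leds = n_leds_in_series * V_F
    if v_leds >= V_SUPPLY:
        raise ValueError("string forward voltage exceeds the supply")
    return v_leds / V_SUPPLY

print(f"1 LED + resistor on 12 V : {string_efficiency(1):.0%}")
print(f"3 LEDs + resistor on 12 V: {string_efficiency(3):.0%}")
```

With one LED per resistor, roughly three quarters of the input power ends up as heat in the resistors; three LEDs in series per resistor already recovers most of it.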

  • More abstractly, what you’re doing with the resistor is creating a very crude linear regulator. That is fine for most applications, and if you’re careful to keep your source voltage close-ish to the forward voltage of the LED, this method can be fairly efficient.

    Using an active constant-current supply (as an example, many dedicated LED driver ICs do something very similar) can be marginally better, as it allows you to reduce the waste from the linear regulator.

    However, if efficiency is what you really care about, you’ll need to go with a switching regulator. Here’s an app note going over the basics of that approach, and again you can usually find dedicated ICs for it.

    Overall I’d recommend doing a detailed power budget (a rough sketch of one follows below) and really checking whether it’s worth the cost and trouble of implementing that, because while you are correct that it is usually more energy-efficient, it can be significantly less efficient in terms of labor, material, maintenance and longevity (hence the prevalence of the humble resistor…).
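    As a starting point for that budget, here is a rough sketch; the LED count, forward voltage, current, supply voltage and the 85 % converter efficiency are all assumed example numbers to be replaced with real figures from the datasheets:

    ```python
    # Rough power-budget sketch comparing the options discussed above.
    # Every number here is an assumed example value.
    N_LEDS   = 60      # total LEDs in the fixture
    V_F      = 3.0     # forward voltage per LED (V)
    I_LED    = 0.020   # current per LED (A)
    V_SUPPLY = 12.0

    p_light = N_LEDS * V_F * I_LED          # power actually delivered to the LEDs

    # Option 1: one LED + ballast resistor per 12 V branch (as on the strip)
    p_in_resistor = N_LEDS * V_SUPPLY * I_LED

    # Option 2: three LEDs in series + one resistor per 12 V branch
    p_in_series_r = (N_LEDS // 3) * V_SUPPLY * I_LED

    # Option 3: long series string on a switching constant-current driver,
    # assuming ~85 % conversion efficiency
    p_in_switcher = p_light / 0.85

    for name, p_in in [("per-LED resistor", p_in_resistor),
                       ("3-LED series + resistor", p_in_series_r),
                       ("switching CC driver", p_in_switcher)]:
        print(f"{name:25s}: {p_in:5.2f} W in, {p_light/p_in:.0%} efficient")
    ```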