Quote:
Originally Posted by 2Quills (http:///t/386777/led-s-are-the-best/100#post_3410221)
I'm not even gonna go there lol.
That isn't actually my diagram; it's just one of many floating around that people have been using as a general guideline for some of these builds when running parallel strings.
As far as I know, the main reason folks have been using the 1 ohm resistors is simply to have an easy way to check the current going to each string without having to open up the strings.
Yep, that's another thing you can do. I occasionally build tube amplifiers, and I use this method to measure bias current on the output tubes while the system is hot. 1 ohm resistors work perfectly because, across a 1 ohm resistor, one amp reads as one volt on the meter.
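To make that relationship concrete, here's a minimal sketch (Python, with made-up meter readings) of how a 1 ohm sense resistor turns a voltage measurement straight into a current measurement via Ohm's law:

```python
# Ohm's law: I = V / R. With a 1 ohm sense resistor the math disappears:
# the voltage you read across the resistor IS the string current in amps.
SENSE_R_OHMS = 1.0

def string_current(measured_volts, sense_r=SENSE_R_OHMS):
    """Current through the sense resistor, in amps."""
    return measured_volts / sense_r

# Hypothetical meter readings across each string's 1 ohm resistor:
for string, volts in {"string 1": 0.92, "string 2": 0.88, "string 3": 0.95}.items():
    print(f"{string}: {volts:.2f} V across sense resistor -> {string_current(volts):.2f} A")
```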
Balancing the forward voltage on each string is pretty much recommended (not strictly necessary, but a good idea to keep the fixture balanced). From what I understand, small differences in voltage or resistance typically translate into larger differences in the current passing through each string.
Yes, it can make a big difference. Since LEDs are current-driven devices, not voltage-driven, current is what matters. With a constant current supply there is only a finite amount of current available, and when an imbalance exists it can lead to significant differences in light output between strings, and even a potential overload on the string with the least resistance.
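Here's a rough sketch of that effect (Python). It models each string very simply as a fixed forward voltage plus a small dynamic resistance; all the numbers are illustrative assumptions, not measurements of any particular LED, but they show how a small forward-voltage mismatch becomes a much larger current mismatch:

```python
# Two parallel strings fed by a constant current supply. Simplified model:
# each string = fixed forward voltage + small dynamic resistance (assumed values).
I_TOTAL = 2.0          # amps from the driver, shared by both strings
RD1, RD2 = 1.0, 1.0    # dynamic resistance of each string, ohms
VF1, VF2 = 19.8, 20.1  # total forward voltage of each string

# Both strings see the same terminal voltage V:
#   V = VF1 + I1*RD1 = VF2 + I2*RD2, with I1 + I2 = I_TOTAL
i1 = (I_TOTAL * RD2 + VF2 - VF1) / (RD1 + RD2)
i2 = I_TOTAL - i1
print(f"String 1: {i1:.3f} A   String 2: {i2:.3f} A")
# A ~1.5% difference in forward voltage (0.3 V on ~20 V) ends up as a
# ~15% difference in current here (1.15 A vs 0.85 A).
```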
And over time the voltage or resistance can stray a bit, so re-balancing the fixture sometime down the line may not be a bad idea. But I think I see what you're getting at, and I wonder if different-sized resistors might even be a better idea.
Sizing the resistor in this application depends on the maximum differences you tend to encounter. In this type of setup, if the wiring is the only difference (slight variations in wire lengths, connections, etc.), the differences are probably in the hundredths of ohms, in which case 1 ohm is more than sufficient. If the differences can be larger because of the LEDs themselves, it may require something different.
Keep in mind that resistors have a voltage drop, and they dissipate that drop as heat, which means wasted power and lower efficiency, so you want to keep that resistor as small as you can.
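A quick worked example of that heat cost (Python): assuming roughly 900mA per string and a ~20 V string forward voltage, here's what a few resistor values burn off. The numbers are just assumptions for illustration, not from any specific fixture:

```python
# What a balancing/sense resistor costs you in heat: P = I^2 * R
I_STRING = 0.9      # amps per string (assumed)
V_STRING = 20.0     # approximate forward voltage of the LED string (assumed)
for r in (0.1, 1.0, 5.0):
    p_resistor = I_STRING**2 * r
    p_leds = I_STRING * V_STRING
    loss_pct = 100 * p_resistor / (p_resistor + p_leds)
    print(f"{r:>4.1f} ohm -> {p_resistor:.2f} W wasted ({loss_pct:.1f}% of string power)")
```

At 1 ohm the waste is well under a watt per string, which is why that value is a popular compromise between readability on a meter and efficiency.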
In regards to the fuses: agreed, in some cases they may not be necessary, especially now that I'm wondering about different resistors. But with that said, I think the idea of using them is primarily for folks running parallel strings. As you said, the power supply will push out its set amount of current no matter what.

Let's say you're running a power supply that puts out 5 amps, and you're running 5 strings in parallel that share those amps, so you end up with 1 amp of available current per string. Now let's say you're running XR-E series Cree LEDs rated for a maximum of 1 amp. You use your variable resistor (pot) or PWM signal to dim that output to approximately 90%, or 4.5 amps total, which breaks down to 900mA per string. Okay, everything's good, the fixture runs just how you want it.

But all of a sudden one of your LEDs fails open and you lose the current path through that string. The remaining current then gets redistributed to the remaining 4 strings. Now you have 1.125 amps, or 1125mA, passing through each of your remaining LED strings, which are only rated for 1 amp max. But you're in luck, because you installed some 1 amp fast-blow fuses, which blow and thereby hopefully save you from losing or damaging a few dozen LEDs. I think that was the general idea for them.
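Just to put numbers behind that failure scenario, here's a small sketch (Python, using the same 5 amp / 5 string figures from above) of how the current redistributes when strings drop out under a constant current supply:

```python
# Constant current driver: the total output current stays fixed, so when a
# string opens, its share gets dumped onto the survivors.
TOTAL_CURRENT = 4.5   # amps, the dimmed output from the example above
STRINGS = 5
LED_MAX = 1.0         # amps, max rating per string (XR-E class)

for failed in range(0, 3):
    alive = STRINGS - failed
    per_string = TOTAL_CURRENT / alive
    status = "OK" if per_string <= LED_MAX else "OVER RATING - fuse should blow"
    print(f"{failed} string(s) open: {per_string*1000:.0f} mA per string  [{status}]")
```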
I can see the argument here... I wasn't thinking about protecting the other strings, but if that's the goal, that's what you'd want to do. Sounds like you have this aspect (sizing the fuses and all) nailed down.
But if I'm following what you were thinking about the resistors initially, then if you used properly sized resistors, wouldn't they simply limit the amount of current that can pass through those remaining strings, negating the need for fuses? Or am I thinking about this all wrong?
Yes and no. These constant current LED driver modules are a relatively new thing. Time was (and still is) that when driving an LED or a string of them, all you'd do is connect a series resistor sized to limit the LED's current. As long as the voltage stays constant - which any regulated power supply will do - all you need is the supply voltage and the forward voltage and current of the LED, then apply Ohm's law to come up with the resistor size you need. In this manner, the resistor performs EXACTLY the same function as a metal halide ballast - it limits current.
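For anyone who wants the actual arithmetic, here's a short sketch of that classic calculation (Python). The supply voltage, per-LED forward voltage, and target current are just example values, not numbers from this build:

```python
# Classic series resistor sizing for a constant VOLTAGE supply:
#   R = (V_supply - V_forward_total) / I_target,  P = I^2 * R
V_SUPPLY = 24.0          # volts, regulated supply (assumed)
VF_PER_LED = 3.3         # volts per LED at the target current (assumed)
LEDS_IN_SERIES = 6
I_TARGET = 0.9           # amps

vf_total = VF_PER_LED * LEDS_IN_SERIES
r = (V_SUPPLY - vf_total) / I_TARGET
p = I_TARGET**2 * r
print(f"Series resistor: {r:.2f} ohms, dissipating {p:.2f} W")
# -> Series resistor: 4.67 ohms, dissipating 3.78 W
```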
However, what is going on here is the use of a constant CURRENT supply. The voltage output varies; it's the current output that stays steady. If you used series resistors with one of these, then just as you pointed out above, when a string goes out the supply will raise the voltage to whatever is necessary to maintain the same total current. Since the resistor sizing was based on a fixed supply voltage, the resistors effectively "become" too small and the remaining strings can overload.
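Here's a rough illustration of why the resistor can't cap the current in that case (Python, reusing the simplified string model and the 4.5 amp example; the string forward voltage and resistor value are assumptions). The driver only has to nudge its output voltage up slightly to push the excess current through the surviving strings:

```python
# Why a series/balancing resistor can't save you on a constant CURRENT driver:
# the driver simply raises its output voltage until the set current flows.
# Simplified model: each string = fixed forward voltage + series resistor.
I_TOTAL = 4.5      # amps, driver setpoint (same example as above)
STRINGS = 5
VF_STRING = 20.0   # volts, string forward voltage (assumed roughly constant)
R_BALANCE = 1.0    # ohms, the balancing/sense resistor in each string

for alive in (STRINGS, STRINGS - 1):
    i = I_TOTAL / alive                     # current per surviving string
    v_out = VF_STRING + i * R_BALANCE       # voltage the driver settles at
    print(f"{alive} strings: {i*1000:.0f} mA each, driver output ~{v_out:.2f} V")
# Losing one string only asks the driver for ~0.23 V more, which it happily
# supplies - so the survivors still end up over their 1 A rating.
```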