Alright, I have a bit of an LED light project going on. The LEDs are high power and require cooling.
I have these perfect little turbine-style fans from old DELL server racks that I've been too sentimental to throw away.
The fans have two... fans built in: one spins clockwise, the other counter-clockwise, with blade designs that make that work nicely. They are super focused and have massive static pressure (I'm guessing) for their use in 1U servers.
So I hooked the wires from each individual fan motor into one, so I have them in series. But they are probably too noisy and fast for my needs.
The fans are marked 12 V, 0.68 A. Holy crap! So I hook them up to a 12 V power supply and, like I thought, way too loud and more powerful than I need.
So I put a resistor on it. They are only rated for 2 W, but they handle high voltage and are a nice square ceramic variety.
Still not quite enough, so I put another resistor on it.
Now it is perfect... but my question is: am I making my fans slower by drawing MORE power?
See, my logic is that I have:
12 V * 0.68 A = 8.16 W
Then I add the resistors to it:
8.16 W + 2 W + 2 W = 12.16 W
So wait, did I really just make my fans slower by sucking in MORE power?
Or did I rob the fan of current some other way? Does it work more like this?
12 V * (0.68 A * (5/6) * (5/6)) = 12 V * 0.47 A = 5.64 W. But the resistors generate heat, so this math doesn't add up either; some energy must be lost.
This is the thing I don't get.
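To sanity-check my thinking, here's a rough Python sketch. It treats the fan as a plain fixed resistance (a crude model, since a real fan is a motor with its own drive electronics) and uses made-up values for the two series resistors, because I'm only guessing at those; the point is just the shape of the numbers:

```python
# Crude model: treat the fan as a fixed resistance.
# Real fans are motors with drive electronics, so this is only an approximation.
SUPPLY_V = 12.0                      # power supply voltage
FAN_NOMINAL_A = 0.68                 # current printed on the fan label
R_FAN = SUPPLY_V / FAN_NOMINAL_A     # ~17.6 ohm effective resistance
R_SERIES = 4.7 + 4.7                 # hypothetical values for the two resistors

# Without resistors: the full supply voltage sits across the fan.
p_no_resistors = SUPPLY_V * FAN_NOMINAL_A

# With resistors in series, the same (smaller) current flows through everything,
# so the total draw from the supply goes DOWN, not up.
i_with = SUPPLY_V / (R_FAN + R_SERIES)
p_total = SUPPLY_V * i_with               # drawn from the supply
p_fan = i_with ** 2 * R_FAN               # delivered to the fan
p_resistors = i_with ** 2 * R_SERIES      # burned off as heat in the resistors

print(f"no resistors  : {p_no_resistors:.2f} W total, all to the fan")
print(f"with resistors: {p_total:.2f} W total "
      f"({p_fan:.2f} W fan + {p_resistors:.2f} W resistor heat)")
```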
Would using a voltage buck converter be a better plan? I could drop the 12 V power supply down to maybe 8 V.
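For comparison, here's the same crude fan-as-resistor model extended to that idea; the 8 V target and the 90% converter efficiency are just guesses on my part:

```python
# Same crude fan-as-resistor model, comparing a series-resistor voltage drop
# against a buck converter stepping 12 V down to a lower fan voltage.
SUPPLY_V = 12.0
TARGET_V = 8.0                  # hypothetical reduced fan voltage
R_FAN = 12.0 / 0.68             # ~17.6 ohm, from the 12 V / 0.68 A label
BUCK_EFFICIENCY = 0.90          # ballpark guess for a small buck converter

fan_current = TARGET_V / R_FAN            # current at the reduced voltage
fan_power = TARGET_V * fan_current        # what the fan actually gets

# Resistor method: the full 12 V supply carries the fan current,
# and the "missing" 4 V is burned off as heat in the resistors.
resistor_power_in = SUPPLY_V * fan_current

# Buck converter: the supply only provides the fan power plus converter losses.
buck_power_in = fan_power / BUCK_EFFICIENCY

print(f"fan receives    : {fan_power:.2f} W at {TARGET_V:.0f} V")
print(f"resistor method : {resistor_power_in:.2f} W drawn from the supply")
print(f"buck converter  : {buck_power_in:.2f} W drawn from the supply")
```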
TL;DR: Does using resistors to slow down fans increase the overall power demand? Would using a buck converter to lower the voltage and leave the current alone be a better option?
I don't care how I make the fans slower; I just want to make sure I'm not wasting energy for a lesser result, which seems silly.
I've been working with computers forever, but this is the first project I've done that is heavily involved with DC electronics.
Please only respond if you have a very clear and concise answer.