Question: Solar mining rig power usage

Aug 24, 2019
I have a single-GPU mining rig that runs off solar when the weather permits, and I've noticed that when it's powered by solar it draws just over 2A, and when it switches to grid power it's about 3A. Temperatures and hashrates stay the same at the lower amperage, but I'm curious why that is, and how I might lower the power consumption when on grid power?

I'm using a Windy Nation 400W solar kit with the included 30A charge controller
Bestek 500W inverter
Spartan TS4500 transfer switch
100Ah deep cycle battery

This is my first post, so I'm not sure if it's in the right area. I did some searching online and came across this site, so I figured I'd give it a shot.
I appreciate all input, and thanks in advance.
 
Remember that amps are only half the equation. The amount of power (watts) you use is amps × volts. Thus 2.4 amps @ 140 volts (336 W) is about the same power as 3 amps @ 110 volts (330 W). Since you say 2 and 3 amps (rather than, say, 2.0 and 3.0 amps) I assume there is some rounding, which could mean your two readings are a lot closer to each other than you think.
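If it helps to see the arithmetic, here's a quick Python sketch of that watts = amps × volts point (the numbers are just illustrative, not your actual readings):

```python
# Power (watts) is current (amps) times voltage (volts)
def watts(amps, volts):
    return amps * volts

print(watts(2.4, 140))  # 336.0 W
print(watts(3.0, 110))  # 330.0 W -- nearly the same power despite very different amps

# A display that rounds to whole amps would show 2 and 3 and hide that detail
print(round(2.4), round(3.0))  # 2 3
```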
 
Aug 24, 2019
Thanks for the reply, and you're correct, some rounding was involved. The mining rig is plugged into a surge protector through a Spartan power meter, and when it's powered off solar it reads 2.089A and 209W, and when it switches to grid power it's 2.998A and 217W. The numbers fluctuate a bit, but that's a pretty close estimate. I'm not sure why the power usage would vary, though; it seems amps and watts should be the same regardless of where the power comes from. Solar and grid power both run through the transfer switch and then through the power meter to the miner, so the readings should be the same, right?
 
209 watts vs 217 watts ... about a 3% difference. I think you can chalk that up to power supply efficiency. Using your numbers (watts divided by amps), your solar system is putting out electricity at 100 V. Your house power is 72 V ... hmmm, I'm gonna guess you don't live in the United States.
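For what it's worth, those voltages just come from dividing the watt reading by the amp reading. A minimal Python sketch using the numbers you posted (assuming your meter readings are accurate):

```python
# Implied voltage = power (W) / current (A), per reading
solar_watts, solar_amps = 209, 2.089
grid_watts, grid_amps = 217, 2.998

print(solar_watts / solar_amps)  # ~100 V implied on solar
print(grid_watts / grid_amps)    # ~72 V implied on grid -- oddly low for US mains
```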
 
Ummm .. something is wrong then. US standard is 120V to the house ... yes, you hear people say 110V, but that is outdated. I use a "Kill a Watt" box to measure usage and my house (Texas) is getting 121V right now. I have a solar system on the roof and I will hit 125V when it's running.

The Spartan seems very similar to my Kill a Watt and looks to have a voltage reading. You should confirm the numbers because 72 V seems extremely low for anyone on the US grid.
 
Aug 24, 2019
It shows 115V on grid power and 114V on solar. So when it's running off grid power it's 2.9A, 217W, and 115V, and on solar it's 2.08A, 209W, and 114V. I also have a couple of Teckin smart plugs that monitor power usage (one plugged into the wall, and one on the inverter) that show similar numbers when they work. The power monitoring feature seems kind of a joke, though, and I don't really trust them.
 
115 V grid power and 114 V solar make a lot of sense.

2.9 amps times 115 Volts equals 334 Watts ... not the same as your box is reading.
2.08 amps times 114 Volts equals 237 Watts ... pretty close to what your box is reading.

Got to shake my head ... there is nothing I can think of that would give you a 100-watt difference in power usage. I assume you are measuring at the plug that goes to the computer. The extra 100 watts aren't going to charge the battery or something, are they?
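To make that mismatch concrete, here's a small Python sketch comparing amps × volts against the watt reading for each source (same assumption that the meter numbers you posted are accurate):

```python
# Compare calculated amps x volts against the meter's displayed watts
readings = {
    "solar": {"amps": 2.08, "volts": 114, "watts": 209},
    "grid": {"amps": 2.90, "volts": 115, "watts": 217},
}

for source, r in readings.items():
    calculated = r["amps"] * r["volts"]
    print(f"{source}: {calculated:.0f} W calculated vs {r['watts']} W displayed "
          f"(difference {calculated - r['watts']:.0f} W)")
```

The solar reading is reasonably consistent; the grid reading is where the big gap shows up.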
 
Aug 24, 2019
Yep, measuring at the plug, and also at the wall and the inverter. And I know the solar side is capable of providing higher wattage; I'm just not sure why it doesn't. I'm not complaining, just curious.