News Super Flower’s beastly 2800W power supply lands at $899 — enough juice to power a couple of RTX 5090 GPUs

...and there you're wrong. But why in the world is anyone running 30 A 240 V for standard electronics? 15 A is all you need, and it only needs standard wiring. In my house, white plugs are 120 V, 15 A, and black plugs are fed by a whole-house UPS at 240 V, 15 A. All computers are plugged into those outlets.

Not my main job, but I participate in datacenter build out and sometimes get cast off old equipment. When I built my house, I put some of that stuff to good use.

Thanks for proving my point about reading comprehension. And also thanks for saying your house wiring isn't standard in so many words!
 
Appliances can draw 3,000 watts from a standard UK wall socket, and in Western Europe that goes up to almost 3,700 watts. However, with heavy-duty cabling you can get over 7,000 watts of juice to your UK PSU. After that, we are talking 400-volt three-phase, which you will have to share with your Tesla charger.
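Those socket limits are just volts times amps. A quick sketch using common nominal ratings (assuming 230 V mains; real limits depend on local code and breaker derating):

```python
# Rough continuous-power ceiling per circuit: watts = volts * amps.
# Nominal figures only; actual limits depend on local wiring code.
circuits = {
    "UK 13 A plug (230 V)": 230 * 13,        # ~3.0 kW
    "EU 16 A socket (230 V)": 230 * 16,      # ~3.7 kW
    "UK 32 A dedicated circuit (230 V)": 230 * 32,  # ~7.4 kW
}
for name, watts in circuits.items():
    print(f"{name}: {watts} W")
```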

I'm looking forward to new innovations in PSU technology!
You might also be looking forward to new innovations in air conditioning technology to go along with them, because 3.7 kW is ~12,600 BTU/h and 7 kW is ~23,900 BTU/h of heat output. It will require air conditioning. Even in the dead of winter, you'd at least have to open a window, because that's too much heat for one room.
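Essentially all the electrical power a PC draws ends up as heat in the room, and the conversion to the BTU/h figures used for air-conditioner sizing is a single constant (1 W = 3.412 BTU/h):

```python
# Convert electrical load (which becomes room heat) to BTU/h,
# the unit air conditioners are sized in. 1 W = 3.412 BTU/h.
BTU_PER_HOUR_PER_WATT = 3.412

def watts_to_btu_per_hour(watts):
    return watts * BTU_PER_HOUR_PER_WATT

print(watts_to_btu_per_hour(3700))  # 12624.4 BTU/h
print(watts_to_btu_per_hour(7000))  # 23884.0 BTU/h
```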

Nobody is going to build a PC needing nearly 3 kW; you'd melt, for starters. And even if you have AC, can it take an additional 3 kW of load in a single room?
A quick search shows a 6,000 BTU/h window air conditioner provides about 1.76 kW of cooling, so a near-3 kW PC would overwhelm it.
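A quick sizing check (assuming, to first order, that all PC power becomes room heat) shows one such window unit couldn't keep up:

```python
# Can a single 6,000 BTU/h window unit absorb a 3 kW PC's heat?
# Assumes all of the PC's electrical draw ends up as room heat.
BTU_PER_HOUR_PER_WATT = 3.412

pc_load_w = 3000
required_btu_h = pc_load_w * BTU_PER_HOUR_PER_WATT
window_unit_btu_h = 6000

print(required_btu_h)                      # 10236.0 BTU/h needed
print(required_btu_h > window_unit_btu_h)  # True: one unit isn't enough
```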

Then there's the why: even if you could get your hands on 4x 5090s and managed to wire them up without burning your house down four times over, what are you going to do with them? SLI is dead, so you can't use them for gaming; realistically we're talking rendering or AI.
AI. You could build a multi-GPU training machine.

This PSU has approximately two real customers across the world.
In the past, it wasn't too uncommon for AI researchers to build multi-GPU workstations, which is by far the cheapest way to do lots of training. However, that's not going to work for big models like LLMs; those need a whole network of machines. That doesn't make the multi-GPU workstation completely irrelevant, it just narrows it down to an even smaller set of people.
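For scale, a back-of-envelope power budget for such a box. The 575 W figure is Nvidia's published RTX 5090 power rating; the CPU and platform numbers are illustrative assumptions:

```python
# Rough PSU budget for a hypothetical 4x RTX 5090 workstation.
gpu_tdp_w = 575      # Nvidia's published RTX 5090 power rating
num_gpus = 4
cpu_w = 350          # assumed high-end workstation CPU
platform_w = 150     # assumed drives, fans, RAM, conversion losses
psu_rating_w = 2800

total_w = gpu_tdp_w * num_gpus + cpu_w + platform_w
print(total_w)                   # 2800 W at full tilt
print(total_w <= psu_rating_w)   # True, but with zero headroom
```

Which is to say: the headline isn't wrong that this PSU exists for exactly this kind of multi-GPU build, and it would be running at its limit.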
 
Thanks for proving my point about reading comprehension. And also thanks for saying your house wiring isn't standard in so many words!
Care to mention which of your points I'm failing to grasp? I'd love to get your viewpoint, but as you've said, I'm missing it.

All I'm saying is 240 V isn't hard to come by in residential wiring (it's the default panel feed, and it works over standard house wiring using two hots instead of one hot and one neutral), and it is indeed the standard in datacenters, where this PSU is likely to be used.
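The "two hots" point is just North American split-phase: two 120 V legs 180° out of phase, so leg-to-neutral gives 120 V and leg-to-leg gives 240 V. A quick numerical sketch:

```python
# North American split-phase service: two 120 V RMS legs, 180
# degrees apart. Measuring leg-to-leg doubles the voltage.
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

peak = 120 * math.sqrt(2)                 # peak of a 120 V RMS sine
ts = [i / 1000 for i in range(1000)]      # one full cycle, 1000 samples
leg_a = [peak * math.sin(2 * math.pi * t) for t in ts]
leg_b = [peak * math.sin(2 * math.pi * t + math.pi) for t in ts]

print(round(rms(leg_a)))                                   # 120 (leg to neutral)
print(round(rms([a - b for a, b in zip(leg_a, leg_b)])))   # 240 (leg to leg)
```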