News Incoming RTX 5090 could need massive power stages for possible 600-watt power draw – Nvidia’s next flagship GPU rumored to sport copious amounts of...

WHEW! This thing is going to cost more than my next gaming PC, and draw nearly as much power.

I'm not the type who needs the latest AAA titles the instant they come out; I play less demanding games, so this won't be for me. But there will be a market for it, however niche it might be 😉
 
Technology companies have a big problem, at least on the consumer side. They can keep improving their products and explain, very technically and accurately, how they've improved. But eventually a product matures to the point of diminishing returns, where users struggle to feel a meaningful difference in the real-world experience of using it. Such improvements are nice and will always exist, but at some point your $80,000 BMW is good enough and you're not actively shopping for a $3,000,000 Bugatti.
 
We aren't really there yet, though (although from a price standpoint, I agree with you). Games still don't come close to "real life." I saw an AI video of a Grand Theft Auto game made to look photorealistic, and it was clear we still have a ways to go. No one will be able to afford it, though.
 
I wonder what Nvidia's gonna do when GPUs take more power than a North American outlet can supply.
That is easy: a 220V outlet like the one you use for your dryer and oven. Now you'll just have to wire one into your office. "Poor" people will have a double extension cord that lets them plug into two separate outlets in two different rooms.
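For fun, here's roughly what those two options buy you. This is a quick sketch, not wiring advice: the 30A rating for a dryer/oven circuit and the 80% continuous-load derating are my assumptions about typical North American wiring, not anything from the article.

```python
# Rough capacity of the two options above, assuming typical North American
# wiring: a 240 V / 30 A dryer-or-oven circuit, and two separate 120 V / 15 A
# circuits. The 0.8 factor is the common NEC continuous-load rule of thumb.

def continuous_watts(volts: float, amps: float, derating: float = 0.8) -> float:
    """Safe sustained draw for a circuit, derated for continuous loads."""
    return volts * amps * derating

dryer = continuous_watts(240, 30)             # the "wire one in your office" option
two_circuits = 2 * continuous_watts(120, 15)  # the double-extension-cord option

print(f"Dryer/oven outlet:        ~{dryer:.0f} W continuous")
print(f"Two separate 15A outlets: ~{two_circuits:.0f} W continuous")
```

Either way, the joke holds up: both options give you far more sustained wattage than a single standard outlet.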
 
I sit back and look at this. I bought my "best" (so far) GPU, a Sapphire Nitro RX 6700, at standard RX 6700 MSRP, back when finding one at MSRP was considered amazing.

And then did nothing with it for months. I haven't really gamed on it, though I ran a benchmark or two out of curiosity. My time available for gaming dropped off. Triple-V, my main machine (if you can call it that these days), sits there, not powered on for weeks at a time.

My daily driver, The Micro Machine, was handling my super-light gaming, MAME, and C64 emulation, and that was with an Athlon 200GE + 8GB RAM at 3840x1600 resolution. I was mostly playing Don't Starve when logged into Steam, though, as I recall, I replayed Portal and Portal 2 on that hardware. With the upgrades to 16GB RAM and a 3400GE, it handles more of the games I used to need a full PC + discrete GPU for. Borderlands 1 Enhanced (at 1920x800) is probably the most taxing thing I've tried on it. I imagine Borderlands 2 would run well there too.

Though, as I'd mentioned in a thread about what games we're playing, I've mostly been on Fallout 1, Pinball Arcade, and, through the C64 emulator, Zenji. Oh, and I dip into Berzerk on MAME every so often.

I'm trying to get my GF into a little bit of gaming as well. She absolutely LOVES watching me play Don't Starve, as she finds the art style very endearing. I'm trying to get her to try the original Portal, which runs just fine on the iGPU of The Swirl at 2560x1080. I'm mostly pushing this because Portal 2 (and I think Portal 1 also?) has some interesting two-player bits to try.

Oh, and, of course, when I fired up THIS bad boy, admittedly back in mid-2018 after it sat in a box disused for about a quarter century, I spent WAY more time on that particular game than I would've imagined.


Amazing graphics are awesome, don't get me wrong. But games can be HIGHLY enjoyable without breaking the bank or pushing the limits of technology.


EDIT: originally forgot link for The Micro Machine
 
WHEW! This thing is going to cost more than my next gaming PC, and draw nearly as much power.

I'm not the type who needs the latest AAA titles the instant they come out; I play less demanding games, so this won't be for me. But there will be a market for it, however niche it might be 😉
At 600 watts, it uses more power than my entire current PC at full load, lol.
 
A standard outlet allows around 2,200 W (in Europe, at least).
Yes, here in New Zealand (and also Australia and Europe) we use a 240V wall standard, which means half the current in the wiring for a given load. They usually put a 10A fuse on a wall socket (for a total draw of 2,400W), but you can slap a 20A fuse on it, no worries (and many people do, if the fuse keeps popping on that loop), to give an eye-watering 4,800W from a single wall socket.
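Those figures check out with simple P = V × I math. A quick sketch below, using only the voltages and fuse sizes claimed in this thread (Europe's nominal 230V gives ~2,300W peak, close to the 2,200W quoted above):

```python
# Sanity-checking the socket figures quoted above with P = V * I (peak ratings,
# no derating). Voltages and fuse sizes are the ones claimed in this thread.

sockets = [
    ("Europe, 230V / ~10A socket", 230, 10),
    ("NZ/AU, 240V / 10A fuse",     240, 10),
    ("NZ/AU, 240V / 20A fuse",     240, 20),
]

for label, volts, amps in sockets:
    print(f"{label:28s} {volts * amps:5d} W peak")
```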
 
IMO, the biggest problem today is the lack of optimization in games. The highest-end GPU doesn't matter if games aren't optimized to match. The PC release of Final Fantasy XVI is a recent prime example: there's no reason a 4090 should struggle to hit a consistent 60 FPS in that game, and yet it does.

The mid-range GPU of every generation (e.g., the 3070 or 4070) should, in theory, be more than anyone needs to enjoy games above baseline graphical quality and resolution. It's absurd that we've reached a point where one needs to purchase a $2k+ GPU, only to have it struggle to hit what should be considered the baseline.

Game devs need to start doing their part again and go back to ensuring games are properly optimized for the masses, not just for those who can afford the highest-end GPUs. If they can't have a game optimized by release time, they should delay the release until it is.
 
Not sure what you are talking about. I am currently 30 hours into FF16 with a 4090 and am averaging 80 to 100 FPS with all settings maxed out at 4K with HDR.
 
"Most" houses were not built in the last 20 years though.
Which isn't relevant to the demographic that is likely to be buying this product. The average 5090 owner isn't going to be living in a 40-year-old mobile home with shoelaces for wiring. We're also not nearing the limit of 15A circuits just yet; there's time for more of the housing market to catch up.
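To put rough numbers on that headroom point, here's a hedged sketch. Everything in it except the rumored 600W GPU figure (the CPU, peripherals, and PSU efficiency) is my own guess, so treat the output as a ballpark, not a measurement:

```python
# Rough headroom check for a rumored 600 W GPU on a North American
# 120 V / 15 A circuit. Every figure except the rumored 600 W is a guess.

CONTINUOUS_LIMIT = 120 * 15 * 0.8  # ~1440 W sustained (NEC rule of thumb)

gpu = 600              # rumored 5090 board power (from the article)
cpu = 250              # hypothetical high-end CPU under load
rest = 100             # motherboard, RAM, storage, fans (guess)
psu_efficiency = 0.90  # assuming a decent 80 PLUS Gold-ish unit

wall_draw = (gpu + cpu + rest) / psu_efficiency  # DC load -> AC wall draw
monitor_etc = 60       # monitor and peripherals on the same circuit (guess)

total = wall_draw + monitor_etc
print(f"Estimated wall draw: {total:.0f} W of {CONTINUOUS_LIMIT:.0f} W continuous")
print(f"Headroom:            {CONTINUOUS_LIMIT - total:.0f} W")
```

Under those assumptions, even a maxed-out 600W-GPU build leaves a few hundred watts of headroom on a single 15A circuit.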