News Incoming RTX 5090 could need massive power stages for possible 600-watt power draw – Nvidia’s next flagship GPU rumored to sport copious amounts of...

Technology companies have a big problem - at least on the consumer side. They can keep improving their products and explain, in very technical and accurate terms, how they are improved. Eventually, though, the product matures to the point of diminishing returns, where users struggle to feel a meaningful difference in real-world use. Such improvements are nice and will always exist, but at some point your $80,000 BMW is good enough and you're not actively looking to buy a $3,000,000 Bugatti.

We aren't really there yet though (although from a price standpoint, I agree with you). Games still don't really come close to "real life" yet. I saw an AI video of Grand Theft Auto made to look photorealistic, and it was clear we still have a ways to go. No one will be able to afford it though.

But buying an RTX 5090 isn't going to turn GTA into a photorealistic game, and by the time a GTA with that level of graphics is released, plenty of cheaper graphics cards will be approaching 5090 performance at far lower power. My 6700 XT significantly outperforms a Titan X, the first $1,000 card (released ten years ago, so equivalent to roughly $1,300 today according to some).

Somebody buying a 5090 is likely only going to see improvements if they measure the benchmark numbers. There might be games only playable at 4K/Ultra/RT on a 5090, but these days turning the details down a few notches only makes a difference when you examine static screenshots side by side. (Even the ray-tracing stuff is too often a minimal improvement.) I busted out Mankind Divided again recently, a 2016 game I could play perfectly well on my old R9 290 with some of the details turned down. Now I'm playing it for the first time with everything maxed out, and I can't say I notice any improvement.

Then again, that's not a problem for the graphics card companies as long as they can sell the things. An $80,000 BMW might be enough, but Bugatti still easily sold out of the Chiron.
 
Yep. Agreed. Progress with each iteration is definitely slowing down, while prices are not.
 
Yes, here in New Zealand (and also Australia and Europe) we use a 240V wall standard (which is safer, since it's the amps that kill you). They usually put a 10A fuse on a wall socket (for a total draw of 2,400W), but you can slap a 20A fuse on it no worries (and many people do, if the fuse keeps popping on that loop) to give an eye-watering 4,800W from a single wall socket.
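For anyone who wants to sanity-check those numbers, here's a quick back-of-envelope Python sketch (mine, not from the thread): watts available from a single socket at the voltages and fuse ratings mentioned above, and the headroom left over for a rumored ~600 W RTX 5090. All of the figures are assumptions pulled from this discussion and the article headline.

# Rough sketch: socket capacity (P = V * I) vs. a rumored ~600 W GPU.
def socket_watts(volts: int, amps: int) -> int:
    return volts * amps

GPU_WATTS = 600  # rumored 5090 draw from the article headline

for volts, amps in [(240, 10), (240, 20), (230, 10)]:
    total = socket_watts(volts, amps)
    print(f"{volts} V @ {amps} A fuse: {total} W total, "
          f"{total - GPU_WATTS} W left for the rest of the system")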
10 A, 230 V, 50 Hz in most of Europe, just FYI.
 

Yes, because it's the amps that kill you. So if you have a low-voltage outlet (e.g. 110V or 50V), you need a lot more amps (20A or 40A respectively) to deliver the same ~2,000W as a higher-voltage outlet, where 10A is usually enough at 240V. Electric fences are usually 5,000V - perfectly safe, since they run them at 0.1 amps.
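To make that arithmetic explicit, here's a minimal Python sketch of I = P / V; the ~2,000 W target and the three voltages are taken from the post above, and the post's 10A / 20A / 40A figures are roughly these values rounded up to common fuse sizes.

# Current needed to deliver a fixed power at different outlet voltages (I = P / V).
# This is only about power delivery, not about which voltage is safer to touch.
target_watts = 2000
for volts in (240, 110, 50):
    amps = target_watts / volts
    print(f"{volts} V outlet: about {amps:.1f} A needed for {target_watts} W")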
Current is a function of voltage: for a given resistance (like your body's), Ohm's law says higher voltage pushes more current through it. All else being equal, 240V is more dangerous than 120V. The fact that the 240V mains might be on a 10A breaker rather than 15 or 20 doesn't really matter, because 10A is still way more than enough to kill you.

Electric fences generally won't hurt you because they generate voltage pulses that don't contain a ton of energy. Sort of like a repeating static shock (which can be in the range of 10s of thousands of volts). If it could actually source 0.1 A continuously at 5000V it'd likely be dangerous (although DC is generally a bit safer than 50/60 Hz AC, if I recall correctly).
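A rough Ohm's-law sketch of the point above, in Python; the ~1,000 ohm body resistance and ~30 mA danger threshold are common ballpark figures, not measurements from this thread.

# Body current at a fixed resistance (I = V / R): higher voltage -> more current.
BODY_RESISTANCE_OHMS = 1000
DANGER_THRESHOLD_A = 0.03  # roughly where sustained 50/60 Hz AC becomes dangerous

for volts in (120, 240):
    amps = volts / BODY_RESISTANCE_OHMS
    level = "above" if amps > DANGER_THRESHOLD_A else "below"
    print(f"{volts} V across ~{BODY_RESISTANCE_OHMS} ohm: "
          f"{amps * 1000:.0f} mA ({level} the ~30 mA danger level)")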
 