News: GeForce RTX 4060 May Consume More Power Than an RTX 3070

That might be true. The quest for more performance is always constrained by power consumption, so one way to boost performance in the next generation is simply to throw more power at it. Worked for Intel, right? But it also means we could "upgrade" to lower-numbered cards just to keep power draw the same while gaining FPS.
 
Plus, say goodbye to the idea that a computer with good performance can be compact or quiet.
No need to say goodbye; you just need to adjust your expectations.

Game developers will always put the bulk of their effort into making sure their games play and look good enough on the lowest common denominator (the recommended spec) they want to officially support, to avoid alienating a large chunk of their potential market. If gamers start rejecting overpriced, power-hungry GPUs, there will be increased focus on maintaining "good performance" (the devs' opinion, which may differ from yours) on older/lower-end hardware.
 
Just run the AC for a bit longer, or lower its setting a bit. Electricity is cheap. /s

Some may start caring when the room temperature goes up 5C or so after about an hour of playtime.

Case design is going to have to change.
If the PC is still in the same room as the user, what would a new case design change, with all of that energy still being released into the room?
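For a sense of scale, here's a rough lumped-parameter sketch in Python of a PC heating a closed room. The room size, loss coefficient, and 500W draw are made-up illustrative numbers, not measurements; the physics point is just that every watt the PC draws ends up as heat in the room, no matter what the case looks like.

```python
# Rough lumped model: PC heat input vs. heat lost through walls/ventilation.
# All parameters below are assumptions chosen for illustration.
AIR_DENSITY = 1.2     # kg/m^3
AIR_CP = 1005.0       # J/(kg*K), specific heat of air
ROOM_VOLUME = 40.0    # m^3, roughly a 4 m x 4 m x 2.5 m room (assumption)
LOSS_COEFF = 100.0    # W/K leaking out through walls/ventilation (assumption)
PC_POWER = 500.0      # W of heat the PC dumps into the room (assumption)

heat_capacity = AIR_DENSITY * AIR_CP * ROOM_VOLUME  # J/K, room air only
delta_t, dt = 0.0, 10.0  # degrees above baseline, seconds per Euler step

for step in range(int(3600 / dt) + 1):  # simulate one hour of play
    if step % 120 == 0:  # report every 20 minutes
        print(f"t = {step * dt / 60:3.0f} min: +{delta_t:.1f} C")
    net_power = PC_POWER - LOSS_COEFF * delta_t  # heating minus losses
    delta_t += net_power * dt / heat_capacity

# Settles near PC_POWER / LOSS_COEFF = +5 C above baseline.
```

With these made-up numbers the room levels off about 5C warmer, right in line with the observation above; a better-ventilated case only moves the heat out of the case faster, it doesn't reduce it.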
 
I suspect I'm in the minority, but I just won't buy a GPU that has high power consumption numbers. I'm very sensitive to noise and my computer room gets direct sunlight in the afternoon and evening, so I need to keep temperatures down as well as noise. Hopefully the rumors that AMD's upcoming GPUs will be efficient are true, because there's no way I'm buying anything like a 300+W GPU.
Worst case, I'll just undervolt, turn on frame-rate limits, or find some other way to run a power-hungry GPU a little more efficiently.
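For what it's worth, capping board power doesn't take anything exotic. Here's a minimal sketch using the nvidia-ml-py bindings to NVML (the same interface nvidia-smi's -pl option drives); the 200W target is an arbitrary example, and applying the limit typically requires administrator/root privileges.

```python
# Minimal power-limit sketch via NVML (pip install nvidia-ml-py).
# The 200 W target below is an arbitrary example value.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Ask the driver what range it will accept (values are in milliwatts).
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"Allowed power-limit range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

# Clamp the example 200 W target into that range and apply it.
target_mw = min(max(200_000, min_mw), max_mw)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

# Current draw, for a sanity check.
print(f"Current draw: {pynvml.nvmlDeviceGetPowerUsage(handle) / 1000:.0f} W")
pynvml.nvmlShutdown()
```

A modest cap like this usually costs far less performance than it saves in watts, since cards tend to ship clocked well past their efficiency sweet spot.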
 
Simply don’t buy it and don’t support it if you’re not fine with GPUs using over 400 or over 500W of power. If you ask me, 350W is already a lot, and now they don’t care anymore. They’ve crossed a line.
 
A jump from 300W to 350W in a single generation could pose a serious problem to budget gamers who want to either build a new system or upgrade their system with a new GPU.

People are going to need to recalibrate where each performance category lies for this generation. From all indications, the 4060 is not going to be a card for "budget gamers." With the aim of having usable ray tracing from top to bottom of the product stack, there's no longer going to be a sub-$200 entry-level dGPU offering. If you're a budget gamer, an APU is fast enough now for 1080p gaming. The 4050 is likely going to replace the 3060 in the current stack, with a price to match but better performance. If you're looking for something cheaper than that, it's possible Nvidia may continue producing lower-end Ampere GPUs to cater to "budget" users. Nvidia may also be perfectly fine letting consoles be the best option for budget gamers rather than torching their ASP with low-end Ada GPUs.
 
This is not surprising. The 4090's power consumption is very likely to be the same as the 3090 Ti's, if not higher. So if your flagship card is already a 500-600W monster, it's reasonable to think that a mid-range card will land at least in the 200-300W range.

Gaming cards' power consumption is getting ridiculous now.
 
Nvidia may also be perfectly fine letting consoles be the best option for budget gamers rather than torching their ASP with low-end Ada GPUs.
Forcing budget PC gamers out of the PC gaming market is a dangerous proposition: if you make the potential market smaller, it may end up not being worth some studios' trouble, and with costs needing to be recovered across a smaller potential buyer base, prices will go up some more. PC gaming could be on its way down a death spiral.
 
I would not consider 3070+ performance "budget" performance.
With the cost of silicon wafers increasing, it is not feasible to make RT-enabled budget cards or sub-$200 cards.
The die space needed for the GPU's RT/shader/CUDA/encoder/decoder cores to make it enjoyable, plus the memory requirements, make it impossible at this time.
I expect more refreshes of older parts for the budget market, and maybe a limited 3030/40 once all of the defective dies have been sorted.
 
With the cost of silicon wafers increasing, it is not feasible to make RT-enabled budget cards or sub-$200 cards.
If Nvidia wanted to make a sub-$200 entry-level RT GPU, it could, by using the GA107 die to make RTX 3050s instead of the GA106, which is twice as big. While it wouldn't set any RT performance records, it would still run circles around anything else previously available new anywhere near $200.
 
I would not consider 3070+ performance "budget" performance.
With the cost of silicon wafers increasing, it is not feasible to make RT-enabled budget cards or sub-$200 cards.
The die space needed for the GPU's RT/shader/CUDA/encoder/decoder cores to make it enjoyable, plus the memory requirements, make it impossible at this time.
I expect more refreshes of older parts for the budget market, and maybe a limited 3030/40 once all of the defective dies have been sorted.
I don’t think RT will be special anymore with the 4000 generation; even almost all 3000-series cards besides the 3050 handled it easily (with DLSS, of course, and a resolution suited to the card). With every generation that passes, RT will become more normal, and at some point the talk about fully path-traced graphics will start, but that’s still a good way off. Current RT is a joke compared to “real” full-screen path tracing.

edit: by “not special” anymore I meant that the performance impact will become smaller and smaller; depending on resolution, the high-end cards of the 4000 generation won’t even need DLSS anymore when RT is activated. At 1440p, that is; 4K could be on the edge without DLSS. A good thing for upscaler doubters and people who simply want more fps.
 
With every generation that passes, RT will become more normal, and at some point the talk about fully path-traced graphics will start, but that’s still a good way off. Current RT is a joke compared to “real” full-screen path tracing.
You don't have to wait for the future: Minecraft RTX and Quake II RTX are fully path traced.

But if you want hardware that doesn't need to do any de-noising or whatever, yeah, no. We'll probably be six feet under before we get something affordable that can do that. Even Pixar has to resort to baking things in to get render times per frame down, and they're still on the order of "sometimes days per frame" (https://www.foundry.com/insights/film-tv/pixar-tackled-coco-complexity).
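To make the denoising point concrete, here's a toy Monte Carlo estimate of diffuse irradiance at a single surface point, the basic operation a path tracer repeats for every pixel and bounce. The sky function is invented for the demo; the takeaway is how wildly estimates scatter at the 1-2 samples-per-pixel budgets real-time path tracing can afford, which is exactly the noise a denoiser has to clean up.

```python
# Toy Monte Carlo irradiance estimate: noise vs. samples per pixel (spp).
import numpy as np

rng = np.random.default_rng(0)

def sky_radiance(dirs):
    # Hypothetical sky for the demo: brighter toward the zenith (+z is up).
    return 1.0 + 4.0 * np.maximum(dirs[..., 2], 0.0)

def sample_cosine_hemisphere(n):
    # Cosine-weighted directions about the +z normal (Malley's method:
    # uniform samples on a disk, lifted onto the hemisphere).
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    return np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)], axis=-1)

def estimate_irradiance(spp):
    # With pdf = cos(theta)/pi the cosine factor cancels, leaving the
    # estimator pi * mean(incoming radiance).
    return np.pi * sky_radiance(sample_cosine_hemisphere(spp)).mean()

# Exact answer for this sky is pi + 8*pi/3 ~= 11.52.
for spp in (1, 2, 64, 4096):
    runs = ", ".join(f"{estimate_irradiance(spp):5.2f}" for _ in range(6))
    print(f"{spp:5d} spp: {runs}")
```

The error only shrinks as 1/sqrt(N), so halving the noise costs four times the rays; that economics, not stubbornness, is why real-time path tracing leans on denoisers instead of brute force.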
 
You don't have to wait for the future: Minecraft RTX and Quake II RTX are fully path traced.
lol, no thanks. I didn’t ask for ancient graphics pumped up with modern technology.
But if you want hardware that doesn't need to do any de-noising or whatever, yeah, no. We'll probably be six feet under before we get something affordable that can do that.
Yeah, no; negative people have said similar things about technology in the past as well and were proven blatantly wrong in their predictions (hi, Bill Gates), so it’s better not to talk like this about the future. We will see, but I’m cautiously optimistic.

Aside from that, I didn’t say anything about denoising. If it works, it works; if it has subpar quality, perfection will come later.

If Ray Tracing isn’t the best path to perfect graphics, it will be replaced. Nobody said it is “the” way; it’s just the best graphics we have today, and nothing more is certain.
 
If Ray Tracing isn’t the best path to perfect graphics, it will be replaced. Nobody said it is “the” way; it’s just the best graphics we have today, and nothing more is certain.
A bunch of new game engines offer alternatives to RT that look just as good, or sometimes better, without needing any RT-specific hardware to achieve good frame rates. In all likelihood, simulated RT will improve so much faster than raw RT compute power that full-blown RT will end up relegated to being a fallback for effects developers haven't yet found a sufficiently convincing shortcut for, rather than being the primary render path.
 