News AMD Boasts About RDNA 3 Efficiency as RTX 40-Series Looms

aalkjsdflkj

Honorable
Jun 30, 2018
I hope the hype from AMD is somewhat close to accurate. With rising energy costs for both running my gaming PC and A/C in summer, efficiency is my #1 consideration after price/performance. I have to turn down a lot of knobs to get my 1070 to hit 60 fps in new games at 3440x1440 UWQHD. I'd love to stay near that 150/170W range with either an NVIDIA 4xxx or RDNA3 GPU but with a lot more eye candy on than I can currently run.
 
I hope the hype from AMD is somewhat close to accurate. With rising energy costs for both running my gaming PC and A/C in summer, efficiency is my #1 consideration after price/performance. I have to turn down a lot of knobs to get my 1070 to hit 60 fps in new games at 3440x1440 UWQHD. I'd love to stay near that 150/170W range with either an NVIDIA 4xxx or RDNA3 GPU but with a lot more eye candy on than I can currently run.

Just go back to 1080p 60 Hz and use a more power-efficient GPU.
 

aalkjsdflkj

Honorable
Jun 30, 2018
I would like to have some of that whatever you are having plz! o_O
I'm honestly curious about your comment. What about hoping for better performance at 150/170W in the upcoming generations is unreasonable? I had been looking at the 6600XT (160W TBP) and 3060 (170W) for a 50-70% increase in framerates at 1440. Both of those cards would meet my needs but have been priced too high. Do you think the 4xxx series and RDNA 3 will be worse than those cards? People are undervolting 3070s to get in that range as well.
Efficiency (more eye candy or FPS per watt) increases each generation. I'm just looking to stay on the lower-powered part of the range.
 
It seems AMD will never realize that it's not hardware but software that makes them fall way behind NVIDIA...

Since when? AMD has made huge strides over the years. I've had three AMD GPUs since the 7970: the RX 580 and the 5700 XT. If you include the APUs, I've had a 2400G and a 3400G. I have had zero driver/hardware issues. ZERO.

That's an old statement from 10 years ago about buggy drivers. While they aren't perfect, with isolated edge cases, I honestly believe them to be no better or worse than NVIDIA.
 

aalkjsdflkj

Honorable
Jun 30, 2018
I mean just go back to a regular 1080p 60 Hz monitor. If 27-inch 1080p looks awful, then just get a 24- or 21-inch one.
I "need" an ultrawide (upgraded from dual-head 1080p) for work. Most of my work involves having detailed engineering drawings on the left half of the screen with documents on the right. A large UWQHD is perfect for my workflow. And most of the games I play benefit more from the extra screen real estate than from higher FPS.
 
I think the comment about software is related to DLSS. Turning it on gives Nvidia a huge advantage over AMD for similarly performing cards. I would also reference Ray Tracing, but that is partially custom cores and partially software/AI.

That's what FSR and FSR 2+ are for. They aren't perfect, and DLSS does some things better, but it's very much the same functionality, and in the case of FSR it's not even vendor-specific.

https://www.tomshardware.com/features/amd-fsr-vs-nvidia-dlss
 

TheOtherOne

Distinguished
Oct 19, 2013
I'm honestly curious about your comment. What about hoping for better performance at 150/170W in the upcoming generations is unreasonable? I had been looking at the 6600XT (160W TBP) and 3060 (170W) for a 50-70% increase in framerates at 1440. Both of those cards would meet my needs but have been priced too high. Do you think the 4xxx series and RDNA 3 will be worse than those cards? People are undervolting 3070s to get in that range as well.
Efficiency (more eye candy or FPS per watt) increases each generation. I'm just looking to stay on the lower-powered part of the range.
Comparing a three-generations-old card (the 1070) with the upcoming 40XX generation, while hoping to keep a similar TBP but also get "a lot more eye candy", and at the same time not pay too much.
Yup, I would really like to have some of that too! 😁
 

Ogotai

Reputable
Feb 2, 2021
Since when? AMD has made huge strides over the years. I've had three AMD GPUs since the 7970: the RX 580 and the 5700 XT. If you include the APUs, I've had a 2400G and a 3400G. I have had zero driver/hardware issues. ZERO.

That's an old statement from 10 years ago about buggy drivers. While they aren't perfect, with isolated edge cases, I honestly believe them to be no better or worse than NVIDIA.

The 7970 I have is still running just fine as an almost-daily-use card in my second computer, and the two AMD-based notebooks I also have, while not used daily, are running just fine too.
 
I think the comment about software is related to DLSS. Turning it on gives Nvidia a huge advantage over AMD for similarly performing cards. I would also reference Ray Tracing, but that is partially custom cores and partially software/AI.

I will give you that. If you are into RT, then NVIDIA is your choice.


The difference between AMD's FidelityFX Super Resolution and DLSS is significantly smaller than it used to be. In high-motion scenes it's rarely a significant issue, unless artifacts like shimmering show up.
 

aalkjsdflkj

Honorable
Jun 30, 2018
Comparing a three-generations-old card (the 1070) with the upcoming 40XX generation, while hoping to keep a similar TBP but also get "a lot more eye candy", and at the same time not pay too much.
Yup, I would really like to have some of that too! 😁

I'm still confused but I guess I'll just stay that way. I don't think there's anything unusual in expecting an increase in performance for the same TDP after waiting 3 generations. It would be pretty shocking if a 150-170W 4xxx series card performed exactly the same as a 1070. That's all I'm saying - I hope there's a 150-170W card in this upcoming generation, and it sounds like AMD is going to be a better choice for people like me who are buying lower TDP cards.
 

lmcnabney

Prominent
Aug 5, 2022
I'm still confused but I guess I'll just stay that way.

You aren't confused at all. Power requirements for components go up and down.

For example, AMD pushed their CPUs to around 220W with the FX-9XXX series chips but came back down into the 50-70W range with Zen.
Intel has done the same with the Pentium 4 and most of their i9 chips.

I do think that there isn't enough 'negative' associated with just cranking up power to meet the need for more performance. I have some hope that the mid-tier 4XXX / 7XXX series will offer a more rational power draw. The RTX 40 series so far has not shied away from demanding more power. It remains to be seen whether AMD might aim for slightly less performance in exchange for lower power requirements.

When budgeting for a PC, people really should factor in the power cost (CPU + GPU + cost to cool the chips + cost to cool the house).
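A rough back-of-the-envelope sketch of that budgeting idea (the wattage, hours, electricity price, and cooling overhead below are made-up example numbers, not figures from this thread):

```python
# Rough yearly running-cost estimate for a PC component (all inputs are assumptions).
def annual_power_cost(watts, hours_per_day, price_per_kwh, cooling_overhead=0.3):
    """Estimate yearly electricity cost in currency units.

    cooling_overhead crudely models the extra A/C load from dumping
    the component's heat into the room (0.3 = +30%).
    """
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * (1 + cooling_overhead) * price_per_kwh

# Example: a 170 W GPU, 3 hours of gaming per day, $0.15/kWh:
cost = annual_power_cost(170, 3, 0.15)
print(f"${cost:.2f} per year")  # → $36.30 per year
```

Run over the expected lifetime of the card, a difference of 100 W or more between two candidates can noticeably shift the real price/performance comparison.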