Question 6700 XT vs 3070 RTX, raw power?

Aug 5, 2021

yeetbucket

FSR will never make up for the performance hit of turning on RT, because AMD cards are not designed for it. AMD cards are designed with rasterization in mind, and that is that for the 6000 series.
 
Aug 5, 2021
3070 is more powerful and DLSS is miles ahead of FSR
I read multiple technical sources saying the 6700 XT's raw performance is actually around the RTX 3080's, so once games are optimized for AMD (the consoles are all AMD, so most games may be soon), we could see the 6700 XT in the 3070-3080 zone. What are your thoughts on this?
 
Aug 5, 2021
As I understood from a PC engineer/dev, RT will soon be much easier to work with and won't be an Nvidia "best" anymore, as RT will become more software based and run on any GPU. Right now RT is mostly developed for Nvidia's RT cores. He also said that once AMD's tweaks land and FSR shines in games, we have every reason to see the 6700 XT running in the 3080 zone.

I also think FSR is only at 1.0 now and already some games are showing better quality than DLSS, like: https://www.reddit.com/r/nvidia/comments/otewu3/dlss_vs_fsr_in_edge_of_eternity_what_a/

I know it's not better in most games... but it is in some... so FSR 1.5 - 2.0 might bring more quality, the same way DLSS 2.0 did. Also, FSR 1.0 is still very early, right? So who knows, with good optimization it could be pretty good even before the next version comes.
 

Howardohyea

DLSS 1.0 is pretty poor performance- and image-wise, plus it required tons of computational power to train a model per game before it could be implemented.
 

Fiorezy

As I mentioned in your other post, Nvidia's RT and DLSS are more mature than AMD's counterparts. Don't bother yourself with comparisons and enjoy your card while you can.
 

yeetbucket

I read multiple technical sources saying the 6700 XT's raw performance is actually around the RTX 3080's, so once games are optimized for AMD (the consoles are all AMD, so most games may be soon), we could see the 6700 XT in the 3070-3080 zone. What are your thoughts on this?
I don't think this will ever happen. The 6700 XT will never be in the 3080 zone in RT or even rasterization; the 3080 is miles ahead of the 6700 XT in performance, and the 3070 is as well. You must have bought a 6700 XT and are trying to justify it by making it out to be better than it is.
Raw power wise, you guys are saying the 3070 and 6700 XT are the same? If so, then once FSR goes full scale, and especially when FSR 2.0 comes out, could we possibly see the RT performance match, thanks to FSR?

https://www.youtube.com/watch?v=LMD2QNW_hyQ


One article I found:
RT performance won't match. FSR isn't that good. Even with DLSS on Quality, my 3080 gets 65-80 FPS in CP2077, and the 6700 XT will never match that.
 
If you only count the shader cores, the RTX 3070 has more raw power than the 6700 XT. This is theoretical performance, but it's something you can calculate from hardware specs alone; it doesn't require software to measure.

In any case, the formula for theoretical performance is shader cores times clock speed times two. The times two at the end is because GPUs can multiply and add in the same cycle. The figures for the GPUs in question are (using the default base clock speed):
  • RTX 3070: 5888 shaders * 1.500 GHz * 2 = 17.664 TFLOPS
  • 6700 XT: 2560 shaders * 2.321 GHz * 2 = 11.883 TFLOPS
Even without the Tensor cores and extra RT hardware (RDNA 2's Ray Accelerators handle intersection tests but leave BVH traversal to the shaders, whereas NVIDIA's RT cores do both), the 3070 handily beats the 6700 XT in raw, theoretical power.
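To make the arithmetic above easy to check, here's a minimal Python sketch of that formula, using the base-clock figures from the list (boost clocks would give higher numbers):

```python
def peak_tflops(shader_cores: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: cores x clock x 2.
    The x2 is because one fused multiply-add counts as two
    floating-point operations per cycle."""
    return shader_cores * clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

# Base-clock specs quoted in the post above.
rtx_3070 = peak_tflops(5888, 1.500)
rx_6700xt = peak_tflops(2560, 2.321)

print(f"RTX 3070: {rtx_3070:.3f} TFLOPS")   # 17.664
print(f"6700 XT:  {rx_6700xt:.3f} TFLOPS")  # 11.884
print(f"3070 advantage: {rtx_3070 / rx_6700xt - 1:.0%}")
```

The last line also shows where the "roughly 50% more powerful" claim below comes from: 17.664 / 11.884 is about a 49% advantage on paper.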

So then you might ask: if the RTX 3070 is supposedly about 50% more powerful than the 6700 XT, how come it isn't 50% faster in games? Honestly, I don't know the answer. It could be there's a hardware bottleneck somewhere that won't let the GPU bare its fangs. It could be we don't have software that lets the hardware go full speed (outside of 3DMark, I haven't seen anything leverage mesh shaders and the other under-publicized features introduced with Turing).

Note that the tables were flipped a while back: NVIDIA was making GPUs that weren't as powerful as AMD's in terms of raw performance, yet the two were basically neck and neck in games.
 
