News GeForce RTX 4070 vs Radeon RX 6950 XT: Which GPU Is Better?

Page 5 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

Thunder64

Distinguished
Mar 8, 2016
114
161
18,760
nVidia doesn't need marketing; they completely dominate the dGPU market, with greater than 80% market share.

Games now have ray tracing in their top settings. This will mean AMD gets decimated performance wise, and it's AMD's own fault. There is no longer any logical argument for treating RT as anything other than normal.

Only a very few people got AMD cards, and fewer still are AMD fans. The whole market is nVidia's. Thus DLSS and RT are the new normal. Why does FSR run on all cards? Because it has to run on nVidia GPUs or it's dead on arrival.

On the Steam hardware survey, GPUs:
AMD Radeon RX 6700 XT 0.53%
AMD Radeon RX 6600 0.49%
AMD Radeon RX 6600 XT 0.40%
AMD Radeon RX 6800 XT 0.24%
AMD Radeon RX 6900 XT 0.20%

RTX 3060 4.66%
RTX 3060 Laptop GPU 4.51%
RTX 2060 4.45%
RTX 3060 Ti 3.13%
RTX 3070 2.95%
RTX 3050 2.71%
RTX 3080 1.94%
RTX 2070 SUPER 1.45%
RTX 3070 Ti 1.37%
and on and on...

Is there anything more rare than a gamer with an AMD RX 6000 series GPU? People voted with their money for nVidia. They voted RT/DLSS, by a massive landslide majority. Players don't have a focus on raster, or this wouldn't happen. AMD lost; RT matters. It's the new normal. nVidia have all the market share and no real competition.

AMD are so wiped out by RT/DLSS that Intel is their direct competition in the dGPU market. It's not nVidia vs AMD; it's AMD vs Intel. nVidia won. AMD cannot compete.

Why don't you let real Tom's Hardware customers like me post, and take this crap to AMD's forums? nVidia completely won. The hottest feature is ray tracing, the standard for AAA games. Who has the best performance in a world of ray tracing? nVidia. Who has the best upscaling? nVidia. Who has >80% market share? nVidia. Who has the closest market share to AMD? Intel.

Who decides whether a card has enough VRAM? nVidia. Who decides the features a card should have? nVidia.


Nvidia still crushing the data center market


Stop talking as if what AMD does matters in the dGPU market.

AMD’s Graphics Card Market Share Gets Cut in Half as NVIDIA Accounts for ~90% of GPUs Sold



The focus on raster performance destroyed AMD, and so did ignoring AI.

So by your definition the P4 was great because it had 80% of the market despite getting killed by the Athlon 64 in just about every way? People buy what they know, or what their friends have. That doesn't mean it is better. It takes time for "mind share" (a phrase I hate) to change. Nvidia is doing a good job of losing that with high prices, misnamed cards, and planned obsolescence via lower VRAM. If AMD hadn't dropped the ball with RDNA 3, things could be very different right now.
 
Last edited:

zx128k

Reputable
Is it clear now why Nvidia is asking $600 for the 4070? There are some out there eager to buy it based solely on marketing
The market wants better ray tracing performance, not reviews based on raster performance. That, and the lack of a DLSS 3 equivalent, is harming AMD. nVidia is seen as better at ray tracing and upscaling. The market went with nVidia, not AMD. Thus the ray tracing performance of both cards is important, and not less so than raster.

Most AAA games now have ray tracing in their top settings. You can't ignore that to make raster performance the focus any longer. News websites have to follow the customer; reviews 100% have to. The market went with RT/DXR and upscaling. These were the most important features, and that is why AMD's market share is so bad, and why the RX 6000 series really has a low market share. No matter how much reviews pushed AMD products as being close to nVidia's top of the line in raster performance, gamers still bought nVidia.

You lot are still pushing raster performance as the primary focus, and no one cares, at least when it comes to buying a dGPU. This is why nVidia is winning.
 

Ogotai

Reputable
Feb 2, 2021
327
221
5,060
The market wants better Ray Tracing performance
agreed, BUT unless you have an x80 or x90 series card, RT on anything else sucks and kills performance too much. hell, even a co-worker with a 4080 turns RT off in the games he plays, cause the performance hit is just too much for his liking.


You lot are still pushing raster performance as the primary focus, and no one cares.
cause that is what some people need/want more than RT/DLSS, but you sure don't seem to understand that.

you sure seem to be pushing RT and DLSS, as that is ALL nvidia has to sell their cards over the radeons. IF someone couldn't care less about those 2 things, then it's pretty much a wash. i don't care about RT, nor do i care about my card HAVING to create FAKE frames (aka DLSS frame generation) to keep my FPS up. at this point, there is a good chance i would get a radeon over a geforce because of this, and the fact that geforce is quite a bit more expensive than radeon. i'm not paying for 2 features i won't use.

oofdragon is also correct: people keep paying nvidia's prices cause of marketing and mind share.

TBH, zx128k, this sounds like the SAME argument you were trying to make, and IMO failed at, with the "get a console" BS you kept pushing.
 
  • Like
Reactions: oofdragon
Nvidia's dominant share of the gaming-chip market increased 700 basis points in March to 83% of users, a B of A analyst says, citing Steam data that has a caveat. (6 Apr 2023, source)
Just for the record, the March Steam Hardware Survey was borked. We don't know why; Valve never explains things (or at least rarely explains them). But the April data (i.e. posted in early May) shows a major correction, so the numbers are mostly back in line with the February data.
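As a quick sanity check on the basis-point arithmetic in that quote (assuming 83% is the March figure, and using the convention that 100 basis points equal one percentage point), the implied pre-March share works out to 76%. A trivial sketch, with a made-up helper name:

```python
# 100 basis points = 1 percentage point, so a 700 bps rise "to 83%"
# implies the prior month's share was 83 - 7 = 76 percent.
# share_before() is an illustrative helper, not from any library.

def share_before(after_pct: float, rise_bps: float) -> float:
    """Back out the earlier share from the later share and the rise in bps."""
    return after_pct - rise_bps / 100.0

print(share_before(83.0, 700.0))  # 76.0
```

Which is exactly why a sudden 700 bps jump in one month looked suspicious and got corrected the next survey.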

That said, while I do understand Nvidia controls the GPU market to a large degree, I don't think that's a good thing. I think DLSS and DXR are useful and interesting technologies, but I really, REALLY wish Nvidia would stop with the proprietary stuff. Now, DLSS basically requires tensor cores, so I get that. DXR isn't closed source, but I do think SER, DMM, and OMM from the Ada architecture are extensions rather than base DXR features.

But CUDA has been good for Nvidia and overall bad for the market. This is why the DOE (Department of Energy) has commissioned exascale supercomputers with:
AMD Zen 3 + AMD CDNA 2 (Frontier)
AMD Zen 4 + AMD CDNA 3 (El Capitan)
Intel Sapphire Rapids + Intel Ponte Vecchio (Aurora)

There are big government forces actively pushing to shift away from Nvidia, and that speaks a lot to Nvidia's business practices — in both the consumer and data center markets.
 

Ogotai

Reputable
Feb 2, 2021
327
221
5,060
There are big government forces actively pushing to shift away from Nvidia, and that speaks a lot to Nvidia's business practices — in both the consumer and data center markets.
going by this, evga did have a reason to drop nvidia then? aren't there other companies that won't deal with nvidia because of this as well?
 

zx128k

Reputable
agreed, BUT unless you have an x80 or x90 series card, RT on anything else sucks and kills performance too much. hell, even a co-worker with a 4080 turns RT off in the games he plays, cause the performance hit is just too much for his liking.



cause that is what some people need/want more than RT/DLSS, but you sure don't seem to understand that.

you sure seem to be pushing RT and DLSS, as that is ALL nvidia has to sell their cards over the radeons. IF someone couldn't care less about those 2 things, then it's pretty much a wash. i don't care about RT, nor do i care about my card HAVING to create FAKE frames (aka DLSS frame generation) to keep my FPS up. at this point, there is a good chance i would get a radeon over a geforce because of this, and the fact that geforce is quite a bit more expensive than radeon. i'm not paying for 2 features i won't use.

oofdragon is also correct: people keep paying nvidia's prices cause of marketing and mind share.

TBH, zx128k, this sounds like the SAME argument you were trying to make, and IMO failed at, with the "get a console" BS you kept pushing.
I played RT on my RTX 2060 for years. Performance was fine. You can do the same on an RTX 3060, but not on an AMD RX 6600. So this is only true for AMD.

The people that don't want or need RT are a very small part of the market. So small they can basically be ignored, which is what tomshardware should do.

I avoided the March Steam Hardware Survey gpu results.

If you want games based on raster, get a console. Raster games and Lumen games don't need an AMD RX 7900 XTX or an AMD RX 6950 XT. You can get a whole console for less.

One company dominating the dGPU market can't be good for prices.
 

Ogotai

Reputable
Feb 2, 2021
327
221
5,060
I played RT on my RTX 2060 for years. Performance was fine.
your idea of "performance was fine" can be different than someone else's. what games were you playing, and at what frame rates?

The people that don't want or need RT are a very small part of the market
oh? who says? you? nvidia's profits due to overpricing? market share? let's see a source for this claim.



If you want games based on raster, get a console
and as i said in the other thread about this: read this post here

and that point in that post still stands. a console, regardless if one only cares about raster performance, is a waste of money depending on their use case. the money would be better spent on a new vidcard, or another upgrade for their comp, which you sure don't seem to understand, at all.
One company dominating the dGPU market can't be good for prices.
but yet, you are praising that one company that IS dominating the market for its RT and DLSS features, which not everyone cares about. go figure.
 

zx128k

Reputable
your idea of "performance was fine" can be different than someone else's. what games were you playing, and at what frame rates?


oh? who says? you? nvidia's profits due to overpricing? market share? let's see a source for this claim.




and as i said in the other thread about this: read this post here

and that point in that post still stands. a console, regardless if one only cares about raster performance, is a waste of money depending on their use case. the money would be better spent on a new vidcard, or another upgrade for their comp, which you sure don't seem to understand, at all.

but yet, you are praising that one company that IS dominating the market for its RT and DLSS features, which not everyone cares about. go figure.
source for this claim
Already posted. The market voted RT/DLSS. AMD's market share halved. AMD no longer has a meaningful share of the dGPU market.

This is what you get when you don't listen to customers and make up arguments to cover poor performance/innovation. No one believes you; they spent their money on an nVidia card.
 

Ogotai

Reputable
Feb 2, 2021
327
221
5,060
Already posted. The market voted RT/DLSS. AMD's market share halved. AMD no longer has a meaningful share of the dGPU market.
sorry, that kinda proves squat. it only proves either nvidia's marketing dept is doing its job, or mind share wins.

and once again, you ignored my view on why the "get a console" comment is BS. which it is, and for some a waste of money.
 

zx128k

Reputable
sorry, that kinda proves squat. it only proves either nvidia's marketing dept is doing its job, or mind share wins.

and once again, you ignored my view on why the "get a console" comment is BS. which it is, and for some a waste of money.
AMD basically has a tiny market share. nVidia doesn't compete with anyone. In capitalism that's a total victory. You don't want a company like AMD to go out of business, because government will step in. nVidia are happy for AMD to pretend they are competing. Meanwhile nVidia can charge what they like. They are the market leader.

AMD won't go out of business because of consoles and CPUs. At the moment they are like 3dfx.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
Was checking Fortnite ray tracing... the 7900 XTX is better than the 4070 Ti at ray tracing, though about equal. AMD cards are not far from Nvidia when this gimmick/useless feature is turned on. The 6950 XT is also about equal with the 4070, but a little bit lower. The 7900 XTX runs about 10 to 15% faster than the 4070 Ti, and the 6950 XT about 5 to 10% lower than the 4070. At such a low difference they are equal; you cannot tell which is which in real gameplay, no one can. When DLSS Quality is turned on, the 7900 XTX keeps the lead. People have to realize that titles that favor Nvidia will do so, as has always been the case, and that has nothing to do with AMD cards being slower. Jedi Survivor came as an example that AMD is better than Nvidia with ray tracing turned on.. because it's a game sponsored by AMD. It's really fraudulent and a lie, though, to call the 4070 better at ray tracing than the 6950 XT when they really match in most games, as TechPowerUp, Guru3D, AnandTech and so on found out. The 6950 XT is 30% faster than the 4070 at raster and about equal in RT and DLSS/FSR; that's the truth, every major review site says so. This article is wrong, biased, BS, shame. shaaaame
 

zx128k

Reputable
God, tomshardware really butthurt the AMD trolls.

He picks Jedi Survivor but doesn't state that the first patch addressed some performance issues and increased performance by up to 75% on all top cards. Talk about cherry picking, and a completely broken game performance wise.


12143235678l.jpg


We can see in a vendor-neutral benchmark of ray tracing that a 7900 XTX is close to an RTX 3080 Ti, which is faster. As you can see, the 4090 is massively ahead of the AMD cards; the 4090 is twice as fast as the closest AMD card, the 7900 XTX.

If I were to cherry pick a game to showcase AMD's poor performance, it would be this one.



For anyone watching: AMD and nVidia both implement DXR (ray tracing) in approximately the same way. It's just that the RX 6000 series runs BVH traversal on the shaders, with the intersection tests handled by the TMU hardware, while nVidia processes the whole BVH traversal on dedicated hardware. This lets nVidia process more rays than AMD, which means more performance. AMD have been a full generation behind in performance from the start. Where AMD cards really die performance wise is path tracing. With path tracing it's all down to hardware performance, and software won't help you; the only way to be faster is to process more rays per frame. This is where an RX 6900 XT will get 2 fps, and it's all down to poor hardware performance. It's why RX 6000 series cards can't perform well in Portal RTX.
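The BVH point is easier to see with a toy model. Below is a minimal, purely illustrative BVH traversal in Python (the Node class, slab test, and traverse function are my own invented sketch, not either vendor's implementation): every ray must run many box-intersection tests while walking the tree, and that inner loop is exactly what dedicated RT hardware accelerates versus running it on general-purpose shaders.

```python
# Toy sketch of BVH (bounding volume hierarchy) traversal, the inner loop
# that RT hardware accelerates. Illustrative only: the point is that each
# ray repeats the slab test many times per frame, so the rays-per-second
# budget depends on how fast these tests and the tree walk can run.

from dataclasses import dataclass, field

@dataclass
class Node:
    lo: tuple                          # min corner of the node's axis-aligned box
    hi: tuple                          # max corner
    children: list = field(default_factory=list)
    triangle: int = None               # leaf payload (triangle id); None for inner nodes

def ray_hits_box(origin, inv_dir, lo, hi):
    """Slab test: does the ray enter this axis-aligned bounding box?"""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, inv_dir, lo, hi):
        t1, t2 = (l - o) * d, (h - o) * d
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, inv_dir, hits):
    """Walk the BVH, collecting leaf triangles whose boxes the ray enters."""
    if not ray_hits_box(origin, inv_dir, node.lo, node.hi):
        return                         # prune this whole subtree
    if node.triangle is not None:
        hits.append(node.triangle)
    for child in node.children:
        traverse(child, origin, inv_dir, hits)

# Tiny scene: a root box with two leaf boxes; a ray along +x hits only leaf 0.
root = Node((0, 0, 0), (4, 4, 4))
root.children = [Node((0, 0, 0), (1, 1, 1), triangle=0),
                 Node((3, 3, 3), (4, 4, 4), triangle=1)]
origin, direction = (-1.0, 0.5, 0.5), (1.0, 0.0, 0.0)
inv_dir = tuple(1.0 / d if d else float("inf") for d in direction)
hits = []
traverse(root, origin, inv_dir, hits)
print(hits)  # [0]
```

In a real renderer this runs for millions of rays per frame, which is why doing the traversal on fixed-function units rather than shaders is worth a generation of performance.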
 
Last edited:

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
God, tomshardware really butthurt the AMD trolls.




12143235678l.jpg


We can see in a vendor-neutral benchmark of ray tracing that a 7900 XTX is close to an RTX 3080 Ti, which is faster. As you can see, the 4090 is massively ahead of the AMD cards; the 4090 is twice as fast as the closest AMD card, the 7900 XTX.

Bro, are you blind? I picked Fortnite RTX, and theeenn mentioned Jedi for being a game that favors AMD. But if you think that's a broken example, how about Far Cry 6? I liked your graph; it actually does show the 6950 XT matches the 3080 (so the 4070) in RT, but it's kind of a stretch to place the 7900 XTX alongside it, since it's a much faster card. Portal was made by Nvidia to sell DLSS; it's not a benchmark game, it's crap. Minecraft, same, to sell DLSS. Cyberpunk, almost the same, heavily $pon$ored. In all other games the 6950 XT matches the 4070 in RT, and I doubt the 7900 XTX doesn't match the 4070 Ti... on this gimmick useless feature.
 

zx128k

Reputable
Bro, are you blind? I picked Fortnite RTX, and theeenn mentioned Jedi for being a game that favors AMD. But if you think that's a broken example, how about Far Cry 6? I liked your graph; it actually does show the 6950 XT matches the 3080 (so the 4070) in RT, but it's kind of a stretch to place the 7900 XTX alongside it, since it's a much faster card. Portal was made by Nvidia to sell DLSS; it's not a benchmark game, it's crap. Minecraft, same, to sell DLSS. Cyberpunk, almost the same, heavily $pon$ored. In all other games the 6950 XT matches the 4070 in RT, and I doubt the 7900 XTX doesn't match the 4070 Ti... on this gimmick useless feature.
Dude just stop.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
Here goes another example.. Hogwarts Legacy, RT, 4K Ultra. The 4070 Ti scores 27 fps native and 44 fps with DLSS Quality, while the 7900 XTX scores 22 fps native and 44 fps with FSR Quality. How about that, champ?
 

zx128k

Reputable
Here goes another example.. Hogwarts Legacy, RT, 4K Ultra. The 4070 Ti scores 27 fps native and 44 fps with DLSS Quality, while the 7900 XTX scores 22 fps native and 44 fps with FSR Quality. How about that, champ?
We all know about that game and its problems with performance. Just stop, dude. You are shilling for AMD. This is why the market share for AMD is so low.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
Here goes another example, champ.. A Plague Tale: Requiem, 4K. The 7900 XTX scores 28 fps with RT on and 60 fps with RT off. Don't even mind AB'ing this game on vs off, fanboy. The 4070 Ti scores 28 fps RT on vs 60 fps RT off at DLSS Quality. There.. the 7900 XTX matches the 4070 Ti in RT, like it or not, and it of course bests it at raster because it's above the 4080.
 

zx128k

Reputable
Here goes another example, champ.. A Plague Tale: Requiem, 4K. The 7900 XTX scores 28 fps with RT on and 60 fps with RT off. Don't even mind AB'ing this game on vs off, fanboy. The 4070 Ti scores 28 fps RT on vs 60 fps RT off at DLSS Quality. There.. the 7900 XTX matches the 4070 Ti in RT, like it or not, and it of course bests it at raster because it's above the 4080.
What has that got to do with this thread? Nothing. Give it a rest. It's not anyone's job to rip apart all your sophistry arguments.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
We all know about that game and its problems with performance. Just stop, dude. You are shilling for AMD. This is why the market share for AMD is so low.

I have bad news for you.. there are no "performance problems" in this game, just a lack of VRAM on Nvidia cards, and RT sucks in ANY GAME.
 

zx128k

Reputable

I have bad news for you.. there are no "performance problems" in this game, just a lack of VRAM on Nvidia cards, and RT sucks in ANY GAME.
Prove there is a VRAM problem on nVidia cards. Use at minimum 40-50 games. Show that the games use stock settings and have no mods. Half the games must have DXR and use that setting at maximum. All games that have DLSS must use it on nVidia cards, on the quality setting only.

If you want to lie, I will force you to waste your time trying to prove that lie.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
What has that got to do with this thread, nothing. Give it a rest.

I can fetch some 10 more games for you. My point is.. the 6950 XT's RT is on par with the 4070's RT, it is. And the 6950 XT's raster is on par with the 4070 Ti's, it is. The whole point of people bitching in the comments is that you can prove what I just said by checking reviews online. I follow JENSN on YouTube, where I got those 7900 XTX vs 4070 Ti numbers, but he does not benchmark the 6950 XT, and this card is hard to find on YouTube to AB. When I say the 6950 XT's RT matches the 4070's RT, it's based on TechPowerUp, Guru3D, LinusTechTips and other reviews I saw on YouTube; there is no denying those numbers. This is not about NVIDIA vs AMD, it's about a BS article stating 4070 > 6950 XT based on FAKE benchmark numbers. If the reviewer did not get paid for this, he got a defective 6950 XT and should redo the whole thing. That's what it is about.
 

zx128k

Reputable
I can fetch some 10 more games for you. My point is.. the 6950 XT's RT is on par with the 4070's RT, it is. And the 6950 XT's raster is on par with the 4070 Ti's, it is. The whole point of people bitching in the comments is that you can prove what I just said by checking reviews online. I follow JENSN on YouTube, where I got those 7900 XTX vs 4070 Ti numbers, but he does not benchmark the 6950 XT, and this card is hard to find on YouTube to AB. When I say the 6950 XT's RT matches the 4070's RT, it's based on TechPowerUp, Guru3D, LinusTechTips and other reviews I saw on YouTube; there is no denying those numbers. This is not about NVIDIA vs AMD, it's about a BS article stating 4070 > 6950 XT based on FAKE benchmark numbers. If the reviewer did not get paid for this, he got a defective 6950 XT and should redo the whole thing. That's what it is about.
No. 40-50 games; that stops you cherry picking. Most gamers have 6-8GB of VRAM; they are most of the market on the Steam hardware survey. You made the statement; you prove it using a correct sample size.

Remember to use the figure that shows the real VRAM being used, and show all the settings the games use.
 

oofdragon

Honorable
Oct 14, 2017
242
237
10,960
Prove there is a VRAM problem on nVidia cards. Use at minimum 40-50 games. Show that the games use stock settings and have no mods. Half the games must have DXR and use that setting at maximum. All games that have DLSS must use it on nVidia cards, on the quality setting only.

If you want to lie, I will force you to waste your time trying to prove that lie.

I said there are VRAM limitations in this game.. it's there even when RT is off, at 1080p. Other VRAM shortcomings are present in The Last of Us, RE4, and other newer games coming out this year. Older games, which are about 99.9% of the games on the planet right now, are OK with 8GB, and so they are without RT and DLSS. The only games where you NEED an Nvidia card to play are Portal RTX, Minecraft RTX and Cyberpunk with RT on.. the first two done by Nvidia themselves and the third heavily sponsored by them. Most games are console ports that run on AMD hardware; 99% of games will run just fine on AMD cards even with RT on, if you really care about that. $600 Nvidia vs $600 AMD gives, today, the same RT performance on both, and there's no denying that, while the 6950 XT is about 30% faster when RT is off, and that's where 99% of people's gaming is. So it's NONSENSE to give a performance win to the 4070; it's insane, it's a total shame.
 

zx128k

Reputable
I said there are VRAM limitations in this game.. it's there even when RT is off, at 1080p. Other VRAM shortcomings are present in The Last of Us, RE4, and other newer games coming out this year. Older games, which are about 99.9% of the games on the planet right now, are OK with 8GB, and so they are without RT and DLSS. The only games where you NEED an Nvidia card to play are Portal RTX, Minecraft RTX and Cyberpunk with RT on.. the first two done by Nvidia themselves and the third heavily sponsored by them. Most games are console ports that run on AMD hardware; 99% of games will run just fine on AMD cards even with RT on, if you really care about that. $600 Nvidia vs $600 AMD gives, today, the same RT performance on both, and there's no denying that, while the 6950 XT is about 30% faster when RT is off, and that's where 99% of people's gaming is. So it's NONSENSE to give a performance win to the 4070; it's insane, it's a total shame.
Please provide the requested evidence.

I did a random Google search; this website doesn't agree there is a VRAM problem. Please account for their conclusion.

How Much VRAM Do I Need For Gaming?

Answer:
In 2023, 4 GB of dedicated VRAM should be the bare minimum to aim for in graphics cards.
However, 8 GB is now the standard for most GPUs and that’s what you should aim for if you want a future-proof graphics card and/or if you intend on getting a 1440p or 4K monitor.

Conclusion

All in all, 4 GB is the bare minimum for gaming in 1080p in 2023, while 6-8 GB should be the goal for most people who want to run games in 1440p or in 4K, or just those who want something more future-proof.

However, these are just generalizations. The GPU’s processing power is much more important when selecting your ideal graphics card.