News: RTX 4060 Ti Beats RX 7600 In Early Generic Benchmarks

Most of the 4060 Ti cards sold will be the 8 GB model.

Is a 4070 with more VRAM coming as well?

There are rumors about it, but nothing has been confirmed. I don't know why the VRAM on all these cards is so lopsided. It should be like this:

4090 - 24GB
4080 - 16GB
4070/4070Ti - 12GB
4060/4060Ti - 8GB
4050/4050Ti - 6GB

The VRAM size should correspond to the size of the core. Everything is lopsided. AMD got it right this gen and last gen. Nvidia was way off last gen, and even before that. It looks a bit better this gen, aside from a 16GB 4060 Ti, which makes no sense. Lol
 
I wouldn't be buying (if I was buying budget) a 7600 or a 4060. I have the 6700 XT and it's a great card at its current price, so why bother with this gen's low-end rubbish?
I've also got an Intel Arc A770, which, with 16GB and the work Intel keeps doing on the drivers, is a super good purchase!

The latest driver update completely killed my HDMI audio, so I had to revert back to the first DXVK driver. I'll give it another try with the next release, I suppose.
 
To be fair, there are videos showing that AMD drivers aren't as bad as people make them out to be, or at the very least that there aren't way more issues with AMD drivers than with Nvidia's.
NOT my personal experience, and AMD having fewer driver releases is not a GOOD thing. It shows how far behind they are on the software side compared to Nvidia. Look how they abandoned (temporarily) the 6000 series and below for a couple of months at the end of last year into this one, when the 7000 series dropped and was the only series getting drivers (and only a handful at that). I have issues with AMD software all the time; it is not their strong suit, and it's not just their GPU drivers. I don't depend on their CPU software either, due to its horrid bugginess.


Is it not spreading false facts about AMD?
No, it's not. Their performance cannot compete on the high end (i.e. 4090 level, fact). FSR IS inferior, with plenty of testing proving as much. Their RT implementation is still behind Nvidia's (I applaud the big gains with the 7000 series, but I do use RT, so it matters to me). AMD drivers take years to reach the full potential of their GPUs (i.e. "fine wine", or as I call it, releasing weak drivers and fixing them to where they should have been at launch, just two to three years later). But more importantly, statistically speaking, AMD drivers have big issues more often than Nvidia's, and when both have smaller issues, AMD's are less likely to have a user workaround than Nvidia's. So I think the drivers are worse than you realize, or you're OK because you're one of the lucky few with no issues.

View: https://www.youtube.com/watch?v=O5B_dqi_Syc


View: https://www.youtube.com/watch?v=4YAZn7Og4yo

100% they need to start matching the higher-end Nvidia GPUs in both raw 90-series and RT performance, and I think next gen, after the success of the 4090, AMD will and should take a huge swing at the 5090!

I honestly think they can scare Nvidia with a bunch of RT cores scattered around the MCM design, 3D V-Cache, and GDDR7.
If it can compete with the 5090, a $1,700 price tag won't look out of place, as I don't see Nvidia's next flagship coming any cheaper than $1,600-$1,700!
This I am 100% behind. I don't want to be stuck on Nvidia. My gripes aren't some simping for Nvidia or trolling of AMD. I used to buy AMD GPUs regularly but was burned too many times by bad drivers and performance not matching up on a rig I built for a nephew (the 5000 series card was boosting badly or barely at all). So you say the 6000/7000 series is better. Great; not all the facts back you up on this, but I am happy to see them doing better. I would love to go AMD for a GPU if they can get these issues under wraps, but until then, or until Nvidia blows it on both the software and hardware side, no thank you.
 
When only two of the Nvidia cards offer a significant boost over last gen, why buy any of them?


The 40 series just sold features; its performance was more or less the same as last gen's. The message is: if you want the best, hand over the money for a 4090. And no lowering prices; just reduce the number of cards made so you don't have to sell them for less.

People expect a certain amount of improvement each generation, but the companies don't want to give it to them anymore. There has to come a point where card speeds plateau and you start being sold little features that don't really make a big difference.

AMD's message is less clear since they have only released two cards so far. Lots of speculation. All I know is the 7800/7700 won't beat the 7900; the rest is uncertain.
 
FSR2 vs DLSS2 has always come down to game implementation and preference. Quality-wise they're more or less the same, with the same overall artifacting problems. Tim even says as much in the video: "it'll come down to your preference". AMD offers a sharper image and nVidia is a blurry mess at times, but hey, AMD is worse.

On the driver front: GeForce Experience sucks; there's no way anyone can argue in favour of it without puking. AMD's core driver package is feature-rich compared to anything nVidia has put out in the last 20 years (!) and has been for a good while. nVidia doesn't even have the decency of providing a core GPU monitoring app as part of the drivers, and people have to risk downloading 3rd-party apps all the time. As for AMD's problems with the drivers... they always get confirmation bias, while nVidia's are just "ah, minor bug, doesn't matter" and get swept under the rug all the time. People conveniently forget nVidia has burned GPUs, and even the 3080s had power issues at the beginning of their life cycle because the drivers were boosting them too much; never mind monitor compatibility as well. Sure, AMD has somewhat annoying problems, but they're nowhere near as catastrophic as nVidia's have been historically. Black screens and crashes are annoying and I won't make any apologies for them, but come on. You give the benefit of the doubt to the company that has burned GPUs instead of the one that hasn't? From memory, the only glaring problem AMD has had in recent years was with the RX 480 and the PCIe power going above 75W.

Also, why does AMD need to release a driver every month? Is there a practical reason for that arbitrary cadence? I don't need audio drivers every month. I don't need network driver updates every month. I still don't understand why people went monkey nuts when they took a few extra cycles to release updates for RDNA2 and prior while they were resolving the RDNA3 release issues. Like... why...?

Anyway, I always look with disdain at these stupid comparisons that make blanket "AMD bad, nVidia good" statements. Same the other way around, mind you.

Yes, the 4090 is the GPU top dog, but that's about it. All the other software enhancements from nVidia are locked-down things that hurt consumers in the long run, but people seem to like getting tied down to a single vendor for some reason...

Regards.
 
Yes, the 4090 is the GPU top dog, but that's about it. All the other software enhancements from nVidia are locked-down things that hurt consumers in the long run, but people seem to like getting tied down to a single vendor for some reason...
What do you mean by that?
 
What do you mean by that?
DLSS as a piece of technology is only usable on nVidia hardware, like everything nVidia does.

People buying into proprietary software technologies is bad in the long run, but it has always been hard to sway popular opinion/mentality about it.

They box themselves into a solution which makes them depend on it, and when they want out, they can't. I know this is over-dramatizing it, but since XeSS is not really catching on, that leaves AMD's FSR as the only competition pushing DLSS further. Then you have G-Sync, PhysX, CUDA and many other nVidia-exclusive things which have fragmented the market and created a hard dependency on nVidia hardware. Then they complain about nVidia raising prices unilaterally while still supporting their practices... Facepalm.

Regards.
 
DLSS as a piece of technology is only usable on nVidia hardware, like everything nVidia does.

People buying into proprietary software technologies is bad in the long run, but it has always been hard to sway popular opinion/mentality about it.
I believe that DLSS 3 is based on some hardware AI engines (and possibly other stuff); it is not just software. I believe it is a pretty complex thing and also very expensive to develop. I see no reason why nVidia should not reap the benefits of what they developed.

BTW I have a 4070 card and I like this technology a lot, including frame generation.
 
I believe that DLSS 3 is based on some hardware AI engines (and possibly other stuff); it is not just software. I believe it is a pretty complex thing and also very expensive to develop. I see no reason why nVidia should not reap the benefits of what they developed.

BTW I have a 4070 card and I like this technology a lot, including frame generation.
It is always a conflicting principle based on people's perception of what "good" is; an intrinsic economic question which, without going down that tangent, is always a cause of divide.

Without going on that tangent, imagine if AMD didn't have access to x86, or Intel didn't have access to x86-64. Or if ARM stopped licensing the ARM ISA and decided to do their own thing. Or if Microsoft decided to start making GPUs in-house and closed off DirectX since they now have Xbox.

Having "open" things sometimes is for the better end (or benefit) of the consumer and there's a very fine divide between what is actually good, desirable and flat out bad. And by "open" I don't mean "open source" as people often confuse the two. Java is a great example of this. Java as a programming language is not "open source", but "open for use". Anyone can develop software using it without paying Oracle a cent. If nVidia allowed AMD and Intel use DLSS, I don't think anyone would be upset about it. Same with CUDA and other techs they've built, but keep under a lock for anyone else. In fact, nVidia has gone as far as removing support for AMD and Intel via their driver packages on certain elements of CUDA, PhysX and something else I forgot about. AMD's and Intel's hardware are capable of running DLSS, no matter what nVidia's marketing want you to think. Keep that in mind.

Regards.
 
Also, why does AMD need to release a driver every month? Is there a practical reason for that arbitrary cadence? I don't need audio drivers every month. I don't need network driver updates every month. I still don't understand why people went monkey nuts when they took a few extra cycles to release updates for RDNA2 and prior while they were resolving the RDNA3 release issues. Like... why...?
In reality they release about three a month, but mostly to fix little things in games. It's Nvidia that only releases one. That might also be because they recently released new cards.

AMD drivers are fine provided you don't have an old card; otherwise you get the legacy, one-size-fits-all driver. TBH I can't say Nvidia's approach is much better, as you can't tell me the drivers written for a 40 series card really help anyone with a 10 series card. Eventually you're best off just settling on a driver that works... that applies to most hardware, actually.
 
The 6000 series matched, beat, or came seriously close to every Nvidia card last gen.
Unless you want to count RT performance, in which case, yeah, okay, but be honest: the 4090 is the first card since the 20 series where RT has been viable without upscaling or DLSS.
And only now is RT adoption at its all-time high since its release!
The 6900 XT was slightly behind the 3090; the 6950 XT beat it!
Want me to start on the three 3080s, which the 16GB 6800 XT is still crushing?
The 12GB and 16GB AMD cards of last gen are still viable options compared to some of Nvidia's rubbish-VRAM cards this gen!

FSR is not better than DLSS, but let's also be honest here: is DLSS open source? No. So FSR is breathing life into old Nvidia cards like the 10 series, which a lot of people still use!

Not sure you can use the 5000 series as an excuse. 100%, the early 5000-series drivers were terrible, or so I've read, but the 5700 XT was a cheaper 2070 competitor; everyone knows that!
It was not in the ballpark of the 2080, let alone the Super or the Ti.

I take my hat off to the 4090, 100% an excellent card!
But everything else from Nvidia is overpriced junk, and if it wasn't for the goodwill of die-hard fanboys, who usually scare people with AMD driver stories and RT features that at best 20% of gamers use, Nvidia would be losing market share at an alarming rate!

I understand your distaste for AMD drivers, and if your experience has been BAD, well, it is what it is, but to generalize that all AMD drivers are bad is simply not true!
Because I can 100% assure you, if AMD drivers were as bad for me as you say, I would have thrown my cards out the window and bought Nvidia.
It's like my distaste for Gigabyte mobos: I've had two issues (only small ones), the network adapter on my Gigabyte B550 ITX, and the front M.2 on the Gigabyte Z590I Vision D not working with a 10th-gen CPU (it works with 11th gen). Two very small issues, but I will now avoid buying Gigabyte boards!
 
AMD's and Intel's hardware is capable of running DLSS, no matter what nVidia's marketing wants you to think. Keep that in mind.
You would need very deep knowledge of how this technology works and of what each card from these different manufacturers can run (and how well it is optimised to run it) to be able to say this.

I am afraid that it is highly unlikely that you have this knowledge.


Having "open" things sometimes is for the better end (or benefit) of the consumer and there's a very fine divide between what is actually good, desirable and flat out bad.
One could argue that what seems not to benefit consumers much in the short run actually benefits them in the long run, because it protects the existence of the companies and competition in the market.

As a consumer, I see no problem at all with two selfish companies that share little with each other competing with each other, trying to outdo the opponent and win a consumer's purchase by bringing them something better than the opponent does.
 
You would need very deep knowledge of how this technology works and of what each card from these different manufacturers can run (and how well it is optimised to run it) to be able to say this.

I am afraid that it is highly unlikely that you have this knowledge.
100% this, and even if it's true, less than 1% would know how to implement it on an AMD GPU.
 
One could argue that what seems not to benefit consumers much in the short run actually benefits them in the long run, because it protects the existence of the companies and competition in the market.

As a consumer, I see no problem at all with two selfish companies that share little with each other competing with each other, trying to outdo the opponent and win a consumer's purchase by bringing them something better than the opponent does.
Look at what ARM openly licensing their ISA (again, not "open sourcing" it) did for the smartphone market. That is also why nVidia was blocked from acquiring ARM: everyone knew the writing on the wall was "kick everyone out of the ARM ecosystem".

Do you honestly believe having only AMD and Intel with access to the x86 license is good, instead of having Qualcomm, Apple (yes, Apple), MediaTek, VIA or even Texas Instruments able to tap into it? And that's just the handful of companies that came to mind; there are plenty of other semiconductor companies that could compete in the market like in the past. VIA used to make x86 CPUs as well, but had to bail out, and I can't remember the exact reason why.

History has proven time and time again that a willingness to license tech to others benefits the market and consumers, but it won't necessarily benefit the shareholders/owners of a company. Paying royalties for things is not alien and very well known, so I just don't see why nVidia can't let AMD use CUDA or DLSS. Same with Intel. Each has to re-invent the wheel in order to compete instead of agreeing to improve around the same tech. nVidia could very well hold the patents and license them to both AMD and Intel on agreeable terms, and everyone would benefit, including nVidia, but they just want to have the cake and eat it whole because they can, and people say it's fine when in the long run it's detrimental to them. Again, big facepalm.

Regards.
 
.... Paying royalties for things is not alien and very well known, so I just don't see why nVidia can't let AMD use CUDA or DLSS. Same with Intel. Each has to re-invent the wheel in order to compete instead of agreeing to improve around the same tech.
Progress is based on inventing new things.
Using other people's stuff is the lazy approach that hinders progress.

Re-inventing the wheel? The best thing ever!!!
 
Progress is based on inventing new things.
Using other people's stuff is the lazy approach that hinders progress.

Re-inventing the wheel? The best thing ever!!!
There's a thing called "improvement". Incremental changes, or even drastic ones within the same concept, are also re-inventing the wheel, just in a different light.

Imagine if USB or ATX didn't exist for the industry. How many adapters would you have to carry around? There used to be plenty of specs out there that have since been deprecated, thanks to a lot of people feeding back into a single entity that takes that input and builds something new or improves on it.

I'll stop here; I think I made my point well a few comments back, and it seems like from here on out it's just nitpicking.

Regards.
 
You always have this constant barrage of people telling others to "buy AMD", "now is the time to buy AMD".

Then you step away from the fanboys for a minute and look at everything yourself.

-AMD performance that is often worse
-very poor AMD raytracing performance
-FSR that looks worse than DLSS
-no NVENC
-no CUDA
-AMD driver issues
-higher power consumption on AMD cards

Sure I'll buy an AMD GPU, if you give me a 40% discount compared to Nvidia, to compensate for all these issues.

But AMD wants to sell GPUs by diving just 2% or 3% under Nvidia's price. No thanks.
I own both an RX 6800 in one machine and an RTX 3080 10GB in another, so I'm saying all this from long-term personal experience.

- AMD's GPU drivers haven't been a major problem since literally the RX 5000/RDNA 1 days.

- The quality gap between FSR 2 and DLSS 2 when both are well implemented, while very much a real thing, isn't major and is shrinking all the time (FSR 2's new versions have been more frequent and impactful upgrades than DLSS 2's ever since Nvidia switched its dev focus to DLSS 3).

- RDNA 2 already SIGNIFICANTLY reduced the encoding quality gap between VCE and Nvidia's NVENC, and RDNA 3 appears to have damn near closed said gap entirely.

- And higher power consumption??? What in the freaking hell are you smoking? 😵 RDNA 2 pulled less power than Ampere basically ACROSS THE BOARD, while RDNA 3 pulls slightly more power than Ada Lovelace for slightly better performance while using a slightly older node (a la RTX 4080 vs RX 7900 XTX, and 4nm Ada Lovelace vs 5nm RDNA 3 with 6nm MCDs).

Now lacking CUDA support otoh IS a very legitimate complaint, but one that doesn't apply to the VAST MAJORITY of PC gamers. 🤷

And are we just gonna ignore AMD giving you an ACTUALLY SUITABLE AMOUNT OF VRAM??? The 12GB, $700+ before tax RTX 4070 Ti is an absolute freaking joke! Only an idiot buys one over the notably faster and VASTLY more future-proof 20GB RX 7900 XT for like $50 more.

Now as far as the actual article topic is concerned, the RX 7600 is literally going to be $100-$200 CHEAPER than the RTX 4060 Ti ($299 vs $399-$499). So of freaking COURSE the significantly more expensive card is gonna be faster! 🤦
 
The 6000 series matched, beat, or came seriously close to every Nvidia card last gen.
Unless you want to count RT performance, in which case, yeah, okay, but be honest: the 4090 is the first card since the 20 series where RT has been viable without upscaling or DLSS.
And only now is RT adoption at its all-time high since its release!
The 6900 XT was slightly behind the 3090; the 6950 XT beat it!
Want me to start on the three 3080s, which the 16GB 6800 XT is still crushing?
The 12GB and 16GB AMD cards of last gen are still viable options compared to some of Nvidia's rubbish-VRAM cards this gen!
While I applaud AMD's copious VRAM, as it is needed (listening, Nvidia?), let's get real here on performance. The 6000 series was the first time in generations that AMD caught up to Nvidia on the high end, only to lose that race again the next generation. The last time before that AMD was 'competitive' on the high end was three generations ago with the R9 390 vs the GTX 980 (and that's being kind, as it was well behind the GTX 980 Ti). Vega couldn't keep up with the 1080 Ti but did a better-ish job, Polaris was mid-range only, and the 5700 series didn't come close to Nvidia's high-end RTX 2080 Ti. So no, AMD was close last gen, but with their track record over the last handful of generations, that is the exception, not the rule. The RX 7900 XTX proved as much.

View: https://www.youtube.com/watch?v=nE_zW5SKPic

FSR is not better than DLSS, but let's also be honest here: is DLSS open source? No. So FSR is breathing life into old Nvidia cards like the 10 series, which a lot of people still use!
While I agree FSR does breathe new life into old cards, the picture quality is horrid compared to DLSS, IME. So while this is a point for FSR, it's also a point for DLSS because of its now-superior picture quality. DLSS 1 was another story, and in that case I would agree with you, but it's no longer the standard for Nvidia.

I take my hat off to the 4090, 100% an excellent card!
But everything else from Nvidia is overpriced junk, and if it wasn't for the goodwill of die-hard fanboys, who usually scare people with AMD driver stories and RT features that at best 20% of gamers use, Nvidia would be losing market share at an alarming rate!
Here we agree. The 4090 is a good card, overpriced but good. And the rest of the RTX 4000 stack, minus the 4060... is all way overpriced. So we very much agree here. Nvidia needs better pricing and more VRAM, no argument.

I understand your distaste for AMD drivers, and if your experience has been BAD, well, it is what it is, but to generalize that all AMD drivers are bad is simply not true!
Because I can 100% assure you, if AMD drivers were as bad for me as you say, I would have thrown my cards out the window and bought Nvidia.
Here is the tricky part. I have friends who have great experiences with AMD drivers and others who have had nothing but problems. So this one is tricky in that, from what I can tell from first/second-hand experience and reading up on issues, system configuration can make a huge difference to stability on AMD cards, mostly down to AMD not testing their GPUs under enough hardware configurations. BUT this gets into the land of hard-to-prove... so I can see where you're coming from, BUT I also know my personal experiences, those of my gaming clan, etc. Regardless, I am glad your experience has been positive, and it gives me hope. I just don't have FAITH yet, for all the reasons I have listed in this and prior posts.
FSR2 vs DLSS2 has always come down to game implementation and preference. Quality-wise they're more or less the same, with the same overall artifacting problems. Tim even says as much in the video: "it'll come down to your preference". AMD offers a sharper image and nVidia is a blurry mess at times, but hey, AMD is worse.
Not totally unfair, but my personal preference is DLSS. I have seen both FSR 1/2 and DLSS 1/2/3 (DLSS 1 was horridly blurry, you're not wrong)... and generally speaking DLSS is a better experience for me. But I admit I am EXTREMELY nitpicky about my image quality; I am very sensitive to clarity and aliasing in particular. FSR is getting better, but it still feels at least one or two revisions away from catching up to DLSS for me personally. But it is great kit to have in a game all the same, and I like that it is open source. So I am not railing on AMD for fun, but for features and image quality when upscaling, here at least.
GeForce Experience sucks; there's no way anyone can argue in favour of it without puking. AMD's core driver package is feature-rich compared to anything nVidia has put out in the last 20 years (!) and has been for a good while. nVidia doesn't even have the decency of providing a core GPU monitoring app as part of the drivers, and people have to risk downloading 3rd-party apps all the time. As for AMD's problems with the drivers... they always get confirmation bias, while nVidia's are just "ah, minor bug, doesn't matter" and get swept under the rug all the time. People conveniently forget nVidia has burned GPUs, and even the 3080s had power issues at the beginning of their life cycle because the drivers were boosting them too much; never mind monitor compatibility as well. Sure, AMD has somewhat annoying problems, but they're nowhere near as catastrophic as nVidia's have been historically. Black screens and crashes are annoying and I won't make any apologies for them, but come on. You give the benefit of the doubt to the company that has burned GPUs instead of the one that hasn't? From memory, the only glaring problem AMD has had in recent years was with the RX 480 and the PCIe power going above 75W.
IDK where to start here, but I'll do my best. First off, where we agree: yes, AMD has some features Nvidia should have, like GPU monitoring; the 3080 debacle was real; and you fairly called out the RX 480's out-of-spec PCIe power draw as well. Always fair, Fran, I truly appreciate that. It makes debating you so much harder than anyone else on Tom's, save a few mods; harder because, like you, I also try to be as fair as I can, which means owning the points I missed, didn't know, or can't fairly dispute.

I am not happy with Nvidia's market position and do wish AMD were doing better than they are! Back on point, though: black screens, crashes, etc. are annoying on AMD hardware but also very hit or miss; as I mentioned above, they frequently seem related to system config. I have not had any of the Nvidia 'problem' driver issues personally, so that may have rose-tinted my glasses, but out of personal experience rather than bias. At the end of the day I can only go by my experiences and what I read/hear. So far my experience with AMD GPUs has been subpar. Right, wrong, or otherwise... this is what I have experienced.

GeForce Experience is OK. But OK for the top GPU dog is far from acceptable. So again, credit given where due.

Wait, didn't I disagree on something? Oh yes, I do. My big issue with AMD drivers is the now-old adage 'ages like fine wine'. It ages that way because AMD does not extract the full performance of their cards for years, unlike Nvidia, which seems to get near 100% of the performance within months of launch. Personally I prefer the latter, but I do have friends who prefer AMD 'fine wine'. Their argument is that they were happy with performance at launch, and extra FPS down the road is just a bonus. I don't agree, but I do get the mindset. I want my full performance up front, but if you're happy waiting, more power to you.
Also, why does AMD need to release a driver every month?
Not every month, no, but it shouldn't be more than eight weeks; with how often games release, anything longer is too long. And/or do game-day drivers like Nvidia. I'd say longer waits would be fine because 'if it ain't broke, why fix it'... but there is ALWAYS still a bug in something, so it ALWAYS needs fixing, on both sides. Which brings me back to my point: AMD not releasing drivers as regularly as Nvidia is not a net positive for AMD (be it for mind share or for benefits, real or imaginary).
Anyway, I always look with disdain at these stupid comparisons that make blanket "AMD bad, nVidia good" statements. Same the other way around, mind you.
I agree 100%. I run an AMD CPU; I just don't have enough trust ATM to run an AMD GPU, for the reasons I have mentioned. But I agree that fanboying for a side blindly is pointless and counterproductive, and that's not what I am trying to do here. I WANT AMD to do better; they just aren't, IMHO. Is it as bad as I see it? It likely depends... I am only pointing out the problems I had and see others still have. Which brings me to your last statement, which I back up with another 100%:

Yes, the 4090 is the GPU top dog, but that's about it. All the other software enhancements from nVidia are locked-down things that hurt consumers in the long run, but people seem to like getting tied down to a single vendor for some reason...

Getting locked into one vendor is not a good thing, I agree. I WANT to move on from Nvidia in the same way I moved on from Intel this build. I am only looking for a GPU I can trust and that can compete on the high end (I'm typically an 80-class guy but went 4090 due to the insanity that was this gen), be it from AMD, Intel or a new player. I PRAY for some real competition again; I miss being torn between GPU vendors. But sometime between the middle and the end of the GCN era, AMD became lackluster. I was happy to see the 6000 series do so well, and I was hopeful at first for the RX 7900 XTX, but it fell short because AMD insisted on matching Nvidia's insane pricing. Had it been a $799 MSRP card, I might have taken a chance and given their drivers yet another shot. OR if AMD had a card that competed with the 4090 directly, with RT that exceeded the 3090 Ti even if behind the 4090, for a cheaper price... or an equivalent price if raster/RT matched... again, I may have made that move.

AMD doesn't have the mind share Nvidia does, for a host of reasons, some fair, some not. But when you lack mind share you need to compensate on price, and AMD did not do this well enough this gen (shame on Nvidia for setting the pricing so high and shame on AMD for playing along). If you don't, you end up with users like me who reluctantly went Nvidia because they felt they had little other choice. We need a strong AMD in the GPU space. They are making strides even if they misstepped this gen on performance and price. I do hope the RX 8000 series is insanely awesome, with a sweet FSR 3 (or 4) that shames DLSS and the ability to beat Nvidia in raster and RT... Give me a stable driver base too and I am sold, for no other reason than that I am TIRED of giving Nvidia my cash. They sure seem eager to ask for more and more per mm², and it's getting disgusting, if I am being frank.
 
Wait, didn't I disagree on something? Oh yes, I do. My big issue with AMD drivers is the now-old adage 'ages like fine wine'. It ages that way because AMD does not extract the full performance of their cards for years, unlike Nvidia, which seems to get near 100% of the performance within months of launch. Personally I prefer the latter, but I do have friends who prefer AMD 'fine wine'. Their argument is that they were happy with performance at launch, and extra FPS down the road is just a bonus. I don't agree, but I do get the mindset. I want my full performance up front, but if you're happy waiting, more power to you.
As we're not disagreeing on fundamentals in the other points, allow me to address this one: the "fine wine" effect.

This is a two-fold issue AMD has had for a very long time, thanks both to its own failure to stand up to nVidia and, well, nVidia being nVidia.

Remember "The Way It's Meant To Be Played"? Or, for short, how nVidia sponsors a big chunk of games most of the time? The one usually playing catch up on the performance front is AMD because most games ship with pre-compiled shaders for nVidia and most do not re-compile them for AMD and then AMD has to include those in the driver and override them later on. Also similar-ish for specific ways of implementing code for those shaders. This is one of the biggest contributors of "fine wine" on their part. Look at AMD sponsored titles and how AMD just pulls ahead, with nVidia almost never catching up: Borderlands 3, Assassin's Creed and a few others. If you look at some of those, it took nVidia almost as much time, if not even more, to catch up to AMD on some of those games. Look at CoD's Modern Warfare thing used in the charts of many comparisons. It appears that nVidia is still not able to catch up or just gave up.

The other, simpler side is just bug fixes. AMD does take a while to come out with bug fixes, and nVidia does have a fair advantage there, as they always seem to have a shorter backlog of stuff to fix, which translates into a shorter turnaround. Be it because they engage with so many developers before launch or because 90% of the big developers actually use nVidia hardware to test and develop their games (the console side being the only exception, I guess), whatever the cause, it is a reality and I've accepted it as such.

Regards.
 
I'm very aware of "The Way It's Meant To Be Played"... which should be illegal, but that's another story, be it a closely related one. Sorry, I wasn't trying to skim over things to make Nvidia look good; my post was already very long, was all, lol. They're a problem, no question. I think we agree on a lot, actually, with CoD (and other titles) making a point I agree with and knew about. But with AMD in so many consoles, they should have some leg up in game optimization and shaders as well. Point being, AMD should still be doing better than it is on PC. But potato, potahto. I see where you're coming from, even if we don't line up 100%.
 