The sad part is that AMD cards are more efficient and run cooler, but until they fix their buggy drivers, no one will care.
Nobody who has a clue has problems with Radeon drivers. I've used exclusively Radeon since 2008. Do you think I would've done that if I'd encountered the driver problems that so many noobs cry about? Absolutely not.
Case in point: I once purchased an ATI 4890 as an upgrade for my aging 7800 GTX. The software never installed properly because the installer GUI kept crashing; I had to install the drivers via the command line. Oh, and the card outright died after 5 months of moderate use.
There's a flip side to that, and it's the reason I never overreact to getting a defective product.
Case in point: I once purchased an XFX Radeon HD 4870 1GB to replace my Palit 8500 GT 1GB. I was so enamoured with it that I bought a second one for Crossfire. I loved that setup so much that I replaced it with twin Gigabyte Radeon HD 7970s, also in Crossfire. I replaced those with a Sapphire R9 Fury (and added a second one just because I got it for like $200 CAD). I replaced those with an XFX 5700 XT Triple-Dissipation that turned out to be defective. XFX replaced it with another defective card, much to my frustration and disappointment. Then XFX made good by upgrading me to a THICC-III model, which worked perfectly from then on. I then replaced that with an AMD RX 6800 XT OG reference card. In all that time, I never had any major driver issues, nor was I ever dissatisfied with the performance my Radeons gave me.
I have become so comfortable owning Radeons that I didn't bat an eyelid when I saw a great price on what is now my ASRock RX 7900 XTX Phantom Gaming OC. I just bought it, and the good times haven't stopped. I still have no real issues with the drivers, and I'm happy as a clam.
Noobs always seem to be the ones who have the worst time switching from one brand to the other. Fortunately for me, my noob years were between 1988 and 1992 (although my first card, in 1988, was an ATi EGA Wonder).
So yeah, that experience moving away from NVIDIA, as well as the continued reports of software problems (not to mention the lack of access to DLSS and the like), are the reasons why I don't even consider an AMD GPU. Heck, it took until just last year for me to even consider one of their CPUs (due to Intel clearly being behind nowadays).
Your narrow-minded view of things helps no one and only hurts you. You've already paid way too much over the years, which is bad enough, but what's even worse is that your mindset is the one that has nVidia in its near-monopoly position.
It's quite clear that you never grew out of your noob phase because only noobs buy by brand. Real experts buy by specification and price. That's just how it is.
Those free 3090s and 4090s given to all influencers have provided some sweet results.
Of course. When the audience sees that's what the influencers are using... well, it's "monkey-see, monkey-do", eh?
Only a few AMD architectures have been more efficient than Nvidia's competing architectures in recent history, and temperatures are more about firmware and fan-speed curves than anything else. Basically, you should only look at performance and power use, with temperature being a property of the specific card(s) you're looking at rather than of the architecture as a whole.
The RX 6000-series tended to use slightly less power than the RTX 30-series, but even RDNA 2 vs. Ampere wasn't always a win, and the same unevenness shows up within RDNA 3: Navi 33 GPUs, for example, tended to use proportionally more power than the higher-tier Navi 32 and Navi 31 cards.
But at present? Ada currently blows AMD's efficiency away in terms of FPS/W: RTX 40-series GPUs deliver roughly 50% higher performance per watt than RDNA 3. GPU chiplets certainly didn't help AMD's efficiency case either; my guess is they contribute at least 10-20 watts to the power draw. It would have been interesting to see what RDNA 3 as a monolithic chip on TSMC N5 could have done, but that was not the goal.
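To make the FPS/W comparison concrete, here's a minimal sketch of the arithmetic. The frame rates and board powers below are made-up placeholders chosen only to illustrate a roughly 50% efficiency gap, not measured benchmark figures:

```python
# Hypothetical numbers for illustration only -- substitute real
# benchmark FPS and measured board power for an actual comparison.
cards = {
    "RTX 40-series card (Ada)": {"fps": 120.0, "watts": 250.0},
    "RDNA 3 card":              {"fps": 115.0, "watts": 355.0},
}

for name, data in cards.items():
    fps_per_watt = data["fps"] / data["watts"]
    print(f"{name}: {fps_per_watt:.3f} FPS/W")

# With these placeholder numbers, Ada lands at ~0.480 FPS/W vs.
# ~0.324 FPS/W for RDNA 3 -- about a 48% advantage, in line with
# the "roughly 50% higher performance per watt" claim above.
```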
And yet, some "techxperts" are still recommending Intel CPUs even though the power-use gulf between Intel and AMD is even bigger than the one between Radeon and GeForce.
I bought some Radeon 6000-series graphics cards about 1.5 years ago, when the prices of these things were affordable. Let's face it: AMD and Nvidia are making money hand over fist selling AI accelerators and have little incentive to improve their graphics card lineups.
Yeah, but if you just buy nVidia, one day there will be only nVidia left, and you'll really get raked over Jensen's coals, and it will be far too late to do anything about it. It's kinda like the story of the grasshopper and the ant. The ant sees the big picture and the grasshopper is so self-centred that he can't see past his own nose. Red ants, green grasshoppers, there's a moral in there somewhere.
Can't comment on that, but I wasn't referring to the 4090 specifically here, just to Nvidia's "Ada Lovelace" architecture in general. I'm happy with my RTX 4060 card, though.
Never faced any serious issues with it so far. Just because some of the flagship GPUs are plagued by issues doesn't mean the entire 40-series lineup is in the same boat, IMO.
No, but since people look at the 4090's performance and assume that the entire 40-series lineup is in the same boat, it's fair. Stupid is as stupid does, and you can't fix stupid.
AMD sells no cards because they aligned with Nvidia's pricing. Nobody (except Linux users) is gonna prefer to get an AMD for the same price per perf as an NV…
^^^^ Facts Right There! ^^^^
They should sell them at literally half price; then they would become very attractive. Not the best drivers, not the best software ecosystem, but better hardware for a good price, and that would trigger a switch…
It sounds good on paper but it wouldn't work because nVidia would just lower their prices to the point that AMD's pricing would make no difference. Remember, most people buy nVidia out of fear and ignorance which means that they wouldn't be swayed by a cheaper Radeon and AMD would just be kneecapping itself because it would have less money for R&D and wouldn't be able to keep up with nVidia.
But it seems that they prefer to sell nothing with high margins rather than a lot with lower margins…
It seems that way, but their margins would be far worse in your scenario because, despite everything, there are still a lot of people who default to Radeon. I should know, because I'm one of them: