News: Nvidia's grasp of the desktop GPU market balloons to 88% — AMD has just 12%, Intel negligible, says JPR

The sad part is that AMD cards are more efficient and run cooler, but until they fix their buggy software drivers no one will care.
Case in point: I once purchased an ATI Radeon HD 4890 as an upgrade for my aging 7800 GTX. The software never installed correctly because the installer GUI kept crashing; I had to install the drivers via the command line. Oh, and the card outright died after five months of moderate use.

So yeah, that experience of moving away from NVIDIA, along with the continued reports of software problems (not to mention no access to DLSS and the like), is why I don't even consider an AMD GPU. Heck, it took until just last year for me to even consider one of their CPUs (now that Intel is clearly behind).
 
The sad part is that AMD cards are more efficient and run cooler, but until they fix their buggy software drivers no one will care.
Only a few AMD architectures have been more efficient than Nvidia's competing architectures in recent history, and temperatures are more about firmware and fan-speed curves than anything else. Basically, you should only look at performance and power use; temperature is a property of the specific card(s) you're considering rather than of the architecture as a whole.

The RX 6000-series tended to use slightly less power than the RTX 30-series, but even RDNA 2 vs. Ampere wasn't always a win. And within RDNA 3, the Navi 33 GPUs tended to use proportionally more power than the higher-tier Navi 32 and Navi 31 cards.

But at present? Ada blows AMD's efficiency away in terms of FPS/W. RTX 40-series GPUs deliver roughly 50% higher performance per watt than RDNA 3. GPU chiplets certainly didn't help AMD's efficiency case; my guess is they add at least 10–20 watts to power draw. It would have been interesting to see what RDNA 3 as a monolithic chip on TSMC N5 could have done, but that was not the goal.
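For anyone who wants to sanity-check a perf/W claim like that against review data, the math is trivial. Here's a minimal sketch; only the ~50% figure above is the actual claim, while the FPS and board-power numbers below are made-up placeholders:

```python
# Performance per watt from benchmark data. All numbers here are
# hypothetical placeholders, not measured results.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

ada   = perf_per_watt(avg_fps=120.0, board_power_w=300.0)  # 0.40 FPS/W
rdna3 = perf_per_watt(avg_fps=115.0, board_power_w=420.0)  # ~0.27 FPS/W

print(f"Ada advantage: {ada / rdna3 - 1:.0%}")  # ~46% with these sample numbers
```

Swap in averaged FPS and measured board power from actual reviews and the same ratio falls out directly.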
 

Deleted member 2731765

Guest
I find the latest Nvidia Ada GPUs run more efficiently and cooler than my previous AMD cards, like the RX 480. Currently rocking the RTX 4060.

And that's a fact, since the "Ada Lovelace" arch is FAR more efficient than AMD's RDNA 3.

Regarding driver support?

To be honest, AMD used to have issues with its GPU drivers, particularly at launch, which hampered performance. However, AMD's software has improved significantly in the last few years, bolstered by frequent driver updates, resulting in much better performance overall.

While both AMD and Nvidia update their drivers regularly, the higher demand for Nvidia's GPUs, not to mention Nvidia's deeper pockets, means developers tend to favor Nvidia and offer much better support for those with an Nvidia graphics card.

That doesn't mean AMD cards and their driver support are bad. Nope. At least in my opinion, AMD has improved its GPU drivers. I never had any issue with my RX 480 card. But maybe that's just me.

I don't favor either camp, though. I'm just stating the facts based on my own observations. Even Intel is in a much better position now. Arc GPUs first arrived in 2022 and had plenty of early growing pains.

The latest drivers have come a long way, and the upcoming Battlemage "Xe2" GPUs should also benefit from everything that came before, with improvements in the core architecture to remove bottlenecks and limitations present in the first generation of Xe graphics.

It's impressive to see how much work Intel's driver team has put in over the past year or so. The current Alchemist A-series Arc discrete GPUs have seen a pretty decent performance uplift in most of the latest AAA/AA games, if not all.

We just need more mainstream GPUs in a price bracket that a lot of gamers can afford, not just hugely expensive halo products (think RTX 5090).
 

Deleted member 2731765

Guest
And still, the 4090 is the worst GPU ever made.

Can't comment on that, but I wasn't referring to the 4090 specifically here, just to Nvidia's "Ada Lovelace" architecture in general. I'm happy with my RTX 4060 card, though.

Never faced any serious issues with it so far. Just because some of the flagship GPUs are plagued by issues doesn't mean the entire 40-series lineup is in the same boat, IMO.
 

ekio

Reputable · Mar 24, 2021
AMD sells no cards because they aligned with Nvidia's pricing. Nobody (except Linux users) is gonna prefer an AMD card at the same price-per-performance as an Nvidia one…

They should sell them at literally half price; then they would become very attractive. Not the best drivers, not the best software ecosystem, but better hardware for a good price would trigger a switch…

But it seems they prefer to sell almost nothing at high margins rather than a lot at lower margins…
 
Case in point: I once purchased an ATI Radeon HD 4890 as an upgrade for my aging 7800 GTX. The software never installed correctly because the installer GUI kept crashing; I had to install the drivers via the command line. Oh, and the card outright died after five months of moderate use.

So yeah, that experience of moving away from NVIDIA, along with the continued reports of software problems (not to mention no access to DLSS and the like), is why I don't even consider an AMD GPU. Heck, it took until just last year for me to even consider one of their CPUs (now that Intel is clearly behind).
I mean, you're talking about a graphics card from 16 years ago; things have more than slightly changed since then. I avoided MSI motherboards for a while after one exploded on me 14 years ago: they used cheap, cheap, CHEAP VRMs and didn't even cool them, so the board couldn't handle a stock Phenom II X4 955 C3 revision, a chip it was rated for. I got over it, gave them another shot after a few years, and I've since used several in builds with no issues.

I get why you would avoid something for a little while, but avoiding it for over a decade and a half because of one bad experience is something else. Everyone makes a bad product or software revision now and again: the GeForce FX 5900 Ultra and GTX 480 happened, and so did the laptop GPUs and chipsets that ran so hot they desoldered themselves. Nvidia has also shipped drivers that bricked Windows installs and killed GPUs. No one is perfect.

I've been using mostly AMD GPUs in builds for family members over the last 4 or 5 years, since AMD has done a good job on pricing, and I haven't had any issues. Did AMD have major driver issues in the past? Definitely. But that hasn't been the case for years now, almost a decade, yet people keep repeating the same story even though it's no longer true.
 
AMD sells no cards because they aligned with Nvidia's pricing. Nobody (except Linux users) is gonna prefer an AMD card at the same price-per-performance as an Nvidia one…

They should sell them at literally half price; then they would become very attractive. Not the best drivers, not the best software ecosystem, but better hardware for a good price would trigger a switch…

But it seems they prefer to sell almost nothing at high margins rather than a lot at lower margins…
Yes and no. They still need to make money, and some of these cards are not cheap to make. Also, there was a time (and still is, really) when you could get an RX 6600 (XT) for MUCH less than an RTX 3060, and it didn't matter: the RTX 3060 still heavily outsold it even though the RX 6600 was the better deal. Nvidia has excellent mindshare right now, and pricing alone isn't going to break that.
 

husker

Distinguished · Oct 2, 2009
Try to ignore the desperate tone of an AMD owner here: a couple of things to keep in mind are that this article is about desktop GPU market share, and there are other, non-desktop markets that help AMD. Also, market share and profits are generally connected, but not always. Of course AMD wants more market share, but their margins may be better.
 

35below0

Respectable · Jan 3, 2024
And still, the 4090 is the worst GPU ever made. The amount of issues that SKU has is mind-blowing.

Cracking PCBs, GPU solder failures, 12VHPWR...

I'd take an XTX over a 4080 or a 4090 every day in the current gen... and that's without even talking about pricing...
You mean the best GPU ever made? And not by a little. It's only a bad card if you ignore performance. :)

In the whole history of GPUs there has never been anything like the 4090. The connector failures and other quality issues are real, though. There is some risk involved.
Also, Cities: Skylines 2 is gonna chug, but that's just a universal law at this point.

I personally wouldn't go as far as the 4090 because I don't need that kind of performance. A 7800 XT or 7900 GRE is probably the most anyone sane needs, but gaming and sanity never really met.
Some people want pure power, even if, within the 40-series lineup, the 4070 Ti Super is already plenty powerful and expensive enough.

And the "lowly" 4060 got ridiculed a lot but it's dangerously close to being competent for all but the most demanding games at 1440p Ultra+
¯\_(ツ)_/¯
 

ThomasKinsley

Prominent · Oct 4, 2023
Despite being on Team Red for as long as I can remember, three GPU failures made me switch. Poor fan design on two of them, combined with trippy behavior on an OEM prebuilt that helped tank the whole system, soured me. IMHO, AMD never should've bought ATI. ATI was the golden era; once AMD bought it, Team Red lost their edge and never got it back.
 

Silas Sanchez

Proper · Feb 2, 2024
And still, the 4090 is the worst GPU ever made. The amount of issues that SKU has is mind-blowing.

Cracking PCBs, GPU solder failures, 12VHPWR...

I'd take an XTX over a 4080 or a 4090 every day in the current gen... and that's without even talking about pricing...
I did a little more digging on this and found that one of the problems with the 12VHPWR plug is that it actually loosens up on its own. I have seen this on my own system, which has a 4080 Super: both ends of the plug (GPU and PSU) back out by ~0.5 mm over time. This happens with both the stock GPU cable and a CableMod 90-degree adapter, so every few weeks to months I have to push the plugs back in. That's scary, because the plug has so little insertion length to begin with that it won't take much further degradation down the line for it to start heating up. This is what people don't understand: just because a plug is made to a standard doesn't mean it's safe. The standard tells you little about the actual quality.
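To put rough numbers on why a marginal contact matters, here's a back-of-the-envelope I²R sketch. The 600 W figure and six 12 V pins follow the 12VHPWR spec; the contact resistances are assumed values for illustration, not measurements:

```python
# Back-of-the-envelope estimate of per-pin heating in a 12VHPWR plug.
# Contact resistance values below are assumptions for illustration only.

TOTAL_POWER_W = 600.0  # 12VHPWR rated power delivery
VOLTAGE_V = 12.0
POWER_PINS = 6         # six 12 V pins share the current

current_per_pin = TOTAL_POWER_W / VOLTAGE_V / POWER_PINS  # ~8.3 A per pin

for r_contact_mohm in (2, 10, 40):  # good contact vs. worn/backed-out contact
    p_dissipated = current_per_pin**2 * (r_contact_mohm / 1000)  # P = I^2 * R
    print(f"{r_contact_mohm:>3} mOhm contact -> {p_dissipated:.2f} W per pin")
```

A few milliohms works out to a fraction of a watt per pin; a degraded contact at tens of milliohms is a couple of watts concentrated in a tiny metal interface, which is where the melting stories start.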


This is it for me; I'm not wasting my time with clowns who don't know electrical engineering 101 basics. They (the corporate people) are undermining my life. NVIDIA should be ashamed, and I simply won't touch another of their cards until I know very, very well that there are no stories of anything bad going on.
 

abufrejoval

Reputable · Jun 19, 2020
And still, the 4090 is the worst GPU ever made. The amount of issues that SKU has is mind-blowing.

Cracking PCBs, GPU solder failures, 12VHPWR...

I'd take an XTX over a 4080 or a 4090 every day in the current gen... and that's without even talking about pricing...
CUDA pays for my GPUs these days, so there wasn't much choice.

What turned out to be rather mind-blowing was the progress DLSS brought to gaming.

I have an RTX 4090 and an RTX 4070 side by side in two 16-core Ryzen 9 systems: the 4070 on a 5950X and the 4090 now on a 7950X3D. Those CPUs are much closer in performance than I'd like at the moment, but what's been most impressive is how competent the 4070 has become at gaming in 4K.

My kids love ARK, so that's what we play: Evolved first and Ascend now.

Evolved runs on a very early Unreal Engine 4 build, suffers from terrible load times (on Windows) with its hundreds of thousands of files, and is 100% raster-only.

Ascend runs on Unreal Engine 5, collects its assets into a few large files, and supports DLSS up to v3, but none of the Intel or AMD alternatives.

My screen is a 42" 4K panel at less than arm's length, and lower resolutions just look awful on it.

Both ARKs were unplayable on the 4070 initially and far from smooth on the 4090.

But after giving the 5950X some PBO attention, I did another round with ARK Ascend on the 4070 at 4K, with full Epic effects but also DLSS on auto plus frame generation, and the result was rather impressive: there was simply no need to pay for the GPU monster just to kill or tame the monsters in ARK. A 4070 with DLSS 3 is just as good as an RTX 4090.

My first ATI card was a Mach8, the very first graphics accelerator I owned. I stuck with ATI until the R9 290X, and for some years I ran the Nvidia equivalent in parallel for comparison. But once CUDA became important, work would no longer sponsor an AMD/ATI choice.

The only dGPU in my stable not from team green is an Arc A770M, which only made it into my home lab because it came basically for free in an Intel NUC I bought as a CPU-only µ-server.

ARK Ascend is unplayable on Arc, and much of that may be the lack of XeSS support, judging from some Hogwarts Legacy tests I've been doing; that game supports every tensor-based upscaling variant currently on the market.

I love AMD for FreeSync's impact on monitor prices, and I appreciate Intel keeping everyone on their toes with XeSS.

But they are no meaningful competition for my use cases today.

As for hardware issues, I must just have been lucky. Nothing cracked or burned, but I've been extra careful about inserting cables fully, supporting the boards, and not transporting systems with boards plugged in.

The last real failure was a GTX 780 from MSI, I believe, which was factory-overclocked a bit too much and never really worked right.

Until I learned how to downclock it a notch, by which time it was too late to matter.
 
Looks mostly like clearing house, or in the case of Intel the house already being cleared, for the forthcoming generations. Hopefully things do end up more interesting in the back half of the year, and the dream is that there's some actual competition across the market dragging that "mid-range" performance back from $500+ to the $300–400 range.