News AMD Can't Beat Ada, So Brags About Old Ampere Comparisons

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
I guess that depends on when you upgrade. I went from a GeForce 3 to a Radeon 9800 Pro that embarrassed the GeForce FX that was "competing" against it at the time. The Radeon HD 4000/5000 series come to mind as well.

It's been so long I've forgotten what I had after the GeForce 4 Ti 4600... I remember that one because of the sick Wolfman demo that Nvidia put out. 🤣🤣🤣

Anyway... I don't live in the past, and as said upthread, I went with an AMD processor in this build for the first time since that GeForce 3 system in 2001. With the new AM5 platform, AMD definitely has a processor that can run with the 13900K, so that's the route I took for the better upgrade path, lower power draw, and lower thermals.

Unfortunately the same can't be said for the GPU market. Nothing touches the 4090... so that's what I bought.

Hey AMD... if you make a better product... people will buy it.
 
And we do.

Because most people don't care to spend halo-level money on the best-of-the-best, money-is-no-object card.

When cost is factored in, the 4090 is NOT the better product.

My 3DMark scores, which are 80% higher than my 3090 rig's, would say otherwise... but it's cool... we can agree to disagree.

It was all part of Nvidia's plan... make one 4000-series card that is a quantum leap over the previous-gen flagship while keeping the rest of the lineup mediocre, and the shareholders win.

I personally don't care either way... I just want the best hardware. If AMD made a better card performance-wise, I'd buy it. It's not rocket science.
 
I personally don't care either way... I just want the best hardware. If AMD made a better card performance-wise, I'd buy it. It's not rocket science.
Conversely, after building PCs for more than 30 years as an enthusiast, always using the best parts money can buy, now I just buy what I feel like. This time around, it's a couple of generations of AMD hardware (for the first time in my life). It simply doesn't matter that the performance isn't the same as a 4090's; I could buy a hundred 4090s and it wouldn't necessarily improve my gaming experience.

Like mobile phones: I used to have the best and most expensive, every year, but now I'm using dirt-cheap Samsung models, and it's not spoiling my mobile phone experience...

I hate to imagine how much money I'd have now if I had just half of what I'd spent, back.

Resale costs don't matter either, as I usually just give old stuff away.
 
I could buy a hundred 4090s and it wouldn't necessarily improve my gaming experience.

Like mobile phones: I used to have the best and most expensive, every year, but now I'm using dirt-cheap Samsung models, and it's not spoiling my mobile phone experience...

Everyone's experience is different. Some guys gotta have 144 fps or 240 fps or whatever it is nowadays... for me, I'm all about eye candy, so I prefer ultra settings at 4K 60 fps. The only card that reliably does that is the 4090.

I used to get a new phone every year... but I haven't done that since the iPhone 5. Now it's every 2-3 years. It has nothing to do with cost... phone hardware is just so good now that unless you're a camera freak, upgrading every year doesn't make sense.

I just went from the 6.1-inch iPhone 12 to the 6.7-inch 14 Plus... my first taste of the big screen, which came out what? 4 years ago?

I care more about keeping my PC hardware up to date than my phone.
 
Riddle me this...

1. Why does nVidia have about an 80% market share in computer graphics?

2. Since 1998, almost every computer component has become stupid cheap in pricing, except for video cards, whose prices have tripled. Why is that?
1. AMD is a smaller company with a smaller capitalization and background. Add to that a rather shy management strategy of low risk-taking and mostly playing it safe. They never attempted to be the top dog to start with, basically. This doesn't mean they don't sell products with superior price/performance from time to time, since they do, but they just aim to sell a fixed stock amount and call it a day, not beat Nvidia in market share.

2. Due to the first point, Nvidia is a soft monopoly, especially with the branding cult it has formed in recent years, mimicking a bit of the Apple effect in some markets. But the market's demands have also intensified, wanting exponentially more computing power for things like 4K or 200+ FPS, so today's GPUs need more budget invested in them to meet those demands.

To further the 2nd point, entry-level GPUs with a budget similar to older cards are still being made and released, but nobody wants to invest that little in the desktop market anymore, since they can just grab a console, a tablet, or one of those cute mini-PCs or APU laptops these days instead.
 
When cost is factored in, the 4090 is NOT the better product.
You can't make value-judgments for other people! So many flame wars come down to this simple fact.

I just want the best hardware. If AMD made a better card performance-wise, I'd buy it. It's not rocket science.
It's funny... if you look at how many countries have a space program, it would lead one to conclude that building truly competitive GPUs is indeed harder!

With GPUs, there are so very many things you have to get right, in order to get into the ballpark on perf/$ and perf/W. The faltering, initial efforts of Intel & the Chinese GPU makers serve as reminders of this fact.
 
Riddle me this...

1. Why does nVidia have about an 80% market share in computer graphics?
I think the big turning point for Nvidia was around the time of Maxwell. That's when their Tegra program forced them to focus keenly on energy-efficiency. Maxwell was the first big beneficiary of that focus, with tiled-rendering being a specific example. They doubled-down on this, in Pascal.

Then, having built such a lead, they decided they could afford to shift their focus to new technologies, in Turing. So, they added Tensor cores and Raytracing. Now, AMD had not only a performance gap to close, but they also needed to invest in competing on those new feature sets.

RDNA was AMD's response to the efficiency problem, but it was a long time in coming. Pascal launched in 2016, RDNA came 3 years later. AMD is still playing catch-up on raytracing and deep learning accelerators.

In other markets, Nvidia did a brilliant job of jumping on the deep learning bandwagon, early. They made CUDA stable and supported it on virtually all of their hardware, which made it the natural platform of choice for deep learning researchers. AMD was also suffering, financially, right when Nvidia was making these crucial investments in CUDA and deep learning.

2. Since 1998, almost every computer component has become stupid cheap in pricing, except for video cards, whose prices have tripled. Why is that?
First, graphics is an "embarrassingly parallel" problem, which means you can keep throwing more silicon at it and still get decent performance increases. If you plot performance per transistor of GPUs over time, it would be much more linear than with CPUs - that's for sure!

Also, I take issue with the assertion that PC components have really become that much cheaper. Increased integration helped reduce PC prices a lot. But, it seems like decent-performance CPUs never really got that cheap, and prices certainly have been on the rebound since the core-count race really got going.
 
You can't make value-judgments for other people! So many flame wars come down to this simple fact.

Amen.

I commented yesterday about the differences between AMD's top card and the 4090... the performance gap is pretty wide. Is that worth the difference in price? To me it is. I need a GPU that can do 4K ultra 60 fps without hiccups... and that's the 4090.

But according to the YouTube trolls, I'm part of the problem for buying Nvidia's overpriced flagship card instead of pairing my 7950X3D with a 4060 Ti. 🤣
 
I think the big turning point for Nvidia was around the time of Maxwell. That's when their Tegra program forced them to focus keenly on energy-efficiency. Maxwell was the first big beneficiary of that focus, with tiled-rendering being a specific example. They doubled-down on this, in Pascal.

Then, having built such a lead, they decided they could afford to shift their focus to new technologies, in Turing. So, they added Tensor cores and Raytracing. Now, AMD had not only a performance gap to close, but they also needed to invest in competing on those new feature sets.

RDNA was AMD's response to the efficiency problem, but it was a long time in coming. Pascal launched in 2016, RDNA came 3 years later. AMD is still playing catch-up on raytracing and deep learning accelerators.

In other markets, Nvidia did a brilliant job of jumping on the deep learning bandwagon, early. They made CUDA stable and supported it on virtually all of their hardware, which made it the natural platform of choice for deep learning researchers. AMD was also suffering, financially, right when Nvidia was making these crucial investments in CUDA and deep learning.


First, graphics is an "embarrassingly parallel" problem, which means you can keep throwing more silicon at it and still get decent performance increases. If you plot performance per transistor of GPUs over time, it would be much more linear than with CPUs - that's for sure!

Also, I take issue with the assertion that PC components have really become that much cheaper. Increased integration helped reduce PC prices a lot. But, it seems like decent-performance CPUs never really got that cheap, and prices certainly have been on the rebound since the core-count race really got going.
In 2000, when Kyle Bennett was toasting CPUs for page views, a top-of-the-line consumer CPU would set you back $1,200. Today, that cost has declined to under $700.

In 2000, RAM cost $0.90 per megabyte. Today it's $0.0015. Source

Premium sound is so cheap, it's pre-installed on virtually every computer motherboard you can buy. Show me a motherboard made today that doesn't at least offer 5.1 sound onboard.

Shall I go on? I've got a lot more where that came from.

The only thing I've actually found that has gone up in price? The video cards. The only computer company that doesn't have any real competition is nVidia, and it shows. A top-of-the-line 3D card back in the day cost $500. Today, I sold my soul for a 4080 at $1,200, and that isn't even top of the line!

If AMD were actually competitive with nVidia, graphics cards would cost a lot less.
 
In 2000, when Kyle Bennett was toasting CPUs for page views, a top-of-the-line consumer CPU would set you back $1,200. Today, that cost has declined to under $700.
Comparing the top end gets tricky, because Intel tried to have an Extreme model line, but that was a very niche product and separated by a pretty wide gulf from the rest of their CPUs. That was essentially the equivalent of a HEDT CPU, though there was much less differentiating it than the modern understanding of the category.

In 2000, RAM cost $0.90 per megabyte. Today it's $0.0015. Source
Well, if that's how you're going to account for costs, then you should really be normalizing CPU costs by Dhrystone MIPS or something, and comparing the MIPS/$.

As a matter of fact, if you're going to price RAM by the megabyte, why aren't you pricing GPUs by the GigaFLOPS or GigaTexel/s?
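To make the normalization idea concrete, here's a back-of-the-envelope sketch. The RAM figures come from the thread itself; the GPU prices and GFLOPS numbers are round hypothetical placeholders, not real SKU specs:

```python
# Back-of-the-envelope "price per unit of capability" comparison.
# RAM $/MB figures are from the thread; GPU figures are hypothetical
# round numbers chosen only to illustrate the normalization.

ram_2000 = 0.90      # $/MB in 2000
ram_today = 0.0015   # $/MB today
print(f"RAM $/MB fell roughly {ram_2000 / ram_today:.0f}x")

# Same idea applied to GPUs: dollars per GFLOPS instead of sticker price.
flagship_then = {"price_usd": 500, "gflops": 50}       # placeholder, early card
flagship_now = {"price_usd": 1600, "gflops": 80_000}   # placeholder, today
for label, gpu in (("then", flagship_then), ("now", flagship_now)):
    print(f"{label}: ${gpu['price_usd'] / gpu['gflops']:.4f} per GFLOPS")
```

By that measure the sticker price tripling can coexist with the cost per GFLOPS falling by orders of magnitude, which is the point of normalizing at all.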

Premium sound is so cheap, it's pre-installed on virtually every computer motherboard you can buy. Show me a motherboard made today that doesn't at least offer 5.1 sound onboard.
That's what I meant by integration leading to cheaper PC prices. You no longer need a separate sound card, network card, or IDE controller card, for instance.

The only thing I've actually found that has gone up in price? The video cards.
And mainstream CPUs. For a long time, you'd get an Intel i7 for about $300 - $350. Now, the equivalent - the i9 - costs about $600.

I think mid-market hard drives are also more expensive than they used to be, although they certainly did get cheaper than they were ~20 years ago.

If AMD were actually competitive with nVidia, graphics cards would cost a lot less.
Not sure how much, though. 10% or even 20%? Maybe. But, not like what you're talking about. Just look at the die sizes and production nodes. They're never going to be cheap.

Also, an increasing amount of the cost of these things is going into the software, which is orders of magnitude more complex than it was 2 decades ago.
 
Well... they were late in bringing their ray tracing and tensor/matrix cores to the party. Also, I think Nvidia beat them to the punch on mesh shaders. And I think Intel had packed arithmetic before either AMD or Nvidia.
In 2000, when Kyle Bennett was toasting CPUs for page views, a top-of-the-line consumer CPU would set you back $1,200. Today, that cost has declined to under $700.

In 2000, RAM cost $0.90 per megabyte. Today it's $0.0015. Source

Premium sound is so cheap, it's pre-installed on virtually every computer motherboard you can buy. Show me a motherboard made today that doesn't at least offer 5.1 sound onboard.

Shall I go on? I've got a lot more where that came from.

The only thing I've actually found that has gone up in price? The video cards. The only computer company that doesn't have any real competition is nVidia, and it shows. A top-of-the-line 3D card back in the day cost $500. Today, I sold my soul for a 4080 at $1,200, and that isn't even top of the line!

If AMD were actually competitive with nVidia, graphics cards would cost a lot less.
They are - but the GPU market is one where MINE'S BIGGER! matters more in marketing than I CAN KEEP IT UP FOR AGES! In short, you can draw a parallel with the US car market: when people have some cash to spare, they buy big and loud to one-up the neighbour; when they don't, they make do, but they sure as heck won't brag about it.
I recently retired my Radeon RX 480 8GB. At the beginning, my fellow gamers were all "LOL I got a GeForce 1080 Ti it kills everything you puny mortal LOL you suxxx0rz". In 2020, they were all like "I sold my 2080 because I needed cash and I can't buy a new GPU, how am I gonna keep busy during lockdown because the IGP can't game"...
And I played Doom Eternal at 1440p, medium details with Nightmare textures, quite happily on my old GPU. I could have sold it for more than I had bought it new, but I considered it an investment in a solid gaming experience for the next 2-3 years... I sure as heck didn't expect to keep it in my main rig for 6 years!
 
Well, if that's how you're going to account for costs, then you should really be normalizing CPU costs by Dhrystone MIPS or something, and comparing the MIPS/$.
I will admit to one error in my argument.

I should have priced CPUs by the core. Since the Athlon I was comparing had only one core, and modern CPUs have multiple cores, that would have really shown how stupid cheap CPUs are getting.
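A quick sketch of that per-core math, using the thread's own approximate figures (a ~$1,200 single-core flagship in 2000 versus a ~$700 16-core flagship today; these are the round numbers from the discussion, not exact SKU prices):

```python
# Per-core pricing comparison using the thread's approximate figures.
cpu_2000 = {"price_usd": 1200, "cores": 1}   # single-core flagship, ~2000
cpu_today = {"price_usd": 700, "cores": 16}  # 16-core flagship, today

per_core_then = cpu_2000["price_usd"] / cpu_2000["cores"]   # 1200.0
per_core_now = cpu_today["price_usd"] / cpu_today["cores"]  # 43.75

print(f"Price per core fell roughly {per_core_then / per_core_now:.0f}x")
```

Of course, a modern core also does far more work per clock than a 2000-era Athlon core, so even this normalization understates the improvement.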
 
If AMD were actually competitive with nVidia, graphics cards would cost a lot less.

Not that much less. AMD is out to make money just like Nvidia.

The days of the $699 flagship card, i.e. the 1080 Ti in 2017, are long gone.
 
Amen.

I commented yesterday about the differences between AMD's top card and the 4090... the performance gap is pretty wide. Is that worth the difference in price? To me it is. I need a GPU that can do 4K ultra 60 fps without hiccups... and that's the 4090.

But according to the YouTube trolls, I'm part of the problem for buying Nvidia's overpriced flagship card instead of pairing my 7950X3D with a 4060 Ti. 🤣
The issue I have is that you said "Hey AMD... if you make a better product... people will buy it."

You can't just define "whoever's top-of-the-line halo-product performs better" as a judgment across their entire product stack. That is an exceptionally narrow definition which covers only a tiny portion of the GPU consumer market.
 
The issue I have is that you said "Hey AMD... if you make a better product... people will buy it."

You can't just define "whoever's top-of-the-line halo-product performs better" as a judgment across their entire product stack. That is an exceptionally narrow definition which covers only a tiny portion of the GPU consumer market.
I used to use and recommend AMD's graphics cards. I'm the guy who came up with the name fanATIc to describe myself and other fans of the Radeon brand.

Today, I use and recommend nVidia. If AMD were to put out a better product (I use their Ryzen CPUs), I'd definitely buy it.
 
The issue I have is that you said "Hey AMD... if you make a better product... people will buy it."

You can't just define "whoever's top-of-the-line halo-product performs better" as a judgment across their entire product stack. That is an exceptionally narrow definition which covers only a tiny portion of the GPU consumer market.

Somebody beat me to my reply.


I used to use and recommend AMD's graphics cards. I'm the guy who came up with the name fanATIc to describe myself and other fans of the Radeon brand.

Today, I use and recommend nVidia. If AMD were to put out a better product (I use their Ryzen CPUs), I'd definitely buy it.

Total agreement. ^


Don't believe what nVidia tells you. They price that stuff the way they do because they can, and they know people have to pay for it. What else is there?

You're not wrong. I know I sound like an Nvidia shareholder with my comments, but I'm not... I just want the best hardware, and while AMD is competitive with Intel (hence my recent purchase of one of their processors for the first time in 22 years), they unfortunately aren't competitive with Nvidia on the high end of GPUs.

If they were... I'd buy it too.
 
A750 is down to $230, was $210 a few days ago. That or the RX6600. I don't think much upcoming is going to meet your price requirements.
I know. I'm in Canada though, so any price variations in the USA will take a month or two to make their way across the border assuming they stick in the first place. And I meant $300 CAN. Taxes bring both the RX6600 and A750 close to $400 CAN.
 
I know I sound like an Nvidia shareholder with my comments, but I'm not... I just want the best hardware, and while AMD is competitive with Intel (hence my recent purchase of one of their processors for the first time in 22 years), they unfortunately aren't competitive with Nvidia on the high end of GPUs.
I have no real allegiance to any company, though I promised myself a couple of AMD builds (for the first time since the old Athlon days) - so I choose to give at least passing consideration to the fact that AMD is fighting on too many fronts as the "smallest" of the three companies - CPU, dGPU, APU, server, software, etc. To actually win on all fronts at the same time is probably asking a bit too much, though IMO they do have plenty of money to do better.

Whilst of course being disappointed when AMD drops the ball and competition stagnates for another year, I'm not going to beat on them for it.

Likewise, my own last build was in 2018, with a Threadripper 2950X, and my wife's was a 5930K hand-me-down (both on Asus ROG boards). I was supposed to upgrade straight to a 3960X, but of course AMD abandoned the X399 platform. So I'm still waiting, as I'm not impressed enough with the 13900K/S from Intel, and I'm a bit on the fence about the 7950X/X3D. That said, I just built the wife an AM5 system around a 7700X and an ASRock X670E Taichi Carrara, fully water-cooled (AU$4,000 without a GPU), and it's great that this little 8-core chip outclasses the old Threadripper. Even then, AMD (IMO) dropped the ball by making the IHS too thick in the name of cooler backward compatibility, so we end up with spiky temperatures... C'est la vie!

I guess the point is: unless you suddenly die, or go bankrupt, or are the kind of person (no offense intended) who needs to buy best-bang-for-buck and make it last 5 years or more, there'll be another new bit of hardware next year. I say try other things; it's your hobby.
 
You're not wrong. I know I sound like an Nvidia shareholder with my comments, but I'm not... I just want the best hardware, and while AMD is competitive with Intel (hence my recent purchase of one of their processors for the first time in 22 years), they unfortunately aren't competitive with Nvidia on the high end of GPUs.

If they were... I'd buy it too.
I think we all want the best hardware. I once used AMD in my desktop, my laptop, and anything else I could find. Then I switched to Intel for several cycles because they were better.

This last go-around, I switched to AMD again. The Ryzen 9 7950X3D is neck and neck with the i9 in terms of performance, so it came down to energy efficiency. In the past that wouldn't have mattered much, but after dealing with $300 utility bills, guess what won out? 🤣

If AMD put out a better graphics card, I'd switch in a heartbeat.