News AMD Posts First Loss in Years as Consumer Chip Sales Plummet


bit_user

Polypheme
Ambassador
You showed the 7900, which has no 5900 counterpart, so what exactly are you showing there other than that overclocking makes efficiency bad?
No, it shows that Zen 4 isn't fundamentally inefficient. If there existed a 7950, that would've been ideal. Sadly, AMD doesn't make one, so we can either look at the fastest non-X 7000-series model, which is the 7900, or look at benchmarks like Anandtech's or Computerbase's, where they configured a lower TDP in the BIOS.

BTW, there is a 5900 (non-X).


the 5900X, which in your words has worse efficiency since it's an X part, having the exact same efficiency as the overclocked 7900 non-X part...
First, I'm sure you're aware that AM4 has lower power limits than AM5. Consequently, the 5900X is much more limited than the 7900X. Therefore, the 5900X isn't as far outside of its sweet spot as the 7900X.

In more general terms, you shouldn't expect X to mean the exact same thing or have the exact same impacts across generations. It's common sense, really. Maybe not to trolls...

And 45% above the 5900X result would be 31,000 points, which the 7900 non-X still doesn't quite reach even when overclocked.
You're not understanding what I said before, which is that I only needed to show that Zen 4 offers a significant performance improvement in at least one case. It needn't be the same case in which it offers an efficiency benefit. That was never stipulated.

Hey, all I have to do is show even one case where that's not true, in order to invalidate the assertion.
The assertion you claim to be challenging is one that was imagined by no one but yourself.

"Zen 4 is a relatively small gain over Zen 3 for a much higher cost"
You keep cutting that last part out to move the goalpost.
If part of the statement is untrue, then the whole statement is invalidated.

You showed that it has better performance; I showed that the cost for it is much higher.
You created your own definition of "cost". Moreover, why are you trying to compare the performance of Zen 4 @ 65 (88) W with Zen 3 at 105 (141) W? That's by no means a sensible comparison!
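To be concrete about what I mean by efficiency: it's just benchmark points per watt of package power. A minimal sketch, using placeholder numbers rather than measured results:

```python
# Minimal sketch of the points-per-watt metric in question.
# The scores below are placeholders, not measured benchmark results.

def efficiency(score_points: float, package_watts: float) -> float:
    """Benchmark points per watt of package power; higher is better."""
    return score_points / package_watts

# Stock limits let a 65 W TDP part draw up to ~88 W (PPT) and a 105 W TDP
# part up to ~141 W, so the two chips aren't being given the same budget.
zen4_88w  = efficiency(score_points=28_000, package_watts=88)   # placeholder
zen3_141w = efficiency(score_points=21_000, package_watts=141)  # placeholder

print(f"Zen 4 @ 88 W:  {zen4_88w:.0f} pts/W")
print(f"Zen 3 @ 141 W: {zen3_141w:.0f} pts/W")
```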
 
Well, it's gone way beyond being about AMD's lost revenue.

These threads always end up as AMD vs. Intel fan fights.
Unless it's certain that differences in performance, efficiency, or anything else for that matter have nothing to do with how well a product sells, then it's all on topic.

I mean, the only other thing we could be discussing is how much the weakened market has to do with it.
But the market was fine during 2017-2020 and AMD was still making very little money.
 

ottonis

Reputable
I remember reading somewhere that laptop ODMs require a lot more hand-holding and support than other types of platforms. I think AMD has been ramping up their internal resources needed to support this segment, but it takes a while to do.


I think they have put a lot into video compression performance on newer codecs, as we can clearly see from @JarredWaltonGPU's excellent comparison earlier this year:

If you look at H.265 and especially AV1, AMD has really closed the quality gap vs. Nvidia. Unfortunately, they don't seem to have revisited H.264 to improve the quality of that codec. Performance-wise, they now appear to be in the lead.

The problem AMD might now face is convincing app developers to support their hardware. I suspect many apps are using Nvidia's native APIs rather than going through DirectShow.
Many thanks for this very informative post! I am glad that AMD has worked on their video encoding engines, but in order to take away a piece of the "cake" from nVidia, it probably won't suffice to just close the gaps; they will need to actually beat nVidia by quite a significant margin in terms of quality and performance.
They might even introduce a special line of GPUs specifically optimized for video editing, since the content creator market is much less cost-sensitive than the gaming market. Someone who earns their money by editing videos every day will be happy to pay 30% more for a card that cuts their encoding times in half while improving video quality and file size at the same time.

And you are totally right with regard to the laptop market: these OEMs really need lots of hand-holding, and I can only hope that AMD recognizes how important this market segment is and that ramping up their R&D and support efforts there will more than pay off.
 
Someone who earns their money by editing videos every day will be happy to pay 30% more for a card that cuts their encoding times in half while improving video quality and file size at the same time.
Science has spent centuries trying to create the 'egg-laying wool-milk-pig'; it hasn't worked yet.
Some things are just unfeasible, and the ones that get closest are those with the most money for research, not the ones that want it the most.
 

bit_user

Polypheme
Ambassador
Many thanks for this very informative post!
All credit goes to Jarred, who put in the work, did the tests, and wrote the article.

I am glad that AMD has worked on their video encoding engines, but in order to take away a piece of the "cake" from nVidia, it probably won't suffice to just close the gaps; they will need to actually beat nVidia by quite a significant margin in terms of quality and performance.
It's important to understand exactly what he tested. From the intro:

"For our test software, we've found ffmpeg nightly to be the best current option. It supports all of the latest AMD, Intel, and Nvidia video encoders, can be relatively easily configured, and it also provides the VMAF (Video Multi-Method Assessment Fusion) functionality that we're using to compare video encoding quality. ...

We're doing single-pass encoding for all of these tests, as we're using the hardware provided by the various GPUs and it's not always capable of handling more complex encoding instructions. GPU video encoding is generally used for things like livestreaming of gameplay, while if you want best quality you'd generally need to opt for CPU-based encoding with a high CRF (Constant Rate Factor) of 17 or 18, though that of course results in much larger files and higher average bitrates."

If you want better quality, single-pass and ffmpeg might not be the place you want to start.
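For anyone curious what that kind of comparison looks like in practice, here's a rough sketch along the lines of what the article describes: a single-pass hardware encode, a slower CPU encode at CRF 18, and a VMAF score against the source. The encoder names and filter syntax are illustrative (they depend on your GPU and ffmpeg build), not the article's exact commands:

```python
# Rough sketch, not the article's actual test harness. Assumes ffmpeg is on
# PATH, built with libx265, libvmaf, and your GPU vendor's encoders.
import subprocess

SOURCE = "source.mp4"  # hypothetical input clip

def encode_gpu_single_pass(out="gpu_av1.mp4"):
    """Single-pass hardware AV1 encode (av1_nvenc shown; AMD would be av1_amf,
    Intel av1_qsv -- availability depends on your GPU and ffmpeg build)."""
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "av1_nvenc", "-b:v", "8M",  # one pass, fixed bitrate
        "-c:a", "copy", out,
    ], check=True)
    return out

def encode_cpu_crf(out="cpu_x265.mp4", crf=18):
    """CPU encode at a low CRF: better quality, but bigger files and far slower."""
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "libx265", "-crf", str(crf), "-preset", "slow",
        "-c:a", "copy", out,
    ], check=True)
    return out

def vmaf_score(encoded):
    """Score an encode against the original; ffmpeg prints the VMAF result."""
    subprocess.run([
        "ffmpeg", "-i", encoded, "-i", SOURCE,  # distorted first, reference second
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)

if __name__ == "__main__":
    vmaf_score(encode_gpu_single_pass())
    vmaf_score(encode_cpu_crf())
```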

They might even introduce a special line of GPUs specifically optimized for video editing, since the content creator market is much less cost-sensitive than the gaming market.
All 3 brands make workstation cards, if you want to pay more. Traditionally, Nvidia has nerfed the NVENCs in consumer GPUs, but I recall recently reading that they started enabling more capabilities. That restriction might just have been a limit on the number of concurrent encodes, rather than on single-stream encoding throughput. I don't know whether AMD or Intel similarly placed any restrictions on the video encoding performance of their non-workstation cards, but I sort of doubt it.
 

hannibal

Distinguished
If you hollow out the entry-level such that nobody can be bothered to try PC gaming beyond what can run on IGPs, you are likely destroying future prospects at the mid-to-high end.

It could, but the point is that most PC gamers have huge Steam accounts, so they will keep playing PC games in the future. What is happening is that the rate at which they upgrade, and what they upgrade to, will change.
People will keep their GPUs for a longer time; they will also move to lower-tier GPUs or try to buy hardware that's a couple of generations old.
This definitely slows down the development of GPUs and of the games themselves, but it does not end PC gaming.
I was sure that I would replace my current 5700 XT in 2023 or 2024… Nope, not going to do that. Now the timetable seems to be 2025 or 2026, and I don't expect to see better prices by then! But I do expect to get a decent speed bump even from an xx60 or x600 series card by then, or I will buy an xx70 or x700 level GPU (most likely $1,000 by then) that will be used for many, many years…
 

sitehostplus

Honorable
What debacle? Nvidia's stock is at a one-year record, while everyone and Elon Musk are buying their GPUs to make and deploy AI models... at this rate their data center revenue will more than double their gaming GPU revenue soon enough.
Here, let's put the brakes on that a little, shall we?

nVidia isn't doing as well as you think.

They've got the same problems as the rest of the tech industry right now: slackening demand and soaring inventory.

The only reason nVidia's stock has been going up is the hype surrounding AI, which they are a major player in. But that isn't translating into much in the way of sales right now, especially with what could be a nasty recession coming.
 

RedBear87

Commendable
Here, let's put the brakes on that a little, shall we?

nVidia isn't doing as well as you think.

They've got the same problems as the rest of the tech industry right now: slackening demand and soaring inventory.

The only reason nVidia's stock has been going up is the hype surrounding AI, which they are a major player in. But that isn't translating into much in the way of sales right now, especially with what could be a nasty recession coming.
Even assuming that the analysis is correct (and I would disagree that AI isn't translating into sales right now), it still wouldn't say anything about a specific and peculiar Nvidia debacle caused by its "obscene prices", as PEnns was arguing above...
 

cyrusfox

Distinguished
The only reason nVidia's stock has been going up is the hype surrounding AI, which they are a major player in. But that isn't translating into much in the way of sales right now, especially with what could be a nasty recession coming.
We will see on May 24th when they announce quarterly earnings... But with the A100 being peerless, CUDA well established, and the ChatGPT craze, I would not bet against NVIDIA, nor would I be surprised if they still turn a profit while AMD and Intel reported losses this quarter.
 

sitehostplus

Honorable
We will see on May 24th when they announce quarterly earnings... But with the A100 being peerless, CUDA well established, and the ChatGPT craze, I would not bet against NVIDIA, nor would I be surprised if they still turn a profit while AMD and Intel reported losses this quarter.
If they report a profit at all, expect a slim one.

AI is expected to be huge down the road, not immediately. And there is a niggling problem with growing unsold inventory, which sounds about right in light of the reportedly bombed launch of the 4070.

I suspect I'm going to be kicking myself for buying a 4080 so soon. 😭
 

sitehostplus

Honorable
If anything, they'll probably just shave like $100 off of it. All you need to do is get $100 worth of usage out of it between the time you bought it and that price cut. If you do that, then there's no need to kick yourself.
Tbh, I see some major pain coming for the economy soon.

This inflation fight is going to carry on for a lot longer than most people expect, and there will be fallout a-go-go from it.

In that climate, $100 price cuts won't be enough. And I wouldn't expect any such cuts unless there is a dire need to make room for newer chips.

I'll have at least a year's worth of usage out of this board before they start cuts. And I will still kick myself for not buying a $200 card to wait it out. 🤗