News AMD crowns the Ryzen 7 9800X3D a ‘gaming legend’ in a surprise announcement — chipmaker claims $479 Zen 5 3D V-Cache chip is up to an average 20% f...

"this is 2nd Gen AMD 3D V-Cache technology"
This would be 3rd gen, no? The first being the 5800x3d
Technically, they're not wrong. The TSVs and how they stacked the V-Cache on top of both the 7800X3D and 5800X3D are the same. I believe even the cache itself is the exact same one.

The TSVs for this one were reworked, so, to be fair to them, it does deserve the "2nd gen" moniker.

Regards.
 

KraakBal

Technically, they're not wrong. The TSVs and how they stacked the V-Cache on top of both the 7800X3D and 5800X3D are the same. I believe even the cache itself is the exact same one.

The TSVs for this one were reworked, so, to be fair to them, it does deserve the "2nd gen" moniker.

Regards.
Are you sure about that? The 5800X3D is a 7 nm die, while Zen 4 is 5 nm.
Different dies, different TSV sizes and layout.
This video from High Yield goes into detail with microscope shots:
https://youtu.be/bPLKa4crk8A
 
Are you sure about that? The 5800X3D is a 7 nm die, while Zen 4 is 5 nm.
Different dies, different TSV sizes and layout.
This video from High Yield goes into detail with microscope shots:
https://youtu.be/bPLKa4crk8A
Yes, I'm sure.

Ugh, let me find the information.

EDIT: https://www.tomshardware.com/news/what-7800x3d-looks-like-with-infared

The cache in Zen 4 is the same 7nm cache as Zen 3; the difference is they had to put the TSVs on top of the L2 cache as well and cover more of the die with them. I think they may have improved the arrangement of the SRAM die a bit, but overall it's the same.

The 7800X3D is more of a gen 1.5 than a gen 2, if you want to go there, I guess. So it'll depend on how many variables you want to account for, and which ones determine what counts as a "leap" versus just an "improvement" over the previous generation.

EDIT2: Dr. Ian Cutress has a great video explaining why this does deserve the "gen 2" moniker. Very worth a watch alongside the link I pasted above:

https://www.youtube.com/watch?v=4pGDEYApniU


Regards.
 

Giroro

So... Which definition of legend do we think AMD is using:

LEGEND
noun
  1. a nonhistorical or unverifiable story handed down by tradition from earlier times and popularly accepted as historical.
    Antonyms: fact
  2. the body of stories of this kind, especially as they relate to a particular people, group, or clan:
    the winning of the West in American legend.
  3. an inscription, especially on a coat of arms, on a monument, under a picture, or the like.
  4. a table on a map, chart, or the like, listing and explaining the symbols used. Compare key 1( def 8 ).
  5. Numismatics. inscription ( def 8 ).
  6. a collection of stories about an admirable person.
  7. a person who is the center of such stories:
    She became a legend in her own lifetime.
  8. Archaic. a story of the life of a saint, especially one stressing the miraculous or unrecorded deeds of the saint.
  9. Obsolete. a collection of such stories or stories like them.

It's probably the first one, right? You know... the definition that literally means they can't prove their claims.
 
I may be missing it, but what resolution were those AMD tests run at? 1080P? Hopefully we'll get some better tests from reviewers in the near future.
Almost guaranteed they were using 1080p, with a slim chance of 720p. There is little to no point in testing higher resolutions, because then you become GPU bound and can't determine which CPU is faster.
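As a toy illustration of why low-resolution testing matters (all numbers below are made up for the example, not benchmarks): the frame rate you observe is roughly capped by whichever component is slower, so a fast GPU at 1080p exposes the CPU difference, while at 4K both CPUs look identical.

```python
# Toy bottleneck model: observed FPS is capped by the slower component.
# All frame rates here are illustrative values, not real benchmarks.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Observed frame rate is bounded by the slower of CPU and GPU."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 200.0, 240.0  # two hypothetical CPU-limited frame rates

# At 1080p the GPU has headroom, so the CPU difference shows up:
gpu_1080p = 400.0
print(effective_fps(cpu_a, gpu_1080p), effective_fps(cpu_b, gpu_1080p))  # 200.0 240.0

# At 4K the GPU becomes the bottleneck and both CPUs score the same:
gpu_4k = 120.0
print(effective_fps(cpu_a, gpu_4k), effective_fps(cpu_b, gpu_4k))  # 120.0 120.0
```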

Maybe GPU bound. They aren't using a 4090 but a 7900 XTX.
What do you think of AMD's claim that the 9800X3D is 59% faster in Cyberpunk 2077 compared to a 285K? I think the only way that could be possible is if they used very specific areas to benchmark the game. Who knows if they used the in-built benchmark tool for a more fair comparison.

[AMD benchmark slide]
 

TheHerald

Almost guaranteed they were using 1080p, with a slim chance of 720p. There is little to no point in testing higher resolutions because then you become GPU bound and are not able to determine which CPU is faster than another.


What do you think of AMD's claim that the 9800X3D is 59% faster in Cyberpunk 2077 compared to a 285K? I think the only way that could be possible is if they used very specific areas to benchmark the game. Who knows if they used the in-built benchmark tool for a fairer comparison.

[AMD benchmark slide]
A lot of their numbers don't make sense compared to the 285K reviews. For example, in Starfield, according to HUB the 285K is the fastest chip; according to this, the 9800X3D is 31% faster. Maybe the very slow RAM they are using negatively impacts the 285K.
 
A lot of their numbers don't make sense compared to the 285K reviews. For example, in Starfield, according to HUB the 285K is the fastest chip; according to this, the 9800X3D is 31% faster. Maybe the very slow RAM they are using negatively impacts the 285K.
Do you think maybe it has anything to do with them using a 7900 XTX over a 4090? There is more CPU overhead with the Nvidia drivers, right? Who knows, we will just have to wait for the reviews.

If I had to guess, I would expect the 9800X3D to be about 5% faster than the 7800X3D clock for clock, because that's about how much faster a 9000-series CPU is over a 7000-series CPU. If you add in the clock increases from the 7800X3D to the 9800X3D, it should only be another 8-12%. Compounded, the 9800X3D should theoretically be about 13-18% faster than the 7800X3D. I don't understand how that translates to these numbers in an actual use case.
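Worth noting the two uplifts multiply rather than add. Taking the guessed figures above at face value (5% IPC, 8-12% clocks, both assumptions from the post, not measurements):

```python
# Compounding an assumed IPC uplift with an assumed clock uplift.
# Gains multiply, so 5% IPC and 8-12% clocks land around 13-18%,
# not 5 + 12 = 17 "additively". Inputs are guesses, not benchmarks.

def combined_uplift(ipc_gain: float, clock_gain: float) -> float:
    """Total speedup from two independent multiplicative gains."""
    return (1 + ipc_gain) * (1 + clock_gain) - 1

low = combined_uplift(0.05, 0.08)   # 1.05 * 1.08 - 1 = 0.134
high = combined_uplift(0.05, 0.12)  # 1.05 * 1.12 - 1 = 0.176
print(f"{low:.1%} to {high:.1%}")   # prints: 13.4% to 17.6%
```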
 

TheHerald

Do you think maybe it has anything to do with them using a 7900 XTX over a 4090? There is more CPU overhead with the Nvidia drivers, right? Who knows, we will just have to wait for the reviews.
Could be anything, maybe there are areas in starfield that performance plummets on 285k like in cyberpunk, really can't tell until I get my hands on one and test it myself.

If I had to guess, I would expect the 9800X3D to be about 5% faster than the 7800X3D clock for clock, because that's about how much faster a 9000-series CPU is over a 7000-series CPU. If you add in the clock increases from the 7800X3D to the 9800X3D, it should only be another 8-12%. Compounded, the 9800X3D should theoretically be about 13-18% faster than the 7800X3D. I don't understand how that translates to these numbers in an actual use case.
Clockspeed increases don't directly translate to performance. Games are primarily memory/cache bound; having the CPU run faster but with no data to process doesn't do much. For example, in a very, very old video of mine, just skip to the end and check the silly graph I made: giving the CPU more data to chew on performs a lot better than having less data but chewing through it faster. A 20% difference in clockspeed resulted in a 4% performance increase, lol

 
Maybe the very slow ram they are using negatively impacts the 285k.
The slides at the end of the presentation show the 285K with 32GB of DDR5-6400, so that isn't "very slow RAM" by any means.

Do you think maybe it has anything to do with them using a 7900 XTX over a 4090? There is more CPU overhead with the Nvidia drivers, right? Who knows, we will just have to wait for the reviews.
The slides actually show that some tests were done using the RTX 4090 in October. It looks like most of the games tested for the 9800X3D vs 285K were run on the 4090, and the games comparing the 7800X3D and 9800X3D were on the 7900 XTX.

The end of the Twitter presentation has the specs for all systems.
https://x.com/JackMHuynh/status/1851970613604126789
 
Of course it is. That's the minimum guaranteed ram speed for core ultra.
That figure is with CUDIMMs; with regular (non-clock-driver) UDIMMs it's still 5600MT/s. So AMD actually gave them a better-than-worst-case memory config. The real question is whether AMD was running its own chips at 5600MT/s or at the same CUDIMM speed.

Remember, AMD doesn't officially support CUDIMMs, but their IMC can handle them (they literally said "tolerate").

I'll have to read the fine print of the whole thing. Ugh... So annoying.

Regards.
 
Of course it is. That's the minimum guaranteed ram speed for core ultra.
6400MHz isn't the minimum guaranteed RAM speed; that is the MAXIMUM guaranteed RAM speed. Even Intel's Ark says Maximum Memory Speed: 6400MHz. https://www.intel.com/content/www/u...-36m-cache-up-to-5-70-ghz/specifications.html

While Tom's Hardware was able to test with 7200MHz UDIMMs, that doesn't mean everyone will get that speed. Odds are the Ultra 9s have the best chance of hitting RAM speeds beyond 6400MHz, but again, that isn't guaranteed. I've seen people complain that they couldn't get 6000MHz RAM to work on a Ryzen 7600 at rated speeds, but it would work at 5600MHz. They didn't win the silicon lottery, so their RAM controller couldn't go to 6000MHz. This is why even for the Ryzen 9000 series the maximum guaranteed speed is only 5600MHz. Therefore, AMD testing at 6400MHz isn't using "slow RAM" despite what you think.
 

TheHerald

6400MHz isn't the minimum guaranteed RAM speed; that is the MAXIMUM guaranteed RAM speed. Even Intel's Ark says Maximum Memory Speed: 6400MHz. https://www.intel.com/content/www/u...-36m-cache-up-to-5-70-ghz/specifications.html

While Tom's Hardware was able to test with 7200MHz UDIMMs, that doesn't mean everyone will get that speed. Odds are the Ultra 9s have the best chance of hitting RAM speeds beyond 6400MHz, but again, that isn't guaranteed. I've seen people complain that they couldn't get 6000MHz RAM to work on a Ryzen 7600 at rated speeds, but it would work at 5600MHz. They didn't win the silicon lottery, so their RAM controller couldn't go to 6000MHz. This is why even for the Ryzen 9000 series the maximum guaranteed speed is only 5600MHz. Therefore, AMD testing at 6400MHz isn't using "slow RAM" despite what you think.
Of course they are using slow RAM for Intel when they're testing their own chips with 6000C30. Man, just try to be objective for once; it ain't gonna harm you.
 
While I don't doubt that the R7-9800X3D will outperform the R7-7800X3D, recent events have made me doubt AMD's marketing team. I remember when Ryzen first came out and AMD had adopted a "just tell the truth" approach to marketing. Review sites actually gave credence to AMD's performance claims because they always turned out to be accurate. That approach worked so well for them that it boggles my mind they abandoned it. If people don't trust your word, it just hurts you as a company.