News AMD teases Ryzen 9000X3D chip coming November 7, cuts pricing on all other Ryzen 9000 chips


Peksha

Prominent
Sep 2, 2023
45
33
560
Take your low-effort, lame trolling somewhere else. It's just embarrassing; this isn't Reddit. I played both games at 5120x1440 super-ultrawide, no DLSS, Ultra settings with ray tracing, at over 100 FPS on a 4090. Not that anyone with any clue would need to be told.
Path Tracing is disabled in your configuration
 
Worked on me :ROFLMAO:

Which is why I rejected the entire series, and why I'm worried they'll do it again with the 50 series. The 4060 / 4070 were deliberately handicapped on memory bus width and the 4080 was overpriced. You didn't want to touch the 4060/4070 because of their 128-bit and 192-bit memory buses, and if you were going to buy the 4080 (256-bit) you might as well buy the 4090 (384-bit). It's making the halo card seem attractive only by making every other card unattractive. The 4090 is a Titan card with a different model name and the same obscene price.
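To put rough numbers on the bus-width point: memory bandwidth is roughly bus width times the per-pin data rate, divided by eight. A quick back-of-the-envelope sketch in Python; the bus widths and data rates below are the stock GDDR6/GDDR6X specs as I recall them, so treat the results as ballpark figures rather than benchmarks.

```python
# Rough GDDR bandwidth estimate: bus width (bits) * data rate (Gbps per pin) / 8 = GB/s.
# Bus widths and per-pin data rates are the stock specs as I recall them (approximate).
cards = {
    "RTX 4060 (128-bit, 17 Gbps GDDR6)":    (128, 17.0),
    "RTX 4070 (192-bit, 21 Gbps GDDR6X)":   (192, 21.0),
    "RTX 4080 (256-bit, 22.4 Gbps GDDR6X)": (256, 22.4),
    "RTX 4090 (384-bit, 21 Gbps GDDR6X)":   (384, 21.0),
}

for name, (bus_bits, gbps_per_pin) in cards.items():
    bandwidth_gb_s = bus_bits * gbps_per_pin / 8
    print(f"{name}: ~{bandwidth_gb_s:.0f} GB/s")
```

That works out to roughly 272 GB/s on the 4060 versus about 1000 GB/s on the 4090, which is the gap people are reacting to when they call the lower cards handicapped.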
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
Which is why I rejected the entire series, and why I'm worried they'll do it again with the 50 series. The 4060 / 4070 were deliberately handicapped on memory bus width and the 4080 was overpriced. You didn't want to touch the 4060/4070 because of their 128-bit and 192-bit memory buses, and if you were going to buy the 4080 (256-bit) you might as well buy the 4090 (384-bit). It's making the halo card seem attractive only by making every other card unattractive. The 4090 is a Titan card with a different model name and the same obscene price.
Of course they will do it again with the 50 series. Isn't the gaming segment just an afterthought for Nvidia right now? From a financial standpoint, even a 4090 doesn't make sense for them.
 
Which is why I rejected the entire series, and why I'm worried they'll do it again with the 50 series. The 4060 / 4070 were deliberately handicapped on memory bus width and the 4080 was overpriced. You didn't want to touch the 4060/4070 because of their 128-bit and 192-bit memory buses, and if you were going to buy the 4080 (256-bit) you might as well buy the 4090 (384-bit). It's making the halo card seem attractive only by making every other card unattractive. The 4090 is a Titan card with a different model name and the same obscene price.
Also, if I recall correctly, the original "4080 12GB" was going to be what is now the 4070 Ti 12GB, and they were going to price it at $900.
 
  • Like
Reactions: Thunder64
Of course they will do it again with the 50 series. Isn't the gaming segment just an afterthought for Nvidia right now? From a financial standpoint, even a 4090 doesn't make sense for them.
I think they keep the gaming segment around to dump their lower-bin silicon, and as a fallback for when everyone realizes the "AI" branding on every product that has a circuit trace is useless. Just to be clear, even actual AI and its real implementations are a very dubious value proposition considering the trillions being spent on AI around the world.
 

pug_s

Distinguished
Mar 26, 2003
482
76
18,940
Both AMD and Intel have a problem with their respective launches: cheap-ish 7800X3Ds and 5700X3D/5800X3D.

AMD needs to think really hard about the price; otherwise it'll be another clownish-value launch like Zen 5 was.

3D is, first and foremost, a gaming CPU, and value is king there. So I'd expect AMD to be able to read the room after all the Zen 5 criticism. Unlikely, given recent history.

At least, I hope they now know how to test :D

Regards.
AMD is being greedy at this point, and their profit margin is even higher than Apple's. TechTechPotato's channel says that it costs about $69 to produce a 7950X. The 9000 CPUs will definitely cost more and are geared toward datacenters, where profit margins are even higher than on consumer CPUs, so the consumer CPUs will get last dibs.

Personally, I'm not into the latest-and-greatest hype train. I just upgraded to a Ryzen 5800X and I am happy with it. I wouldn't be surprised if it costs about $40 or less to make that CPU.
 
I'd rather buy one of these than a PS5 Pro. The only issue is the timing of the RTX 50-series GPUs, as those are expected to drop early next year and could easily slip to the middle of next year. I'm looking to build this time instead of going the pre-built route. I seriously doubt HP, Lenovo, or Alienware will offer the X3D-series chips. Perhaps Corsair, but that will probably turn into a $4,000+ machine.
I wouldn't buy from HP or Alienware. They make good-looking cases, and the cases are designed to hold the video card securely for shipping, but they literally don't seem to care about proper cooling or airflow. When Gamers Nexus took apart an Alienware rig, it blew my mind that the CPU cooler wasn't even rated to handle the TDP of the processor they installed, which means it'll throttle constantly. That's more than an oversight; they just don't care.

Lenovo towers look decent though.

If I couldn't do it myself, I'd rather have Micro Center build one for me. All the components would be swappable with off-the-shelf parts.
 
AMD is being greedy at this point, and their profit margin is even higher than Apple's. TechTechPotato's channel says that it costs about $69 to produce a 7950X. The 9000 CPUs will definitely cost more and are geared toward datacenters, where profit margins are even higher than on consumer CPUs, so the consumer CPUs will get last dibs.

Personally, I'm not into the latest-and-greatest hype train. I just upgraded to a Ryzen 5800X and I am happy with it. I wouldn't be surprised if it costs about $40 or less to make that CPU.
Well, there are two very important things you may need to think a bit harder about:

1.- Greed.
If a company is not "greedy," then it'd be a charity or non-profit organization. So, by that token, don't kid yourself: AMD, Intel, and nVidia are all the same animal. The only difference is the level of success they have. And this is true of any other company out there. So, think a bit more about the "greedy" moniker.
-

2.- BOM Costs
His napkin math may well be accurate, but the cost of the materials is just one part of the whole chain that makes the full product. Think about the supply chain from build/assembly to each distributor/seller, then the marketing (leaving its quality aside :D), and even additional costs not factored into his initial analysis, such as R&D (a rough numbers sketch follows below).
-
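To make that concrete, here's a toy margin sketch. The only figure taken from the video mentioned above is the $69 die cost; every other line is a hypothetical allocation made up purely to show how an "it only costs $69 to make" margin shrinks once the rest of the chain is counted.

```python
# Toy margin sketch. The $69 die figure is the napkin number cited above;
# everything else is a made-up, hypothetical allocation, not AMD's real cost structure.
msrp            = 699.0  # 7950X launch MSRP in USD
die_cost        = 69.0   # cited napkin figure for the silicon itself
package_test    = 30.0   # hypothetical: substrate, IHS, assembly, test
rnd_per_unit    = 120.0  # hypothetical: R&D amortized per unit sold
channel_and_mkt = 140.0  # hypothetical: distribution, retailer cut, marketing

margin_on_die_only  = (msrp - die_cost) / msrp
margin_fully_loaded = (msrp - (die_cost + package_test + rnd_per_unit + channel_and_mkt)) / msrp

print(f"Margin if only the die mattered: {margin_on_die_only:.0%}")   # ~90%
print(f"Margin with the other costs:     {margin_fully_loaded:.0%}")  # ~49%
```

The point isn't the specific percentages (they're invented), just that BOM cost and profit margin are very different numbers.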

The value proposition of each CPU is very personal. AMD says "I think this is how much you value this," and, in the case of Zen 5's launch, we all said "well, Zen 4 is not really far behind, and street prices are really low with no supply problems, so no thanks." And voilà: price drops. Now the value proposition is better and maybe will sway more people into buying Zen 5. The new 3D chips will be judged under a very similar umbrella, hence my comment.

Overall, I don't think you're wrong, but I think having some additional context will put your thoughts into a better place and maybe validate them as well. Well, that's what I think, at least; as I said, I don't think you're wrong.

Regards.
 

TeamRed2024

Upstanding
Aug 12, 2024
199
131
260
The price cuts on the other Ryzen 9000 processors are a nice bonus. I've been wanting the Ryzen 9 9950X for a while, so this might be the perfect time to upgrade. Has anyone here already tried the 9950X? How does it compare to Intel's 16-core chips?

Mine hasn't degraded.


I wouldn't buy from HP or Alienware. They make good-looking cases, and the cases are designed to hold the video card securely for shipping, but they literally don't seem to care about proper cooling or airflow. When Gamers Nexus took apart an Alienware rig, it blew my mind that the CPU cooler wasn't even rated to handle the TDP of the processor they installed, which means it'll throttle constantly. That's more than an oversight; they just don't care.

Hasn't been an issue. I've even got a laptop cooler in my Amazon saved for later... just in case. Haven't had a reason to purchase it.

Well, there are two very important things you may need to think a bit harder about:

1.- Greed.
If a company is not "greedy," then it'd be a charity or non-profit organization. So, by that token, don't kid yourself: AMD, Intel, and nVidia are all the same animal. The only difference is the level of success they have. And this is true of any other company out there. So, think a bit more about the "greedy" moniker.

Haha... yep. I have no idea why people use the "corporate greed" comment. ALL for-profit companies are out to make money.

It's the same reason I go to work every day... and work a ridiculous amount of OT... because I'm a greedy *#&#& and the money is good.
 
Also, if I recall correctly, the original "4080 12GB" was going to be what is now the 4070 Ti 12GB, and they were going to price it at $900.

Yes, the 4070 Ti, with its handicapped memory bus, was originally going to be marketed as the "4080 12GB," with the 4090 being the only "reasonable" choice. The sheer amount of negativity and outrage from the customer base made them backpedal and relabel it. The entire 40-series launch is just trash.

The real story is that nVidia saw how much people were willing to pay for a GPU during the mining craze, when people were forced to pay 50~100% above MSRP. nVidia saw that and said "well, why can't that be mine," and thus the entire 40-series pricing structure was created. There are only "halo" products and "things that make you want to buy the halo" products. AMD seriously missed an opportunity to target that market segment. They didn't need to beat the 4090, only beat the 4060/4070 and 4080 in price-to-performance. Instead they tried to play nVidia's game by upcharging their own products.
 

Thunder64

Distinguished
Mar 8, 2016
201
286
18,960
AMD is being greedy at this point, and their profit margin is even higher than Apple's. TechTechPotato's channel says that it costs about $69 to produce a 7950X. The 9000 CPUs will definitely cost more and are geared toward datacenters, where profit margins are even higher than on consumer CPUs, so the consumer CPUs will get last dibs.

Personally, I'm not into the latest-and-greatest hype train. I just upgraded to a Ryzen 5800X and I am happy with it. I wouldn't be surprised if it costs about $40 or less to make that CPU.

The production costs are nothing compared to the massive R&D costs, along with the others mentioned above. Sure, AMD makes money on their CPUs, especially on the higher end; they are a business, after all. If you think their profit margins are unreasonable (despite no one outside AMD knowing them), you have the choice not to buy from them.
 
  • Like
Reactions: TeamRed2024
May 28, 2024
143
82
160
The 9950X is a beast. I don't see any scenario where I would regret not waiting on the X3D chips. I read all the reviews on the gaming king 7800X3D and noted how it was horrible at productivity tasks.

That was my thinking in getting the 9950X. Video editing/encoding tasks would perform best with this chip... and since I game in 4K, which to my knowledge is more GPU-dependent, I don't see much use for the X3D. The 9950X can game just as well as any other modern processor. A few more or fewer fps doesn't much matter to me; I'd rather get productivity stuff done faster. I'd notice that more than a few fps of difference.
That's my thing too. While I do like seeing tech like V-Cache and how it pushes things forward (even though it is older now, lol), my needs are workflow first and gaming second, so I'm planning an AM5 build with an X SKU over X3D. But I am getting intrigued by AMD increasing the clocks on the X3D parts! Maybe in another gen or two the premium for X3D will be worth it for me. But for now, the X SKUs are where it's at for me. How's that 9950X been for you? Did the AGESA updates and all that improve anything for you? I might actually go for the high-end CPU this time.
 
  • Like
Reactions: TeamRed2024
May 28, 2024
143
82
160
Productivity is exactly why I wouldn't consider X3D chips as they stand now. They're the kings of gaming, but otherwise they are mediocre and can be equaled in efficiency and/or surpassed in performance by their non-X3D counterparts. However, if the 9000-series X3D parts are able to get close to the same clocks as the non-X3D parts, I'd seriously consider one, as I do care about high minimum frame rates in every action title I play.
That's what I'm excited for. The clocks being higher would definitely make me think harder about X3D over X SKUs for my new build. Right now, I need and value the higher clocks on X vs the cache on X3D. But if X3D can bring those clocks up, I'm here for it! Not long to wait for reviews now.
 
  • Like
Reactions: thestryker

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
That's what I'm excited for. The clocks being higher would definitely make me think harder about X3D over X SKUs for my new build. Right now, I need and value the higher clocks on X vs the cache on X3D. But if X3D can bring those clocks up, I'm here for it! Not long to wait for reviews now.
X3D is useless unless you are gaming at 720p low. Just get a higher-tier non-3D chip and enjoy the extra cores. That's why I'm still sticking with my 12900K; even the 4090 is slow and a huge bottleneck. In the super-lightweight games where the 4090 isn't a bottleneck (Valorant, CS, and the like) I'm already hitting over 500 fps. It's just tail-chasing pointlessness.

And it's not like the non-3D Zen 5 can't play games, lol; I'm sure they are great at that too.
 
May 28, 2024
143
82
160
X3D is useless unless you are gaming at 720p low. Just get a higher-tier non-3D chip and enjoy the extra cores. That's why I'm still sticking with my 12900K; even the 4090 is slow and a huge bottleneck. In the super-lightweight games where the 4090 isn't a bottleneck (Valorant, CS, and the like) I'm already hitting over 500 fps. It's just tail-chasing pointlessness.

And it's not like the non-3D Zen 5 can't play games, lol; I'm sure they are great at that too.
Yeah, I know... I just tend to overdo tech stuff! Lol... I really don't need V-Cache. I can save with an X SKU and put the savings toward a better GPU. I'm also waiting to see what Intel's Battlemage brings. I'm cautiously hopeful for a mid-to-high-end tier card (think 4070 Ti performance). I'm not excited at all about the 5070 coming with only 12GB of VRAM. I'm hoping Intel has better options in that tier of card.
 

YSCCC

Commendable
Dec 10, 2022
578
464
1,260
X3D is useless unless you are gaming at 720p low. Just get a higher-tier non-3D chip and enjoy the extra cores. That's why I'm still sticking with my 12900K; even the 4090 is slow and a huge bottleneck. In the super-lightweight games where the 4090 isn't a bottleneck (Valorant, CS, and the like) I'm already hitting over 500 fps. It's just tail-chasing pointlessness.

And it's not like the non-3D Zen 5 can't play games, lol; I'm sure they are great at that too.
There are games that are hindered by the main thread and the cache, just maybe not the ones you play. By the same reasoning, the extra cores are pretty useless too, as 90% of software doesn't benefit from extra cores outside of benchmarks. It all depends on what software one predominantly uses.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
There are games that are hindered by the main thread and the cache, just maybe not the ones you play. By the same reasoning, the extra cores are pretty useless too, as 90% of software doesn't benefit from extra cores outside of benchmarks. It all depends on what software one predominantly uses.
Like what games? Can you give me one example?
 

YSCCC

Commendable
Dec 10, 2022
578
464
1,260
Like what games? Can you give me one example?
As I've mentioned multiple times in previous posts, MSFS 2020, for example, benefits a lot from X3D chips. It's simply because it has a lot of environmental data to be processed by the CPU, like wind direction, etc., which is NOT handled by the GPU, and that's only the base game. With the more serious add-ons that do deep systems simulation, it will also use up a lot of the CPU for the systems calculations, e.g. how the autopilot responds to a wind gust. Games like this, with high variability that has to be computed in real time, will benefit more from an X3D chip than from extra cores, especially when there is a massive inrush of real-time data such as weather.

Early review results:
https://forums.flightsimulator.com/t/7800x3d-performance-reviews-for-msfs/585520/1

And one of the best things the game offers is a built-in developer mode, which will show things like the FPS and what is limiting the FPS: 99% of the time it is main-thread limited, not limited by thread count, not by the GPU, but by the main thread.
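A toy model of why that matters more than core count, with made-up numbers (not measured MSFS data): if each frame has a chunk of serial work stuck on the main thread, adding cores only shrinks the parallel part, so the FPS ceiling barely moves; cutting the main-thread time (which is what the extra cache helps with) is what lifts it.

```python
# Toy Amdahl-style frame-time model with made-up numbers, not measured MSFS data.
def fps(main_thread_ms: float, parallel_ms: float, cores: int) -> float:
    frame_ms = main_thread_ms + parallel_ms / cores  # serial part + parallelizable part
    return 1000.0 / frame_ms

main_thread_ms = 8.0   # hypothetical serial sim/draw-submission work per frame
parallel_ms    = 12.0  # hypothetical work that scales across cores

for cores in (6, 8, 12, 16):
    print(f"{cores:>2} cores: ~{fps(main_thread_ms, parallel_ms, cores):.0f} FPS")

# Going from 8 to 16 cores only moves ~105 FPS to ~114 FPS, while shaving 2 ms off the
# main-thread time at 8 cores (8.0 -> 6.0 ms) jumps it to ~133 FPS.
```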
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
As I've mentioned multiple times in previous posts, MSFS 2020, for example, benefits a lot from X3D chips. It's simply because it has a lot of environmental data to be processed by the CPU, like wind direction, etc., which is NOT handled by the GPU, and that's only the base game. With the more serious add-ons that do deep systems simulation, it will also use up a lot of the CPU for the systems calculations, e.g. how the autopilot responds to a wind gust. Games like this, with high variability that has to be computed in real time, will benefit more from an X3D chip than from extra cores, especially when there is a massive inrush of real-time data such as weather.

Early review results:
https://forums.flightsimulator.com/t/7800x3d-performance-reviews-for-msfs/585520/1

And one of the best things the game offers is a built-in developer mode, which will show things like the FPS and what is limiting the FPS: 99% of the time it is main-thread limited, not limited by thread count, not by the GPU, but by the main thread.
Even the slowest CPU on that Tom's Hardware graph is hitting 120. Do you think he can't play MSFS decently with Zen 5 and needs a 3D chip?
 

YSCCC

Commendable
Dec 10, 2022
578
464
1,260
Even the slowest CPU on that graph is hitting 120. Do you think he can't play MSFS decently with Zen 5 and needs a 3D chip?
What? Where is the derailment here, again? I am replying to the claim that, except when playing at 720p low, an X3D CPU doesn't have any real benefit, and the answer is: there are games that do get CPU-bottlenecked even at 2K+ resolutions, period.

And remember that it DEPENDS on what one mostly uses the CPU for. If you do production rendering, etc., along with games, things like the normal X versions or ARL will be better than a 7800X3D/9800X3D, but if one only plays games, the X3D will be the better option, as extra cache will always be better than some unused, slower cores. The opposite is also true: if one doesn't game much and does production rendering, etc., get the most threads without giving up too much single-core IPC.

And this time round it seems the 9800X3D has a massive clock-speed bump, so if they are somehow able to retain most of the clock speed in the X3D part, it could really be the best all-arounder.

It's stupid to say that 8 cores with V-Cache is the best option for all use cases, and it is equally stupid to claim the opposite, that it is a mere gimmick and one would always be better off getting, say, a 12900K or 7950X instead.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
What? Where is the derailment here, again? I am replying to the claim that, except when playing at 720p low, an X3D CPU doesn't have any real benefit, and the answer is: there are games that do get CPU-bottlenecked even at 2K+ resolutions, period.

And remember that it DEPENDS on what one mostly uses the CPU for. If you do production rendering, etc., along with games, things like the normal X versions or ARL will be better than a 7800X3D/9800X3D, but if one only plays games, the X3D will be the better option, as extra cache will always be better than some unused, slower cores. The opposite is also true: if one doesn't game much and does production rendering, etc., get the most threads without giving up too much single-core IPC.

And this time round it seems the 9800X3D has a massive clock-speed bump, so if they are somehow able to retain most of the clock speed in the X3D part, it could really be the best all-arounder.

It's stupid to say that 8 cores with V-Cache is the best option for all use cases, and it is equally stupid to claim the opposite, that it is a mere gimmick and one would always be better off getting, say, a 12900K or 7950X instead.
Bud, you posted a single game. There are other games where the 3D is a lot slower. On average, a 7950X is within spitting distance of a 7800X3D, even at 720p, even with a 4090.
 

YSCCC

Commendable
Dec 10, 2022
578
464
1,260
Bud, you posted a single game. There are other games where the 3D is a lot slower. On average, a 7950X is within spitting distance of a 7800X3D, even at 720p, even with a 4090.
That is why I said it depends on what you do, and a platform, except for the upgrade-every-gen guys, is worth considering for longevity as a whole. I posted only one game because that's the main game I myself play and have tried to optimize the system for, so I can definitively say it is CPU-bound (which is also what you asked for, if you've lost your memory; you asked me to name A game).

As that is an MS game, and a recent-ish one with constant updates, it implies a whole set of titles can have similar demands whenever they need to simulate anything random or constantly updating in the background on the CPU.

And it's just as I said in a prior post: it depends on what one is going to use the rig for. If it is a gaming-only rig, 8 cores/16 threads isn't limiting, and the X3D, with no latency issues, will be a great choice, while it will be a moot choice for a production or server rig where the extra cache is never used. If you need both, there's always the option to get the 7900X3D or 7950X3D. If a playable frame rate is all one asks for? A Zen 1 chip or a 7700K with some decent GPU will do the trick; don't get anything new.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
502
2,060
A Zen 1 or a 7700K will definitely not do the trick.

You can't buy a CPU based on one game unless that's the only game you are going to play. That's why you look at averages across multiple games. On average, the 3D ain't worth it unless you plan to buy a 5090 and game at 1080p.