News AMD launches Ryzen 9 9950X3D and 9900X3D, claims 20% faster gaming performance than Intel’s flagship Arrow Lake processors

"middle-class developed countries"
I said nothing like that in my comment. The "middle class" of developed countries has a completely different meaning, and you understood that perfectly well.

Yes, primates (and other social animals) will waste precious time on meaningless games - their main interest - because learning and studying new knowledge, instead of doing something useful for civilization and for their own global competitiveness, is an extremely energy-consuming and painful process for such people. But pay attention: most of the interest on such sites and forums is concentrated around games (empty consumption, not development of one's own intellect), not around useful work on hardware and progress in that regard. It is usually I who tries to emphasize work tasks and the working hardware for such tasks - the most comfortable and productive in every respect for people busy with work - rather than the senseless nonsense of games, which have little to do with useful rest from intensive work.

If there are no mass, accessible (this is the key) powerful improvements in hardware and, accordingly, in professional software, then creative capability progresses little at the individual level. This is where the individual's "power-to-weight ratio" lies. It is obvious that the bulk of learning and knowledge acquisition happens at a young age, mostly before 30, and it is precisely the availability of powerful hardware to the average person, to inquisitive minds, that is critical at that age.

After several decades of rapid progress and price drops for powerful hardware that outpaced inflation, there came first an obvious plateau in the growth of hardware capabilities, and then outright regression - price increases outpacing inflation - because we are getting ever closer to a silicon dead end and to a halt in the growth of the individual's computing power.

Progress in hardware until roughly 2010-11 was also helped by the fact that the masses had to buy PCs by the hundreds of millions (by now it would be billions of units, which would have further reduced the average cost of developing new technologies) and thereby sponsored development in this area. Falling, low hardware prices gave young generations of future professionals (schoolchildren, students) access to cheap but very powerful professional-grade hardware at the formative stage of their lives.

That was interrupted once the majority of the population switched to smartphones - devices completely useless for creating or developing anything, since most people were and remain plain consumers, not creators of something new and useful for civilization. Smartphones suited them perfectly, but the shift practically destroyed progress in x86 and in PCs/laptops in general, since the public's main interest moved to another IT sector, one useless for efficient work and entirely geared toward content consumption.

That is why everything has reached such a deplorable state by 2025. Entire new layers of tasks are simply impossible to solve when the year-over-year growth of performance and hardware capability keeps falling.

Any genuinely heavy task will easily absorb any amount of performance - any number of cores, any bandwidth, any storage and memory capacity - because what we have has long been insufficient, by orders of magnitude, for the new tasks on the agenda.

And this whole scam with "AI" will soon burst like a soap bubble (just as the autopilot scam burst: real self-driving requires real AI, which is impossible at such a consumer level in a car for at least the next 50 years), because it is obvious that, at such energy costs, none of this has any prospect of mass use with any useful return for society. New qualitative changes require qualitatively new fundamental technologies. These do not yet exist; we still use the Von Neumann architecture, which is almost a hundred years old, and we still sit on silicon with minor variations.
 
The 9800x3d showed near parity with the 9700x in non-gaming workloads. Some reviewers put this down to the construction of the package, with the cpu die now sitting against the heat spreader, whereas the 7800x3d and 5800x3d had the cache on top as a nice warm blanket.

If the x3d ccd within the 9950x3d package can maintain clocks similarly to the 9800x3d, then the clock-rate deficit seen on the 7x00 / 7950x3d will be minimised. The 9950x3d should then perform similarly to the 9950x in non-gaming applications.

Who is it for? Someone with work to do who doesn't need a threadripper/epyc and wants to have some gaming flex in the quiet times?
 
We may not be far apart in opinions, perhaps more in analysis. But this is no longer technical, more of a beer talk.

For me, IoT was the first big IT bubble I took a closer look at. But the need to somehow balance the overshoot of productivity enabled by human ingenuity has always produced interesting results.

IMHO pyramids were the first big scam, so this has been with us for a while.
 
They already don't. If someone is making their CPU choice based on gaming, they're a fool falling for marketing and nothing more. There's no real-world difference. Benchmarks have clearly shown that at 2K to 4K+ gaming, even a half-decade-old i5 shows only single-digit differences in framerates. Who's gaming at 1080p, where it actually makes a difference? And only with high-end, high-refresh monitors and GPUs - there's only a tiny niche of competitive gaming where that's a thing, so it doesn't apply to the vast majority. If someone cares about their gaming performance, they focus on the GPU, simple as that, plus a fast drive for loading.
This completely depends on the game. Some games, like simulations and strategy titles, demand a lot of CPU.
Some games are also poorly optimized on the CPU side, or their graphics require the CPU to assemble a lot of data.
 
The only reason I would want the cache on both CCDs is that it would eliminate the risk of a process being executed on the wrong CCD. Since having the cache under the processor lets the cores run at the same speed as the non-cached variant, having the cache on both CCDs would make every core fully interchangeable, and there would be no need to schedule threads onto a specific CCD.

Anyways, I am hoping they figure out how to add the cache across multiple CCD chips and make it work in the future. Either that, or make CCDs with more than 8 cores.
This has been a pretty false premise, based on statements made by people with poor memories. All dual-CCD CPUs have had an inter-CCD latency issue when game threads switch CCDs. It was discussed in 5950X reviews and led to the 5800X occasionally having more consistent frametimes in games. It's Windows that wants to load-balance threads across the highest-clocking cores and simply treats the chip as if it were a monolithic CPU. Since it isn't, the next-best core is sometimes on the other CCD. However, the opposite CCD isn't populated with the relevant data at the instant a thread gets bumped to it.

That doesn't go away by putting v-cache on both. AMD ultimately started using the CCD driver on non-X3D Ryzen 9 CPUs with Zen 5, because it has always been needed either way. Since MS isn't going to code NUMA-like behavior into the client kernel for a relative handful of gamers with a dense AMD CPU, software is the solution. You don't even have to rely on the Xbox Game Bar, as you can hardcode a desired affinity into the shortcut of any program.
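To make that last point concrete: the shortcut trick uses cmd's start /affinity flag with a hex core mask, and the same pinning can also be done from a script. Here is a minimal Python sketch using psutil; note that the process name "game.exe" is a placeholder and the assumption that the X3D CCD maps to logical processors 0-15 must be verified on your own system.

    # Sketch: pin a running game to one CCD's logical processors via psutil.
    # ASSUMPTIONS: "game.exe" is a hypothetical process name, and the X3D CCD
    # maps to logical CPUs 0-15 here - verify both before relying on this.
    import psutil

    GAME_EXE = "game.exe"        # hypothetical process name
    X3D_CPUS = list(range(16))   # assumed: X3D CCD = logical processors 0..15

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == GAME_EXE:
            proc.cpu_affinity(X3D_CPUS)  # set affinity (Windows/Linux)
            print(f"pinned PID {proc.pid} to CPUs {X3D_CPUS}")

Unlike the shortcut approach, this can retarget a process after launch, which helps with games started through a launcher.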
 
The memory gap isn't x86-specific, either, as twelve-channel DDR5-6400 RAM on EPYCs proves: you can buy a ton of bandwidth on x86, but very few seem ready to pay the price. It isn't easy to push bandwidth; "normal" DDR, HBM, GDDR, and on-die-carrier LPDDR stacks each owe their existence to a different compromise between capacity, bandwidth, latency, and price fitting a given use case better, and none will fit all.
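For a rough sense of scale (a back-of-the-envelope sketch, not a vendor-published figure), each DDR5 channel is 64 bits wide, i.e. 8 bytes per transfer:

    # Theoretical peak for the 12-channel DDR5-6400 EPYC config mentioned above
    channels = 12         # memory channels
    transfers = 6400e6    # DDR5-6400: 6.4 GT/s per channel
    width = 8             # 64-bit channel = 8 bytes per transfer
    print(channels * transfers * width / 1e9)  # 614.4 GB/s theoretical peak

Real sustained numbers land well below that peak, but it shows how far a fully populated server platform outruns dual-channel desktop memory.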
Just wanted to drop by and say this is wonderfully correct and made me stop writing a reply halfway.
 