News Intel ships Nova Lake CPUs to partners for testing — samples intended for validation and research

Why would they move the memory controller off tile? They're still planning no V-Cache equivalent. Less cache than AMD, with continued slower access. I understand cache costs money, but if you keep making vastly inferior products, you will be out of business because no one will buy your products. CRAZY.
 
Why would they move the memory controller off tile? They're still planning no V-Cache equivalent. Less cache than AMD, with continued slower access. I understand cache costs money, but if you keep making vastly inferior products, you will be out of business because no one will buy your products. CRAZY.
AMD's multi-chiplet CPUs have the memory controller off the CPU chiplet, and Intel's tile architecture should be lower latency thanks to the base tile. On desktop, Intel's solution is superior (evidently held back by other architectural details), but it's inferior to the monolithic chip AMD uses for mobile. So why do it on mobile? For the same reasons as before: better yields, lower cost, and earlier time to market, which helps make up for Intel Foundry being behind.
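
To put rough numbers on the latency point, here's a minimal sketch; every figure in it (the DRAM latency, the substrate hop, the base-tile hop) is an illustrative assumption, not a measured Intel or AMD number:

```python
# Hypothetical sketch of how an off-die memory controller adds to DRAM latency.
# All numbers are illustrative assumptions, not measured Intel/AMD figures.

DRAM_CORE_LATENCY_NS = 50.0   # assumed DRAM latency with an on-die controller
SUBSTRATE_HOP_NS = 15.0       # assumed organic-substrate die-to-die hop (chiplet style)
BASE_TILE_HOP_NS = 5.0        # assumed hop through a silicon base tile (stacked style)

def effective_latency_ns(hop_ns: float) -> float:
    """Total load latency: the DRAM access plus the die-to-die crossing."""
    return DRAM_CORE_LATENCY_NS + hop_ns

for name, hop in [("monolithic", 0.0),
                  ("substrate chiplet", SUBSTRATE_HOP_NS),
                  ("base-tile stack", BASE_TILE_HOP_NS)]:
    print(f"{name:18s}: {effective_latency_ns(hop):5.1f} ns")
```

Under these made-up numbers the base-tile hop costs a third of what the substrate hop does, which is the whole argument for why a tile design can afford an off-die controller more easily than an organic-substrate chiplet can.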
 
Why would they move the memory controller off tile? They're still planning no V-Cache equivalent. Less cache than AMD, with continued slower access. I understand cache costs money, but if you keep making vastly inferior products, you will be out of business because no one will buy your products. CRAZY.
The reason AMD doesn't put V-Cache on all CPUs isn't due to cost. Large caches can increase latency due to cache lookups, depending on the application. V-Cache works great for specific use cases where there is a high probability the content is in cache. However, for applications where cache misses are more prevalent, it can be a hindrance if the miss occurs frequently. There are some other factors as well, such as cooling overhead, but cost isn't really the driver since we are only talking 10s of dollars.
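
A quick way to see the tradeoff is the classic average-memory-access-time formula, AMAT = hit time + miss rate × miss penalty. The sketch below uses invented latencies and hit rates purely to illustrate the point, not real chip figures:

```python
# Hypothetical AMAT sketch: a bigger, slower last-level cache only pays off
# when the extra capacity raises the hit rate enough. All numbers invented.

def amat_ns(hit_ns: float, hit_rate: float, miss_penalty_ns: float) -> float:
    """AMAT = hit time + miss rate * miss penalty."""
    return hit_ns + (1.0 - hit_rate) * miss_penalty_ns

baseline = amat_ns(hit_ns=10.0, hit_rate=0.80, miss_penalty_ns=70.0)  # 24.0 ns
gamey    = amat_ns(hit_ns=12.0, hit_rate=0.95, miss_penalty_ns=70.0)  # 15.5 ns (hit rate jumps)
streamy  = amat_ns(hit_ns=12.0, hit_rate=0.82, miss_penalty_ns=70.0)  # 24.6 ns (hit rate barely moves)

print(f"baseline L3: {baseline:.1f} ns")
print(f"big cache, cache-friendly workload: {gamey:.1f} ns")
print(f"big cache, streaming workload: {streamy:.1f} ns")
```

The streaming case ends up slower than the baseline despite the bigger cache, which is exactly the hindrance scenario described above.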

Why Intel doesn't have their own "gaming" equivalent with a large cache is a question only Intel can answer. It does seem odd that they would continue to cede gaming wins to AMD without some kind of an answer.
 
There are some other factors as well, such as cooling overhead, but cost isn't really the driver since we are only talking 10s of dollars.
Cost is absolutely the driver as the advanced packaging costs a lot more money and is another failure point. AMD has come out and stated that the reason they don't do cache on both CCDs for the two CCD parts is the cost vs benefit.
Why Intel doesn't have their own "gaming" equivalent with a large cache is a question only Intel can answer. It does seem odd that they would continue to cede gaming wins to AMD without some kind of an answer.
It's actually super simple and straightforward: cost.

Intel's designs until MTL were monolithic, which means adding cache would have required spinning up a separate die. Even now with tiles it wouldn't necessarily make financial sense, as they would still need a separate die, though the cost wouldn't be as bad as with prior designs. CWF is using a design with cache in the base tile, which might be viable to cross over to desktop, but this is the first Intel design hinting at anything cost-effective for adding cache.

A lot of people also overestimate the retail market and X3D sales. I'd be surprised if Intel didn't sell more desktop CPUs to Dell per year than AMD sells, period. I'm absolutely certain the engineers at Intel would love to put together something faster, and marketing would love an easy win. That doesn't necessarily equate to it making financial sense to do so.
 
I'd be surprised if Intel didn't sell more desktop CPUs to Dell per year than AMD sells, period. I'm absolutely certain the engineers at Intel would love to put together something faster, and marketing would love an easy win. That doesn't necessarily equate to it making financial sense to do so.
Somebody missed every article about market share in client systems over the last 3 years. Intel is no longer even selling more server chips than AMD.
 
AMD has come out and stated that the reason they don't do cache on both CCDs for the two CCD parts is the cost vs benefit.
That is for the dual-CCD parts, not CPUs as a whole. This is conflating two different things. Putting V-Cache on two CCDs probably doesn't provide much of a boost, thus the cost-to-performance benefit isn't worth it. However, as I mentioned, the price difference is small (10s of dollars); if it made a difference for every workload, you better believe AMD would only make X3D-series chips.
 
The reason AMD doesn't put V-Cache on all CPUs isn't due to cost. Large caches can increase latency due to cache lookups, depending on the application. V-Cache works great for specific use cases where there is a high probability the content is in cache. However, for applications where cache misses are more prevalent, it can be a hindrance if the miss occurs frequently. There are some other factors as well, such as cooling overhead, but cost isn't really the driver since we are only talking 10s of dollars.

Why Intel doesn't have their own "gaming" equivalent with a large cache is a question only Intel can answer. It does seem odd that they would continue to cede gaming wins to AMD without some kind of an answer.
Cost isn't a significant difference, but it does take more time to fully package V-Cache CPUs as well, something AMD acknowledged with their supply-versus-demand constraints for the 9800X3D.

I don't know how Intel could just ignore the success of the 9800X3D at this point and more generally this technology; with AMD having X3D chips that start not much over $200, even with not all games being able to take advantage of them, X3D is just a superior gaming CPU technology today when everything is considered, i.e. also bringing in the perf/watt conversation. I can understand Intel waiting a generation for their Local Cache to come to desktop, but is that really something that can't be confirmed today? IMO, they've stripped the business back TOO much, being overly hyperfocused on safe CPU bets.
 
I don't know how Intel could just ignore the success of the 9800X3D at this point and more generally this technology
AMD's mainline (non-cache) CPUs have taken a huge hit in sales because everybody wants the cache ones, which are more expensive to make and have lower margins.
Why would Intel want that?!

Also, the extreme PC buyer is a super small percentage. The average Joe sees every CPU, including the entry-level ones, hitting 100+ FPS in most games, so they don't care which has better FPS at over 9000; they buy whatever is cheap and readily available.
 
That is for the dual-CCD parts, not CPUs as a whole. This is conflating two different things. Putting V-Cache on two CCDs probably doesn't provide much of a boost, thus the cost-to-performance benefit isn't worth it.
If the cost was minuscule they'd do it to simplify scheduling and shut up all the loud whiners every time they release one without it.
However, as I mentioned, the price difference is small (10s of dollars); if it made a difference for every workload, you better believe AMD would only make X3D-series chips.
"10s of dollars" can mean anything care to provide a source for this assumption? The cache die silicon cost is low, but that doesn't take into account the packaging costs and additional failure rates. Last time I checked the armchair experts were guessing around $50 total cost add during Zen 4 which is not an insignificant material cost increase.
 
I'm not sure why there are non-X3D parts out there. Unless we're going to see less performance out of the 9950X3D than the stock 9950X, what is the point of non-X3D parts?

AMD is basically the Nvidia of CPUs now; they could raise the price of the X3Ds and make them the only CPUs they sell.

Intel has been stupid in not going X3D, offering very, very small generational upgrades, and their socket life has been a joke at best.

This is why Nvidia has basically (in my opinion) put AMD in their place.

AMD was starting to become a thorn in Nvidia's side: the 6900 XT was not far off the 3090 in raster, so everyone believed the next gen would see AMD make a 4090-beating competitor. But no, the 4090 crushed AMD and sent them back to the mid-tier and low-tier market.

Point is, Nvidia didn't sit on its hands and wait for the competition to crush them like Intel has done.

Intel could have and should have made the smarter choice by going X3D-style, making better, more efficient CPUs, and offering a longer socket lifespan.

Ask any gamer now what CPU they want (whether or not they can afford it): it's an X3D. And I would bet most CPUs sold are not for production; most are sold for games.

For the ones who want production, why not buy the higher-core-count X3D? Best of both worlds at the same price!
 
"10s of dollars" can mean anything care to provide a source for this assumption? The cache die silicon cost is low, but that doesn't take into account the packaging costs and additional failure rates. Last time I checked the armchair experts were guessing around $50 total cost add during Zen 4 which is not an insignificant material cost increase.
The X and X3D parts MSRP at identical prices (the 5800X3D and 5800X were both $449; there is no 7800X to compare; the 7950X and 7950X3D both MSRP'd at $699). AMD has said in the past that V-Cache costs slightly more. The point of using X3D is to dominate Intel in gaming performance; thus they take a tiny hit to base margin but make up for it in volume. But it's a tiny amount of cost, hence the identical MSRP to the non-X3D chips.

At street price the X3D chips get bid up by gamers, but there is zero evidence that cost constraints are why AMD doesn't apply this to its entire line.
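
A toy version of that margin argument, using the real $699 MSRP but invented unit costs (AMD's actual costs are not public):

```python
# Toy margin arithmetic for the "tiny hit to base margin" claim.
# The $699 MSRP is real (7950X and 7950X3D); the unit costs are invented.

msrp = 699.0
cost_plain = 250.0             # assumed unit cost of the non-X3D part
cost_x3d = cost_plain + 35.0   # assumed V-Cache cost add

margin_plain = (msrp - cost_plain) / msrp
margin_x3d = (msrp - cost_x3d) / msrp
print(f"non-X3D margin: {margin_plain:.1%}, X3D margin: {margin_x3d:.1%}")
# Under these assumptions the margin drops about 5 points at the same MSRP,
# the kind of hit that extra volume could plausibly absorb.
```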
 
The X and X3D parts MSRP at identical prices (the 5800X3D and 5800X were both $449; there is no 7800X to compare; the 7950X and 7950X3D both MSRP'd at $699). AMD has said in the past that V-Cache costs slightly more. The point of using X3D is to dominate Intel in gaming performance; thus they take a tiny hit to base margin but make up for it in volume. But it's a tiny amount of cost, hence the identical MSRP to the non-X3D chips.

At street price the X3D chips get bid up by gamers, but there is zero evidence that cost constraints are why AMD doesn't apply this to its entire line.
Do you have any source for AMD claiming "slightly" higher cost?

It mostly sounds like you're making up numbers based on MSRP when that doesn't particularly mean a whole lot:

The 5800X3D launched 17 months after the 5800X (they were both $449 MSRP, but that isn't particularly relevant) and the 5800X was selling for around $320 when the X3D launched.

The 7950X3D launched 5 months after the 7950X, but the important part is that the 7950X had already dropped to around $600. AMD rapidly dropped the 7950X price due to it being set too high.

As for there being no "7800X" to compare, we can ask what's in a name. The 7700X has 8 cores with higher base and boost clocks and a lower TDP. Its MSRP was $399, but much like the 7950X the price had dropped, so it was around $350 by the 7800X3D (MSRP $449) launch.

These parts all carried around a $100 price difference throughout the time on market as prices went up and down (except for the shortage times).
 
These parts all carried around a $100 price difference throughout the time on market as prices went up and down (except for the shortage times).
The 9950X3D is only 5 months after the 9950X with the same MSRP. Also, street price and the price AMD sells to retailers/wholesalers at are two entirely different things. Newegg, Amazon, etc. can charge whatever they want based on supply and demand; one can't base anything on that. Amazon would have happily sold an RTX 3090 for $500 more than what they paid Nvidia for it during the pandemic. That money doesn't go back to the producer; it stays with the retailer. I can't find any info on what AMD sells their CPUs at per 1000 or wholesale. Wholesale often has discounts of 20+% when purchasing in bulk, which gives retailers wiggle room to adjust prices based on demand. The 9950X being priced at $589 doesn't mean AMD has lowered the price (though it's possible); it could simply mean Amazon and others are lowering the price based on demand, not on cost.
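
To illustrate the wiggle-room point with the 20% bulk-discount figure above (the other inputs are the 9950X's $649 launch MSRP and the $589 street price already mentioned):

```python
# Sketch of why a sub-MSRP street price doesn't prove AMD cut its price.
# The 20% bulk discount is the figure from the post; the margin math is illustrative.

msrp = 649.0                   # 9950X launch MSRP
wholesale = msrp * (1 - 0.20)  # retailer's assumed bulk cost: $519.20
street = 589.0                 # observed retail price

retailer_margin = (street - wholesale) / street
print(f"retailer cost ${wholesale:.2f}, street ${street:.2f}, margin {retailer_margin:.1%}")
# At a 20% wholesale discount the retailer still clears about 12% at $589,
# so the retailer can cut that far on its own without AMD moving its price.
```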

Even if it was argued that, with 5 months and better yield, the cost to AMD is reduced by $50, and that selling the 9950X3D at the same base price as the 9950X didn't affect AMD's margins, it still doesn't explain why AMD wouldn't just release an entire line of X3D processors, since the performance would warrant a $50 higher price tag. Intel did this for years without issue, because at one time Intel beat AMD by a lot in CPU performance. People are lining up to buy the RTX 5090 at $400 more for an assumed 30% performance boost. Even at the low end, the RTX 4060, which is $50 more base than the last generation, sold/sells well and is high on the Steam graphics card list because it delivers 20% to 30% more performance than the RTX 3060. I don't think people would scoff at $50 when it comes to performance.
 
The 9950X3D is only 5 months after the 9950X with the same MSRP. Also, street price and the price AMD sells to retailers/wholesalers at are two entirely different things. Newegg, Amazon, etc. can charge whatever they want based on supply and demand; one can't base anything on that. Amazon would have happily sold an RTX 3090 for $500 more than what they paid Nvidia for it during the pandemic. That money doesn't go back to the producer; it stays with the retailer. I can't find any info on what AMD sells their CPUs at per 1000 or wholesale. Wholesale often has discounts of 20+% when purchasing in bulk, which gives retailers wiggle room to adjust prices based on demand. The 9950X being priced at $589 doesn't mean AMD has lowered the price (though it's possible); it could simply mean Amazon and others are lowering the price based on demand, not on cost.

Even if it was argued that, with 5 months and better yield, the cost to AMD is reduced by $50, and that selling the 9950X3D at the same base price as the 9950X didn't affect AMD's margins, it still doesn't explain why AMD wouldn't just release an entire line of X3D processors, since the performance would warrant a $50 higher price tag. Intel did this for years without issue, because at one time Intel beat AMD by a lot in CPU performance. People are lining up to buy the RTX 5090 at $400 more for an assumed 30% performance boost. Even at the low end, the RTX 4060, which is $50 more base than the last generation, sold/sells well and is high on the Steam graphics card list because it delivers 20% to 30% more performance than the RTX 3060. I don't think people would scoff at $50 when it comes to performance.
You still haven't addressed where any of your cost evidence comes from, so I'm guessing you have zero. No reason to continue going back and forth without it. Also, there is no pricing on the 9950X3D yet, so there's that too...

edit: just to add: I sure hope you don't think all the early price cuts have nothing to do with AMD and are rather driven by retailers.
 
Somebody missed every article about market share in client systems over the last 3 years. Intel is no longer even selling more server chips than AMD.
Server chips are a different market segment and hardly relevant if you want to discuss the client market. As for the rest, lol. I think the one who didn't hear the news is you; it's just that the news isn't really new anymore. Most systems out there are still Intel, and it's been this way for many years now.

Just look at the business system store websites of Dell, HP, Lenovo, Asus, etc. Dell's offerings are 100% Intel. Over at HP, not even 30% of the systems offered are AMD; or to be more precise, of 108 offerings, 77 are Intel, 29 are AMD, and two are Qualcomm. Lenovo is also mostly Intel; Asus, again, is 100% Intel, at least going by the systems on their website. Annoyingly, their own store is crap to navigate.

But the real news to you seems to be that this is by far the biggest sales segment, followed by gaming pre-builts as a distant second, which, oh, is also Intel-dominated, though not as badly as in the past. Just looking around the office, I can tell you that every single system is Intel, no exception, and it has been that way ever since my first real job. I find it a bit cute when people here seem to think the DIY market is so large and important, when in truth it is only a fraction of a fraction of the market.

And people also seem to think that just because a CPU is called the best for X, it is more or less the only one sold. Most people don't have the money for that and will settle for a lower-end model. As someone above said, not everyone is obsessed with having the highest FPS or understands what 1% lows are; plus, you can widely read that the CPU is less important than the GPU for gaming, so many will also focus on that more, especially on a budget. And just for your information, that's not what I claim, but what the places where people look claim:
https://www.google.com/search?q=what is more important for gaming cpu or gpu&ie=utf-8&oe=utf-8&client=firefox-b-m
So why buy a high-priced CPU if everyone says one for half the money is good enough? Especially paired with a low/midrange card?

I think that some people on this website really need to look around past their own bubble. The world is a bit more complicated and nuanced than many here seem to think it is.
 
Server chips are a different market segment and hardly relevant if you want to discuss the client market. As for the rest, lol. I think the one who didn't hear the news is you; it's just that the news isn't really new anymore. Most systems out there are still Intel, and it's been this way for many years now.

Just look at the business system store websites of Dell, HP, Lenovo, Asus, etc. Dell's offerings are 100% Intel. Over at HP, not even 30% of the systems offered are AMD; or to be more precise, of 108 offerings, 77 are Intel, 29 are AMD, and two are Qualcomm. Lenovo is also mostly Intel; Asus, again, is 100% Intel, at least going by the systems on their website. Annoyingly, their own store is crap to navigate.

But the real news to you seems to be that this is by far the biggest sales segment, followed by gaming pre-builts as a distant second, which, oh, is also Intel-dominated, though not as badly as in the past. Just looking around the office, I can tell you that every single system is Intel, no exception, and it has been that way ever since my first real job. I find it a bit cute when people here seem to think the DIY market is so large and important, when in truth it is only a fraction of a fraction of the market.

And people also seem to think that just because a CPU is called the best for X, it is more or less the only one sold. Most people don't have the money for that and will settle for a lower-end model. As someone above said, not everyone is obsessed with having the highest FPS or understands what 1% lows are; plus, you can widely read that the CPU is less important than the GPU for gaming, so many will also focus on that more, especially on a budget. And just for your information, that's not what I claim, but what the places where people look claim:
https://www.google.com/search?q=what is more important for gaming cpu or gpu&ie=utf-8&oe=utf-8&client=firefox-b-m
So why buy a high-priced CPU if everyone says one for half the money is good enough? Especially paired with a low/midrange card?

I think that some people on this website really need to look around past their own bubble. The world is a bit more complicated and nuanced than many here seem to think it is.
Who cares that Intel has a ton of legacy products they have to support? They’re bleeding money while AMD is raking it in.
 
So... Intel is sticking to the mediocre "architecture" of the silly "performance" and "efficiency" cores, hahahahaha.

This is why it's no news that Intel will be bought by someone else very soon, and hopefully take another direction instead of this mediocre approach.
 
Yeah, that's not CPUs outselling; that's revenue within a business group. AMD's accelerator business is significantly ahead of Intel's, which is no surprise to anyone.
Accelerator business? I'm talking about server CPUs. That article says AMD outsold Intel in the datacenter space, and it damn sure wasn't because of MI300, which nobody buys.