Intel Will Adopt 3D Stacked Cache for CPUs, Says CEO Pat Gelsinger

Status
Not open for further replies.
That's just what I mean. Who says that doing it that way will be better?
The cores may stay cooler, but if the cache gets too hot, its throughput will go down.
Maybe it's an idea to have all connections on the side and cool both the top and bottom of a CPU with stacked silicon. Just a crazy idea.
Sure, but that would still leave the CPU running at the same speed as previous generations, and the cache performance would just be lower than it could be.
Since cache only benefits some workloads while faster cores benefit everything, cores still seem like the better option.
At least in theory; until we get to see some benches, it's all just theoretical.
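The point about extra cache only helping some workloads can be illustrated with a toy model (a sketch of my own, not from this thread; the line counts and sizes are hypothetical): a direct-mapped cache simulator showing that enlarging the cache only raises the hit rate for a workload whose working set overflowed the smaller cache in the first place.

```python
def hit_rate(accesses, num_lines, line_size=64):
    """Simulate a direct-mapped cache; return the fraction of hits."""
    cache = [None] * num_lines              # one stored tag per cache line
    hits = 0
    for addr in accesses:
        line = (addr // line_size) % num_lines
        tag = addr // (line_size * num_lines)
        if cache[line] == tag:
            hits += 1
        else:
            cache[line] = tag               # miss: fill the line
    return hits / len(accesses)

def stream(working_set_bytes, passes=4):
    """Loop repeatedly over a working set, one access per 64 B line."""
    return [a for _ in range(passes) for a in range(0, working_set_bytes, 64)]

small = stream(32 * 1024)    # fits entirely in a 512-line (32 KiB) cache
large = stream(256 * 1024)   # overflows the 32 KiB cache, fits in 256 KiB

for n_lines in (512, 4096):  # 32 KiB vs. 256 KiB of cache
    print(n_lines, hit_rate(small, n_lines), hit_rate(large, n_lines))
```

With the small working set, the 32 KiB cache already captures every re-use, so the extra capacity buys nothing; the large working set thrashes the small cache and only the 256 KiB configuration helps it. Faster cores, by contrast, speed up both streams, which is the trade-off being argued above.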
 
  • Like
Reactions: Ravestein NL

Xajel

Distinguished
Oct 22, 2006
170
10
18,685
AMD didn't have anything to do with developing the 3D stacking technology behind 3D V-Cache. It was developed by TSMC, and AMD just used it. What should AMD get credit for?

TSMC's announcement of 3D stacking from 2018:


Any mention of AMD in that? AMD announced 3D V-Cache in 2021.

Actually, AMD did contribute. I don't know the details, but AMD collaborated with TSMC to develop some version of their own ideas, and they hold patents in this regard. They had to collaborate with TSMC because the latter has packaging technologies and expertise that AMD lacks.

AMD's work with TSV stacking goes back to 2016, in their HBM collaboration with Samsung and others, but they filed TSMC/3D V-Cache patents in 2019, so they were working on it with TSMC well before they announced the first product in 2021. I've read somewhere that the TSMC-AMD collaboration predates even TSMC's announcement.

AMD having patents doesn't mean they own the tech or invented it, but they solved some issues and contributed some ideas, so you can say it's their own implementation of the tech. And they're still at it: they have announced second-gen 3D V-Cache and also hold patents on cooling the stacked cache.

Intel can have TSMC's tech, but they can't have the AMD-specific solutions without licensing them (if they actually need them). And as the article said, Intel is developing their own version as well; maybe it will differ from TSMC's solutions, or maybe it will be similar but with Intel's own touch (implementation).

What I'm saying is that the claim that AMD doesn't have anything to do with this is wrong. They do, just not everything.
 
  • Like
Reactions: Order 66

TJ Hooker

Titan
Ambassador
AMD's work with TSV stacking goes back to 2016, in their HBM collaboration with Samsung and others, but they filed TSMC/3D V-Cache patents in 2019, so they were working on it with TSMC well before they announced the first product in 2021. I've read somewhere that the TSMC-AMD collaboration predates even TSMC's announcement.

AMD having patents doesn't mean they own the tech or invented it, but they solved some issues and contributed some ideas, so you can say it's their own implementation of the tech
Can you provide links to those patents (or coverage thereof)?
 

spongiemaster

Honorable
Dec 12, 2019
2,364
1,350
13,560
AMD very much designed the chip-on-wafer approach when they were trying to mount memory close to the GPU! They worked with TSMC to make sure it could happen.
According to who? I can't find any reference to AMD designing chip on wafer. All the information points to TSMC:


"TSMC's CoW (Chip-on-Wafer) and WoW(Wafer-on-Wafer) technologies allow the stacking of both similar and dissimilar dies, greatly improving inter-chip interconnect density while reducing a product's form factor."

TSMC developed CoWoS in 2012. When did AMD develop their CoW?
 
  • Like
Reactions: TJ Hooker

spongiemaster

Honorable
Dec 12, 2019
2,364
1,350
13,560
Chip-on-wafer was designed by AMD and implemented by TSMC on the 3D V-Cache CPUs.
I'm not finding anything that corroborates this. TSMC has been using chip-on-wafer since at least 2012, when they developed CoWoS with Altera.


TSMC’s integrated CoWoS process provides semiconductor companies developing 3D ICs an end-to-end solution that includes the front-end manufacturing process as well as back-end assembly and test solutions...CoWoS is an integrated process technology that attaches device silicon chips to a wafer through a chip on wafer (CoW) bonding process.
 
  • Like
Reactions: TJ Hooker

ilukey77

Reputable
Jan 30, 2021
833
339
5,290
Now Intel just needs to show better socket life and they'll have a real competitor, if they're finally going to 3D-stack cache.

I just hope, for their sake, that they can also make their CPUs more efficient, because as of now they hold the gaming and production crown with the 13900K, but it's one hot little CPU to take the crown with!

I may be converted if they extend their socket life. I've always wanted to try high-end Intel but have just never seen the point with a two-year socket turnaround!
 

user7007

Commendable
Mar 9, 2022
45
33
1,560
Now that you mention it... given how long Intel has been banging on about Foveros, it's rather surprising that AMD managed to release at least two generations of 3D V-Cache before Intel has even managed one!

I wouldn't be surprised if AMD is on their 3rd gen V-Cache, by the time Intel has anything comparable on the market. All of that experience will hopefully serve AMD well, in terms of things like negotiating the apparent thermal issues, etc.


So, you're essentially betting that Intel knocks it out of the park on their first time at bat? If you just look at the amount of improvement between Alder Lake and Raptor Lake, you can see that even the mighty Intel doesn't get everything perfect on the first try! And Alder Lake wasn't even their first hybrid CPU - that distinction belongs to Lakefield!
I think that because we're talking about a large external cache, Intel has a good shot at the initial implementation being as good as or better than AMD's. Intel has lots of experience with different caches and layouts. We'll see, of course.
 
  • Like
Reactions: Order 66

everettfsargent

Honorable
Oct 13, 2017
130
35
10,610
So, to fully understand this so-called marketing PR blitz: how far behind the market leaders, that being TSMC/AMD/Apple (heck, anybody without Intel in their company name :unsure: ), is Intel now on 3D stacking and on tiles? Most honest people would say something like up to a full half decade.

As to the so-called yield issues, you can't really compare a CPU tile against a full die, since we don't know which parts of the previous full-die CPUs had the majority of the yield issues in the first place (unless you work for Intel directly on said yield issues). Heck, some would surmise that tiles were first adopted precisely to improve yields!
 

vertuallinsanity

Prominent
May 11, 2022
34
14
535
It isn't worth all that much in the grand scheme of things. Having good ideas is a universe away from producing a commercially viable product. AMD dumped their fabs when they couldn't afford them anymore. Then GloFo ended up quitting leading-edge node development when they couldn't get 7nm right. If GloFo hadn't let AMD out of their contracts at that point, AMD would be dead and buried. Without TSMC storming into the lead, AMD isn't where they are today. Without TSMC developing 3D stacking, AMD doesn't have 3D V-Cache. They would not have come up with that on their own, and they obviously couldn't manufacture it with GloFo.

You might want to take a look around your city, apt, house, garage, etc.

Can you and I still buy AMD products designed by AMD? Can you and I buy cars designed one place and then assembled somewhere else with parts from a host of different suppliers?

Are you familiar with the design and implementation of a modern jet or, say, the international space station?

"If it ain't all from Airbus it's crap"

Lol...Okay, bro!
 

waltc3

Honorable
Aug 4, 2019
454
252
11,060
"Challenging AMD" is one way to look at it--but "copying AMD" is the way I prefer to think about it...;) And good for Intel--glad they are finally in the black for a while! I'm very happy that AMD has given Intel a solid direction in which to go!
 

richardvday

Distinguished
Sep 23, 2017
189
33
18,740
Intel has been copying AMD for a long time. First L3 cache? AMD K6-III, anyone?
Memory controller on the CPU? AMD.
First dual-core CPU? AMD.
x86_64? AMD.
AMD also used to make higher-clocked chips back in the day:
Am386 at 40 MHz.
They made a 486DLC-40 that slotted into a 386 board (up until after Socket 7 they maintained socket compatibility).
I had an Am486 that was nominally clocked at 133 MHz but could be overclocked to 160 MHz.
AMD's problem has always been capitalization, as in not enough to compete with Intel. Looks like that's not a problem anymore.
 

bit_user

Titan
Ambassador
Intel has been copying AMD for a long time. First L3 cache? AMD K6-III, anyone?
According to this, the DEC Alpha 21164 had L3 cache in 1995: https://en.wikipedia.org/wiki/CPU_cache#Multi-level_caches

Memory controller on CPU ? Amd
Somehow, I doubt they were first to do that, either. However, a quick search didn't turn up anything earlier.

I think the two reasons Intel didn't were:
  1. Keeping it in the Northbridge made RAMBUS RDRAM support a motherboard-specific thing (i.e. rather than CPU-specific, as might've been the case if the memory controller were integrated).
  2. In multiprocessor systems, it avoided NUMA-related complexity and bottlenecks. Perhaps Intel wasn't ready to take on NUMA-issues, at that point in time, and thought their Northbridge-based solution wasn't too much of a bottleneck?

One thing I liked about the memory controller being in the Northbridge is that you could pair ECC memory with any CPU, as long as your motherboard supported it.

First Dual Core CPU ? Amd
I'm sure there must've been mainframe CPUs with dual-core.

Back in 2000, I know there was a company building multi-core MIPS-based communications processors.

Anyway, from what I can tell, it seems like the Pentium D and Athlon 64 X2 launched within weeks of each other. According to Wikipedia, the Pentium D launched on May 25, while the Athlon 64 X2 launched on May 31, 2005. I'd consider that effectively a tie.

x86_64 ? Amd
MMX? Intel. It took AMD a couple of years to counter with 3DNow!, which Intel topped with SSE.

Amd used to make higher clocked chips a lot too back in the day.
am386 40mhz
I'm pretty sure I had an AMD 486DX3-120. The funny thing about it is that it ran the PCI bus at 40 MHz instead of the normal 33 MHz. My family had a Pentium 75, which did the opposite, running it at a mere 25 MHz.
 
Last edited:

spongiemaster

Honorable
Dec 12, 2019
2,364
1,350
13,560
Intel has been copying AMD for a long time. First L3 cache? AMD K6-III, anyone?
Memory controller on the CPU? AMD.
First dual-core CPU? AMD.
x86_64? AMD.
AMD also used to make higher-clocked chips back in the day:
Am386 at 40 MHz.
They made a 486DLC-40 that slotted into a 386 board (up until after Socket 7 they maintained socket compatibility).
I had an Am486 that was nominally clocked at 133 MHz but could be overclocked to 160 MHz.
AMD's problem has always been capitalization, as in not enough to compete with Intel. Looks like that's not a problem anymore.
Typical AMD fanboy take: AMD invented everything, even if they really didn't. It seems that AMD followers are unaware that there have ever been computers that didn't use x86, and they believe that if AMD released anything before Intel, then AMD must have invented it, because according to them no one besides Intel makes CPUs.

IBM introduced L3 cache in the 1980s.

Both DEC Alpha and HP CPUs had integrated memory controllers in the 1990s.

First to dual core? This claim baffles me, as AMD was literally dead last in the industry to dual core. IBM was first, followed by DEC Alpha and HP. AMD wasn't even first to market with an x86 dual core; Intel beat them to market with the Pentium D.

x86_64 was developed by AMD as a desperation move to save the company when Intel started its transition to IA-64, which Intel wasn't legally required to license to AMD the way it was with x86.

The Am386 was released in 1991. That same year, MIPS was selling a 100 MHz CPU. The next year, Alpha reached 200 MHz.
 

bit_user

Titan
Ambassador
Can you provide links to those patents (or coverage thereof)?
Took me 30 seconds to find this one, filed in Dec, 2012:

 

ilukey77

Reputable
Jan 30, 2021
833
339
5,290
In actual fact, I think AMD started off reverse-engineering their competitors' CPUs.

Maybe not a glamorous start, but hats off to them for just that.

Hell, China has been reverse-engineering everything for the better part of 30 years, and now they have the skills to make their own stuff and make it great!
 
So, to fully understand this so-called marketing PR blitz: how far behind the market leaders, that being TSMC/AMD/Apple (heck, anybody without Intel in their company name :unsure: ), is Intel now on 3D stacking and on tiles? Most honest people would say something like up to a full half decade.

As to the so-called yield issues, you can't really compare a CPU tile against a full die, since we don't know which parts of the previous full-die CPUs had the majority of the yield issues in the first place (unless you work for Intel directly on said yield issues). Heck, some would surmise that tiles were first adopted precisely to improve yields!
Companies only care about having leading-edge technology if they need it as a gimmick to sell their product.
Intel has more than 80% market share in all of its divisions; increasing production costs for no reason would be a terrible business decision.
Just as you'd need inside knowledge about the yields, you'd also need insider info on how far along they are on tiles and 3D stacking. Judging from the Xeon CPU/GPU Max parts, they are doing pretty well, at least on the tile front.

Also, if tiles are supposed to increase yields, then why are you so salty about tiles increasing yields?! It doesn't make any sense.
OK, it's not world-shattering news to be reported on its own, but it's only a very small part of this whole presentation.
"Challenging AMD" is one way to look at it--but "copying AMD" is the way I prefer to think about it...;) And good for Intel--glad they are finally in the black for a while! I'm very happy that AMD has given Intel a solid direction in which to go!
Huh?!
Intel Annual Net Income (Millions of US $):
2022: $8,014
2021: $19,868
2020: $20,899
2019: $21,048
2018: $21,053
2017: $9,601
2016: $10,316
2015: $11,420
2014: $11,704
2013: $9,620
2012: $11,005
2011: $12,942
2010: $11,464
2009: $4,369

JayNor

Honorable
May 31, 2019
458
103
10,860
AMD admittedly couldn't break down the GPU compute tiles due to the routing limitations of organic substrates.

AMD reportedly moved to a silicon substrate on MI300, using CoWoS-S.
 

bit_user

Titan
Ambassador
Typical AMD fanboy take,
Let's not be overly partisan. Your point was adequately made through your corrections, rendering such generalizations unnecessary. Such attacks put people on the defensive and draw a line in the sand, often lowering the quality of the discourse (i.e. it's flame bait).

The core issue - and I don't think it is a partisan one - is people not fact-checking themselves, especially when making sweeping claims. If we simply make a point of trying to cite references or include specifics, then fact-checking tends to come as a byproduct (though beware of confirmation bias).
 