News Intel lays off hundreds of engineers in California, including chip design engineers and architects — automotive chip division also gets the axe

No they aren't 🤣🤣🤣
Most of this is not a serious reply, so I will not take it seriously.

Intel nailed their roadmap:
[Attached image: intel_amd_2021_22_roadmap.jpg]
...except for this point, which was very disingenuous. We were talking about process nodes and you pivoted to something completely different.

Also, that product roadmap leaves out some very high-profile misses that were very costly for Intel:
  • Sapphire Rapids was years late, even according to their server roadmap circa 2020.
  • Ponte Vecchio was so late it cost them hundreds of $M in penalties on their big government HPC contract.
  • Meteor Lake-S was a complete no-show.
 
You're more pessimistic than I am, but I guess we'll just have to wait and see.

I personally don't see anything to get excited about with either Intel or AMD these days!!

Intel has failed time and time again: terrible socket life, power-hungry chips brute-forcing the charts, and little to no gains each gen.

Intel should have gone the X3D route after the 5800X3D was released.

AMD also annoys me, because I see the writing on the wall of the Nvidia curse: own the market (CPUs) and screw over consumers.

AMD is owning nearly every sector of the CPU market: servers, handhelds, consoles, desktop!

If Intel gets crushed out of the market completely (granted, of their own doing), mark my words: AMD will be the next Nvidia of the CPU market.

I was disappointed that the 9950X3D wasn't running dual cache-stacked CCDs. If the 10000 series fixes that and it smashes both production and games, then seriously, I'm sorry, but Intel is finished.

If Intel doesn't come back big, bold, and better than AMD next gen, they're going to be AMD's little B..tch for a long time.

Not unlike where AMD stands vs. Nvidia!

All these rumours of UDNA crushing Nvidia... I don't see it.

Nvidia knows the 6090 will be more expensive and will beat anything AMD throws at them.

Nvidia is so confident that their IDIOT consumers will literally buy whatever crap they spew out that they still overprice and under-VRAM their products, and customers still buy them.

Rumours say the 5080 Super is getting 20 GB of VRAM; the 5080 should have had 20 GB from the start!

AMD isn't as bad, but anything 8 GB in 2025 is disgusting, so the 8 GB 9060 XT is a bad, money-grabbing product!
 
I personally don't see anything to get excited about with either Intel or AMD these days!!
I'll grant you that. AMD really wowed us with the ascent of Zen and their 3D V-Cache. Going forward, I think the story for AMD is going to be more like what we saw with Zen 5: incremental improvements. The "wow" moments for them are pretty much done.

Even wide memory interfaces, like Strix Halo (i.e. Ryzen AI Max), don't seem to have a lot of potential for improving low-to-medium threaded compute workloads.

Intel has failed time and time again: terrible socket life, power-hungry chips brute-forcing the charts, and little to no gains each gen.
Arrow Lake is a little better on power, but the downside is that (refresh aside) it's effectively a single-generation socket.

For Intel, what I'm looking forward to is APX and the return of AVX-512. I know E-cores aren't very popular, but I credit Intel with the bold move of bringing them to the desktop, and then with improving them in Skymont to the point where I'm increasingly interested in E-core-only CPUs.

What they got wrong was thinking their Thread Director (a hardware engine) could be the solution to the scheduling complexities introduced by hybrid CPUs. I wish they'd worked with Microsoft and the Linux kernel folks to introduce some threading API improvements, so that apps could take a more active role in telling the OS which threads are latency-sensitive (a rough sketch of the kind of hint I mean is below).
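For the record, one coarse hint of this kind already exists on Windows: the EcoQoS power-throttling flag, which a thread can set or clear on itself. This is just a minimal sketch of that existing mechanism, my own example rather than anything Intel or Microsoft has proposed as the richer API I'm wishing for:

```c
// Minimal sketch: a thread marks itself as latency-sensitive via Windows'
// EcoQoS power-throttling hint. Clearing the throttling bit tells the OS not
// to treat this thread as background/efficiency work; setting it instead
// opts the thread into EcoQoS (typically lower clocks and E-cores).
// Error handling omitted for brevity.
#include <windows.h>

static void mark_current_thread_latency_sensitive(void)
{
    THREAD_POWER_THROTTLING_STATE state = {0};
    state.Version     = THREAD_POWER_THROTTLING_CURRENT_VERSION;
    state.ControlMask = THREAD_POWER_THROTTLING_EXECUTION_SPEED;
    state.StateMask   = 0;   /* 0 = do not throttle this thread */

    SetThreadInformation(GetCurrentThread(),
                         ThreadPowerThrottling,
                         &state, sizeof(state));
}
```

It's a blunt instrument, which is exactly my point: apps still can't say "this thread is latency-critical, that one is bulk work" in any finer-grained, cross-platform way.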

Intel should have gone the X3D route after the 5800X3D was released.
That sort of thing takes a while, so it's not reasonable to expect Intel to respond immediately, or even in the next generation (though they probably could have by Arrow Lake). People have even found evidence of AMD experimenting with it way back on Zen 2. And it took them until Zen 5 to finally crack the heat issues, so that the X3D dies no longer traded away so much clock speed.

AMD also annoys me, because I see the writing on the wall of the Nvidia curse: own the market (CPUs) and screw over consumers.
AMD is going to bring their own ARM-based CPU, Soundwave, early next year. It should be interesting; not as much as if it were based on Zen 6, but still almost as interesting as what Nvidia and Mediatek are doing.

If Intel gets crushed out of the market completely (granted, of their own doing), mark my words: AMD will be the next Nvidia of the CPU market.
Intel won't go away. Maybe their fabs and maybe even x86, but you can be sure that Intel will be designing CPUs for the foreseeable future.

I was disappointed that the 9950X3D wasn't running dual cache-stacked CCDs.
Same. I think they should've at least offered a 9955X3D with dual 3D cache. Their argument was simply that the value for money wasn't there, but I think some people still would've paid the premium for the few percent more performance of dual 3D cache.

All these rumours of UDNA crushing Nvidia... I don't see it.
Don't listen to that nonsense. It's powered by nothing but fanboy wishes. At best, UDNA will be a decent follow-on to RDNA4, which I think was a pleasant surprise. I just bought one yesterday.

No, my default assumption is that Nvidia will continue to hold the GPU crown. It's clear that RTX 5000 is being held back by TSMC N4. Their next generation will be on a new node and should therefore have legs to reach the next tier.

Rumours say the 5080 Super is getting 20 GB of VRAM; the 5080 should have had 20 GB from the start!
20 GB? How would that work on the 5080's memory bus? I'd put it at 24 GB, or 21 GB at minimum (rough math below).
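Here's the bus-width arithmetic behind that guess, assuming the 5080 Super keeps the regular 5080's 256-bit bus and uses one of the two GDDR7 module densities that exist today (2 GB and 3 GB):

```c
#include <stdio.h>

int main(void)
{
    /* A 256-bit bus is 8 independent 32-bit GDDR7 channels. */
    const int channels = 256 / 32;

    printf("2 GB modules: %2d GB\n", channels * 2); /* 16 GB: today's 5080          */
    printf("3 GB modules: %2d GB\n", channels * 3); /* 24 GB: the obvious Super move */

    /* 20 GB doesn't fall out of a 256-bit bus with uniform module sizes;
       it would need a wider bus (e.g. 320-bit with 2 GB modules) or a
       mixed-density configuration. */
    return 0;
}
```

(21 GB would correspond to 3 GB modules on a cut-down 224-bit configuration, which is why I'd call that the floor.)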

AMD isn't as bad, but anything 8 GB in 2025 is disgusting, so the 8 GB 9060 XT is a bad, money-grabbing product!
It wouldn't be bad if it were cheap enough, but I agree that it's too expensive to have only 8 GB.
 
Intel won't go away. Maybe their fabs and maybe even x86, but you can be sure that Intel will be designing CPUs for the foreseeable future.
If they are just doing CPUs, then yeah, maybe they stick around.

But we need them to be competitive against AMD from a consumer standpoint.

I'm just not sure they can. Changing sockets so often hurts them when their competition keeps one alive for (what is it now?) nearly 4 years, with still a gen to go.

I agree on X3D; I still think they should have jumped on it after the 5800X3D and sent something to market soon after, but X3D as a whole is very incremental now (what needs perfecting is the top-end work/gaming CPUs like the 9950X3D).

Intel GPUs have me interested. I bought the Arc A770 16GB and was quite impressed with it for a first attempt.

Their dedication to constantly fixing drivers... the easy OC software... 16GB!!! :)

I built an all-Intel build using the 14600KF and the Arc A770, and for the most part I was impressed with what it could run and the fps I was getting in games.

(On a side note, I was also impressed with the 14600KF's Cinebench R23 all-core score, and it stayed quite cool using the Noctua NH-U12S single-tower cooler in a push-pull config.)

While obviously not as good as my 7800X3D and 7900 XTX, I was hitting 60 fps at 4K in some of the games in my suite.

I really want Intel to stick with GPUs, and we need them to compete with AMD in both areas, CPU and GPU!
 
If they are just doing CPUs, then yeah, maybe they stick around.

But we need them to be competitive against AMD from a consumer standpoint.
If ARM gains more market share, we might not need Intel to serve that role. Qualcomm and Mediatek will make it at least a 3-way race. If Intel gets into the ARM game, then we're talking about 4 potentially viable options.

Intel GPUs have me interested. I bought the Arc A770 16GB and was quite impressed with it for a first attempt.
Me too, but I didn't go that far. I was interested in a B770. It looks like it's been canceled, but then there have been rumors that it's coming out after all. Maybe Lip-Bu Tan's 50% margins policy killed it, again.

I really want Intel to stick with GPUs, and we need them to compete with AMD in both areas, CPU and GPU!
Agreed.