News: Intel Can Regain Logic Technology Lead by 2025, Says IC Knowledge President

JamesJones44

Prominent
Jan 22, 2021
As long as you get some diversity in manufacturing, it's never good once you get down to one or two suppliers of anything. I wouldn't want Intel, Samsung, etc. to just replace TSMC in manufacture of leading edge nodes if they do somehow pass TSMC.
 

Lorien Silmaril

Distinguished
Jul 18, 2014
Intel daydreaming again? Which company has the track record of over-promising and under-delivering?

One thing is for sure: Pat is an Intel lifer and fits right into that culture.
 
Reactions: Makaveli

jkflipflop98

Distinguished
Feb 3, 2006
As long as you get some diversity in manufacturing, it's never good once you get down to one or two suppliers of anything. I wouldn't want Intel, Samsung, etc. to just replace TSMC in manufacture of leading edge nodes if they do somehow pass TSMC.
That's basically where we're heading. The barrier to entry of the business is so staggeringly high that it's pretty much guaranteed there won't be any new players. Apple is just about the only company in the world that has the resources needed to start a leading-edge chip fabrication business from scratch.
 

Don Frenser

Commendable
Mar 29, 2020
That's basically where we're heading. The barrier to entry of the business is so staggeringly high that it's pretty much guaranteed there won't be any new players. Apple is just about the only company in the world that has the resources needed to start a leading-edge chip fabrication business from scratch.
And just maybe we will head towards photonic chips, blowing all three of the above out of the water. Another possibility could be quantum computing.
 

Jimbojan

Honorable
May 17, 2017
I believe in Intel. Chips made by Intel already perform better than either AMD's or Nvidia's. If its manufacturing process gets ahead of TSMC's, Intel will be the king of all. America always wins, fyi.
 
Reactions: jkflipflop98
Jan 3, 2022
Tbh, for me performance isn't number one. Higher chip density, low TDP, and efficiency are more important than clock speed for various apps.
 

Liquidrider

Honorable
Nov 25, 2016
It actually makes me sick how much taxpayer money Intel has received even BEFORE the chip package Congress just passed.

How MANY times does Intel have to fail for people to figure it out? Additionally, the chip shortage is partially their fault because their FAB manufacturing has been in disarray for years. Anyone who thinks Intel is going to regain a competitive advantage is either in Intel's back pocket getting paid $ to pump Intel, clueless about how FABs work, or both.

TSMC, Samsung, Nvidia, and AMD would all have to stop innovating while Intel simultaneously meets its obligations ON TIME. Both are highly unlikely.

I believe in Intel. Chips made by Intel already perform better than either AMD's or Nvidia's. If its manufacturing process gets ahead of TSMC's, Intel will be the king of all. America always wins, fyi.
You're wrong. Intel wouldn't be purchasing chips from TSMC otherwise, and Intel's server CPUs wouldn't be two generations behind AMD's.
Does America always win? Tell that to the American taxpayers who have continued to dump cash into the Intel dumpster fire.
$6,004,762,638 in State and Federal Subsidies since 1993
$291,850,000 in State and Federal Loan Guarantees and bail-out assistance since 1993.
source: Intel | Subsidy Tracker (goodjobsfirst.org)
 

spongiemaster

Estimable
Dec 12, 2019
TSMC, Samsung, Nvidia, and AMD would all have to stop innovating while Intel simultaneously meets its obligations ON TIME. Both are highly unlikely.
AMD and Nvidia? Do you have any idea what you are talking about? Nvidia has never had any fabs. AMD stopped innovating in the fab industry 13 years ago when it spun off GlobalFoundries because it couldn't afford the business anymore. Then, in 2018, GloFo abandoned developing leading-edge nodes.

TSMC just got $3.5 billion in subsidies from the Japanese government. South Korea (Samsung's home) announced last year plans to invest $450 billion over the next 10 years in chip manufacturing. Intel getting $6 billion in subsidies since 1993 is pretty irrelevant; Intel spent over $15 billion on R&D last year alone. So let's not pretend that the average American is funding Intel's development budget.
 
Reactions: KyaraM and Mandark

Jimbojan

Honorable
May 17, 2017
Tbh, for me performance isn't number one. Higher chip density, low TDP, and efficiency are more important than clock speed for various apps.
That's not true. Frequency is one indication of a chip's internal architecture: if the design is not sophisticated enough, it cannot run at very high frequency no matter how hard you try; it will either burn the chip or act erratically. That is the physics and know-how Intel has learned, and no other company can compete so far. Fyi.
 
That's not true. Frequency is one indication of a chip's internal architecture: if the design is not sophisticated enough, it cannot run at very high frequency no matter how hard you try; it will either burn the chip or act erratically. That is the physics and know-how Intel has learned, and no other company can compete so far. Fyi.
Hahahaha!!! That's why AMD took the gaming crown, right?

https://forums.tomshardware.com/threads/amd-ryzen-7-5800x3d-review-3d-v-cache-powers-a-new-gaming-champion.3758895/page-4#post-22669312

Basically, Intel can't compete. Their chips use way too much power and run way too hot, and until they stop being stupid this isn't going to end. They are nothing to be proud of. The only advantage they have is the size of their company.
 
Last edited:
Reactions: GenericUser

KyaraM

Prominent
Mar 11, 2022
Hahahaha!!! That's why AMD took the gaming crown, right?

https://forums.tomshardware.com/threads/amd-ryzen-7-5800x3d-review-3d-v-cache-powers-a-new-gaming-champion.3758895/page-4#post-22669312

Basically, Intel can't compete. Their chips use way too much power and run way too hot, and until they stop being stupid this isn't going to end. They are nothing to be proud of. The only advantage they have is the size of their company.
Yeah, because gaming is everything that counts, right? Nothing else. /s

A German tech magazine ran both the 5800X3D and the 12900KS at the highest officially supported RAM speeds: DDR4-3200 for the 5800X3D and DDR5-4400 for the 12900KS. From what I remember of early tests, DDR5 has a performance advantage, but very fast DDR4 should still give results similar to the slow DDR5 used here. The two chips came out basically equal (sorry, but a 0.1% difference is measurement tolerance, nothing else), with the 5800X3D leading in some games and the 12900KS in others, while the 12900KS utterly destroyed the 5800X3D in productivity, making it the far better all-rounder.

Results like these always depend on the games tested, though. Looking at Hardware Unboxed, where the 12900K is only 1% behind the 5800X3D and the 12900KS wasn't even tested, the 5800X3D is far from being the processor with the best gaming performance overall. And while productivity might not matter to you, it certainly does to others; being awesome in one area while garbage compared to similar chips (i.e. everything else in its price range) is not an indicator of absolute technical superiority. Other tests also imply, or outright show, that higher RAM speeds give the 12900KS a lead over the 5800X3D. Even the tests that saw them equal, or the 5800X3D ahead, said it isn't a recommendation for any and all gamers; it highly depends on what you play and what else you want to do with the processor.

So tl;dr: it's still, as always, choose the CPU according to your needs, not according to what the company claims. You AMD fanboys might not understand that, but that's a simple, hard fact in everything computer-related.

For me, I hope for a closer match-up in the server arena in the near future, even if I'm not really a direct stakeholder there. It can only benefit customers not to have a monopoly, as seen time and time again.

Edit: Also, Intel needs to work on power consumption, no question. However, considering that Alder Lake is already more efficient than previous chips, and Raptor Lake seems to be heading in the same direction, I think they can manage that, too.
 
Last edited:

KyaraM

Prominent
Mar 11, 2022
Ha ha ha, their power efficiency is the worst in the industry and there's really no sign of it getting any better. The 5800X3D uses 1/3 the power of your beloved Intel processor.

In the server arena, the EPYC processors have blown all of Intel's stuff out of the water; there is nothing that can touch them, nothing. And as far as productivity goes, AMD CPUs can run virtual machines and do real-world workloads far more efficiently with a lot less power, so you just keep on dreaming there, buddy.

Nobody said gaming was everything, and I'm not a gamer. I'm more interested in the AMD products that allow me to run tons of applications and virtual machines at once, something I can't find anywhere else.

Intel has horrible efficiency, yet they continue down the road with their horrible processor designs that use tons of power. What's next, 1000W CPUs?
Dude. I consider everything past a 12700K or 5900X <Mod Edit> for gaming anyways, since everyone but the uber-enthusiasts will play GPU-limited, often because they play in 4K. For gaming, anything above that is only good if you feel you have to prove something, pure bragging-rights bs. For everything else, the 5800X3D is <Mod Edit> considering it loses even against its 100-bucks-cheaper brother. I've got both a recent Intel and a recent AMD system, too, so if anything, I follow my own advice to buy according to needs, not according to whatever you glorify most, and I will recommend whatever best fits the needs of whoever is asking instead of only considering one company due to some perceived superiority.

Want to know something fun? I played a couple of games on my old Kaby Lake system over the weekend. That system went to my parents after I switched it out for a 12700K-based one in February. In gaming, the 12700K uses maybe 2-5 watts more on average according to HWiNFO, clocking in around 40-65W (EDIT: actually, now that I'm back home at my main rig, I see that it actually uses less power than the old Kaby Lake... currently 31W vs. 38W for the old CPU) depending on the game, while running 10°C cooler than the old i5 in the same games with higher performance. Both are air-cooled. Honestly, the GPU has a far higher impact on my electricity bill than the CPU ever will, since it uses 40-50W more than the old one. I have also never seen the CPU go past 160W, even in Cinebench. Normal power-efficiency charts only show half of the truth, too, since faster completion times mean less total energy consumed; Tom's has shown that a couple of times in their reviews. Under any real-life circumstances, Alder Lake isn't far behind Ryzen, whether you like it or not.
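The "faster completion means less total energy" point is just energy = average power x time: a chip that draws more watts but finishes sooner can still consume less energy overall. A minimal sketch of that arithmetic, with hypothetical numbers (these are not measured figures for any real CPU):

```python
# Race-to-finish arithmetic: total energy = average power x runtime.
# All power draws and runtimes below are made-up illustrative values.

def energy_wh(avg_power_w: float, runtime_s: float) -> float:
    """Energy in watt-hours for one task at a given average power draw."""
    return avg_power_w * runtime_s / 3600

# Chip A: higher power draw, but finishes the task sooner.
chip_a = energy_wh(avg_power_w=150, runtime_s=600)  # 25.0 Wh
# Chip B: lower power draw, but takes 50% longer.
chip_b = energy_wh(avg_power_w=110, runtime_s=900)  # 27.5 Wh

print(chip_a, chip_b)  # the "hotter" chip A uses less total energy
```

This is why a chart of instantaneous power draw alone can be misleading for fixed workloads; it only tells the whole story for open-ended loads like gaming at a capped frame rate.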

But why do I even try? You admitted yourself that you are biased, so this is a pointless waste of time...
 
Last edited by a moderator:
