beyondlogic
Didn't this bite them on the rear end last time? What have they done to stop it happening a second time?
> I thought they just started using them a couple of years ago, if that.

Intel has been a major client of TSMC for years. As recently as 2019, before TSMC was fabbing CPUs for Intel, Intel was still a bigger client than AMD was. These international mega corporations aren't run by fanboys. As long as the checks are clearing, they will do business together.
Intel really needs to figure out what the [] they are doing.
> Intel has been a major client of TSMC for years. As recently as 2019, before TSMC was fabbing CPUs for Intel,

Yes, it wasn't CPUs TSMC was making for them. If you recall, Intel had a major fab crunch back then, because they had expected their 10 nm production lines to be handling a fair bit of product volume. Due to the delays in getting a working 10 nm node, all of their products had backed up onto their 14 nm production, which was further stressed by the need for them to add more cores (hence, larger dies) in order to try and keep pace with AMD.
> These international mega corporations aren't run by fanboys. As long as the checks are clearing, they will do business together.

Back then, Intel wasn't a serious player in the foundry business. The conflict of interest for TSMC was minimal. Basically, helping Intel just meant that AMD's sales would possibly be lower, which meant fewer orders from AMD. However, if Intel's orders were enough to offset that, then it wouldn't make much difference to TSMC.
> In fact, these are not only Intel's problems, TSMC is starting to admit problems with new process technologies,

Source? I'd love to know specifically what you're referring to.
> naturally, high-quality human capital will never appear in totalitarian countries, in such quantities as to create an extensive base, as a foundation for a new breakthrough and mass production. This requires an emancipated consciousness from childhood, which leads to a creative breakthrough.

This might be a comfortable opinion to hold, but it's not supported by the data. China is leading in publication of papers in the most prestigious western scientific journals, such as Nature. In order to get published, you have to get your paper through their reviewers. So, this distinction isn't something you can accomplish without many researchers of real merit.
Every time "the end of technology" was predicted in the past, it's been wrong. Technology is a virtuous circle, where advancements in one area lead to advancements in others.we are heading into a technological dead end for the next 15-20 years.
> Source? I'd love to know specifically what you're referring to.

I'll find it later, I have it somewhere in my bookmarks, from some analytical sites, but it will take time. Try to find it yourself - after all, Google is quite smart now, isn't it?
> This might be a comfortable opinion to hold, but it's not supported by the data. China is leading in publication of papers in the most prestigious western scientific journals, such as Nature. In order to get published, you have to get your paper through their reviewers. So, this distinction isn't something you can accomplish without many researchers of real merit.

This is an inconvenient opinion, but it is an irrefutable fact throughout the history of mankind. The societies that won were those with more freedom, including free education without censorship. The USA rose on this, having managed to absorb and then grow the best human capital, which gradually gave it advantages over other countries and blocs. It was the last such country, although its history looks bloody. But where else? Now the most passionate people have nowhere to run on the planet - there are no more "blank" spots left.
> Every time "the end of technology" was predicted in the past, it's been wrong. Technology is a virtuous circle, where advancements in one area lead to advancements in others.

You see, the thing is that the human brain, as a biochemical machine, is finite in its creative and insightful capabilities (even with maximum coverage of the scientific horizon), no matter how much we would like the opposite. Even larger, well-coordinated teams will no longer be able to bear such fruits as before, given the exponential growth in the complexity of scientific knowledge and of maintaining it between generations. Even with all the means of automation and computerization of such work, new frontiers and questions are being opened somewhere, but too slowly. That is why all eyes have turned to "AI", but for now this is more of a marketing gimmick for carving up the huge budgets of those who believed in the new "silver bullet".

Meanwhile, the real problems of population growth (another +80-82 million in 2024 alone - or, at once, +10 cities the size of New York, although so far without its consumption level, if you believe the data in the press) are only growing. At the same time, the overall efficiency of each individual gradually falls as scientific and technological progress grows more complex - more and more people are engaged in "labor" that is unproductive from the point of view of the effective survival of civilization. And they cannot be "fired", as in companies... This is a stalemate that cannot be resolved well in the next 50 years.
That said, progress definitely slows in certain areas, while the pace picks up elsewhere. Back in the 1950's, people seriously predicted that electricity would be so low-cost that it'd be effectively free. That's because they assumed advances in power generation would continue at the same brisk pace, but they didn't. However, virtually nobody in the 1950's foresaw the kind of revolution that would arise from the combined advancement in digital computing and communications technologies. This shows why progress is hard to predict.
> Most of China's patents are useless garbage from the point of view of the edge of science. This has also been discussed many times in analytics. There are even sites that expose fake scientific work from China, in huge quantities, so there are more than enough crooks there.

This is false; they are highly competitive at AI, which is why the US gov't is so eager to ban it. Talk to any AI researcher about how competitive China is at AI research. The whole "patents are garbage" line is some massive cope.
> Talk to any AI researcher about how competitive China is at AI research.

Well, all that remains is for them to raise their production base to the level of the USA and its allies. Time will soon show what chances the Chinese leadership and their society have of surviving the impending global cataclysms - and where the real intellectual layer (in the broad sense) is more powerful.
> Source? I'd love to know specifically what you're referring to.

2N vs. N3E:

---
“The nanosheet architecture actually is the final frontier of transistor architecture,” Ashish Agrawal, a silicon technologist in Intel’s components research group, told engineers. Even future complementary FET (CFET) devices, possibly arriving in the mid-2030s, are constructed of nanosheets. So it’s important that researchers understand their limits, said Agrawal.
> Take NVidia's 5xxx as an example: we can see what they chose. With a real +5-10% over 4xxx at the same consumption, they raised consumption by +27-28% at the same time.

They didn't change process nodes with Blackwell, but rather used an optimized version of the same custom node as Ada.
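For a rough sense of what staying on the same node implies for efficiency, here's a back-of-envelope check in Python. The ~+28% power figure comes from the quoted post; the +30% total performance gain is an assumed, illustrative value, not a measured result:

```python
# Back-of-envelope perf/W check. power_ratio comes from the quoted post;
# perf_ratio is an assumed illustrative figure, not a measured result.
perf_ratio = 1.30     # assumed total generational performance gain
power_ratio = 1.28    # quoted ~+28% increase in power consumption

efficiency_gain = perf_ratio / power_ratio - 1
print(f"perf/W change: {efficiency_gain * 100:+.1f}%")   # about +1.6%
```

A near-flat perf/W result is exactly what you'd expect when the underlying process barely changes.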
> 2N vs. N3E:
> You have a choice (with growing huge costs for each new step):
> 1. Reduce consumption by -25-30%, but without any increase in performance.
> 2. Without reducing consumption, add a shameful +10-15% of performance.
> 3. Do not add either performance or reduce consumption, but increase the complexity of the circuits by +15%.
> What do you choose when you normally need everything at once?

The density part is just an improvement; it isn't conditional.
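To illustrate that point, here's a toy die-area calculation; the 300 mm² starting size is an arbitrary assumption, and real designs never shrink uniformly (SRAM and analog scale worse than logic):

```python
# Toy die-area arithmetic: a ~1.15x denser library shrinks the same design,
# regardless of which power/performance option you pick. The starting area
# is an arbitrary assumption, not a real product's die size.
die_area_old_mm2 = 300.0    # assumed design size on the older node
density_gain = 1.15         # the quoted ~+15% density improvement

die_area_new_mm2 = die_area_old_mm2 / density_gain
print(f"same design: {die_area_new_mm2:.0f} mm^2 vs {die_area_old_mm2:.0f} mm^2")
```

A smaller die means more candidates per wafer and better yield per die, on top of whichever of the three options a designer picks.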
> 2N vs. N3E:
> You have a choice (with growing huge costs for each new step):
> 1. Reduce consumption by -25-30%, but without any increase in performance.
> 2. Without reducing consumption, add a shameful +10-15% of performance.
> 3. Do not add either performance or reduce consumption, but increase the complexity of the circuits by +15%.
> What do you choose when you normally need everything at once?

There's a misunderstanding here, in the way they characterize node improvements. Each process node has a curve that shows the tradeoff between performance vs. power. What they're citing (and this is standing practice for them, going back a long ways) is how the curve for their new process node compares to an older one, but at just two points - the point of equal power and the point of equal performance.
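To make the two-point comparison concrete, here's a toy model of such curves; the power() function and every constant in it are invented for illustration and are not TSMC's actual data:

```python
# A toy model of a node's power-vs-frequency curve: P = k * f * V(f)^2,
# with supply voltage V rising roughly linearly with frequency. All
# constants here are invented for illustration; these are not real curves.
import numpy as np

def power(freq_ghz, k, v0=0.6, v_slope=0.12):
    """Dynamic power ~ k * f * V^2 with a toy linear V(f)."""
    v = v0 + v_slope * freq_ghz
    return k * freq_ghz * v**2

freqs = np.linspace(1.0, 6.0, 2001)    # candidate operating points
p_new = power(freqs, k=0.72)           # hypothetical newer node

f_ref = 3.0                            # reference point on the older node
p_ref = power(f_ref, k=1.00)           # hypothetical older node (k=1.00)

# "X% faster at the same power": highest new-node frequency at p_ref.
f_iso_power = freqs[np.searchsorted(p_new, p_ref) - 1]
# "Y% lower power at the same performance": new-node power at f_ref.
p_iso_perf = power(f_ref, k=0.72)

print(f"iso-power:       +{(f_iso_power / f_ref - 1) * 100:.0f}% frequency")
print(f"iso-performance: -{(1 - p_iso_perf / p_ref) * 100:.0f}% power")
```

In this toy model, the ~+20% frequency and ~-28% power land at different operating points on the same curve, which is why a product can take one, the other, or a blend - but not both at once.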
> People will not buy if there is no improvement in performance.

Again, you're missing the point of these comparisons. They're not saying you would normally take an existing design, port it to their new node, and call it a day. What people usually do is use some of the additional density & power budget to make a more complex design that achieves performance gains through higher IPC. Then, they use the remaining perf/W budget to add a little more performance via higher frequencies.
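As a toy illustration of that budget-splitting (both percentages below are made up, not any vendor's numbers):

```python
# Hypothetical split of a new node's budget: extra density buys a wider
# core (IPC), and the leftover perf/W headroom buys clock frequency.
ipc_gain = 1.10     # assumed IPC uplift from a more complex design
freq_gain = 1.05    # assumed clock uplift from the remaining headroom

total_speedup = ipc_gain * freq_gain   # the two gains multiply
print(f"net generational speedup: +{(total_speedup - 1) * 100:.1f}%")  # ~+15.5%
```

Because the gains multiply, a modest node uplift plus a modest IPC gain can still add up to a respectable generational improvement.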
> A counter-example might be how someone like Sony will do several refreshes of the PS5, over its lifespan. Other than the "Pro" model, which uses a different underlying design, all PS5 have the same performance. The point of their refreshes is mostly about cost reduction. If you can achieve some power savings at the SoC level, then you can save money elsewhere in the console, by using a smaller power supply, smaller thermal solution, etc. This also enables size & weight reductions elsewhere in the system, all of which also results in savings on shipping and warehousing costs. So, there are cost-savings to be had, even beyond the component parts.

In the case of the PS5 Pro, they leveraged a newer process node, which allowed the chip to be about the same size and power consumption as the PS5. This gives Sony the same type of benefits refreshes do with regards to manufacturing, and anyone buying one gets the benefit of it not using a lot more power than the base model.
> This is an inconvenient opinion, but it is an irrefutable fact throughout the history of mankind. The societies that won were those with more freedom, including free education

If we look at WW2, Germany and Japan were both very strong, technically. The reasons they lost weren't primarily due to things like freedom or technological sophistication, but a lot more to do with the scale of their ambitions.
> Most of China's patents are useless garbage from the point of view of the edge of science.

The patent system is broken. There are vast numbers of junk patents filed in the US, also. I wouldn't judge technological advancement on the basis of patents. A much better metric is to focus on research and scientific publications.
> You see, the thing is that the human brain, as a biochemical machine, is finite in its creative and insightful capabilities (even with maximum coverage of the scientific horizon), no matter how much we would like the opposite. Even larger, well-coordinated teams will no longer be able to bear such fruits as before, given the exponential growth in the complexity of scientific knowledge and of maintaining it between generations.

Science works to improve the volume and accuracy of knowledge (i.e. at a societal level), where there are indeed some scaling problems - especially in more applied sciences.
> Meanwhile, the real problems of population growth (another +80-82 million in 2024 alone - or, at once, +10 cities the size of New York, although so far without its consumption level, if you believe the data in the press) are only growing. At the same time, the overall efficiency of each individual gradually falls as scientific and technological progress grows more complex

Societal problems do face scaling challenges, but cities are a lot more efficient than if the same number of people lived in a suburban or rural setting. You can easily see this in New York City's relatively low per-capita energy usage, compared to elsewhere in the US.
> Humanity has grown, since the first flight into space, almost 3 times in number.

Population growth is leveling off. One of the biggest problems is actually how the population is distributed, with a lot of wealthy countries facing the prospect of population declines, while the highest growth is in the poorest countries.
> But we are still on Earth and there is not even a colony on the Moon, which is literally nearby

Colonizing space will never solve the big problems we have on earth. There might be real value in mining the moon or asteroids, but there aren't many metallic asteroids and they're all pretty far, while mining the moon is quite an undertaking.
> I hope that at least the problem with silicon will be solved, because humanity needs to greatly increase the "power-to-weight ratio" in calculations (and in energy) per individual, so that this finally begins to bring qualitative changes to civilization with a new frontal thrust...

Advancements in science and technology have the potential to reduce human resource consumption, relative to quality of life. That's not so much where the focus is, but it's possible.
> The point is that the next steps will be shorter and less valuable for consumers, in these terms, and the costs will be higher for chipmakers.

Part of the problem we currently face is due to the massive demand for AI hardware. That's distorting prices and limiting availability of cutting-edge nodes beyond what would normally happen. It's an outlier, not a trend (hopefully).
> or they will start introducing elements of artificial aging into chips (both chips and software) with increasing persistence.

Nope. As long as people have choice, they won't accept artificial aging in silicon. Unfortunately, chips do wear out and newer chips face this problem much more so than older ones did. NAND is a prime example, but similar things happen with DRAM and even logic.
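For a sense of how NAND wear-out is usually quantified, here's a rough endurance calculation; every value in it is an assumed, illustrative figure rather than any vendor's spec:

```python
# Rough NAND endurance arithmetic; all values are assumed for illustration.
capacity_tb = 2.0            # assumed drive capacity in TB
pe_cycles = 1000             # assumed P/E rating for a modern TLC part
write_amplification = 2.0    # assumed average write amplification

tbw = capacity_tb * pe_cycles / write_amplification   # terabytes written
daily_writes_tb = 0.05                                # assumed 50 GB/day
years = tbw / (daily_writes_tb * 365)
print(f"~{tbw:.0f} TBW, roughly {years:.0f} years at 50 GB/day")
```

The key variable is the P/E-cycle rating: older SLC parts were rated around 100k cycles, while modern TLC and QLC parts are orders of magnitude lower, which is the sense in which newer chips wear faster.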
> No, it's not CPUs TSMC was making for them. If you recall, Intel had a major fab crunch back then, because they had expected their 10 nm production lines to be handling a fair bit of product volume. Due to the delays in getting a working 10 nm node, all of their products had backed up onto their 14 nm production, which was further stressed by the need for them to add more cores (hence, larger dies) in order to try and keep pace with AMD.

My post was 4 sentences long. You couldn't even make it to the 2nd one?

> As recently as 2019, before TSMC was fabbing CPUs for Intel, Intel was still a bigger client than AMD was.
> My post was 4 sentences long. You couldn't even make it to the 2nd one?

I did read the whole thing, but apparently too fast and I ended up misreading it. I thought you were saying they used TSMC to fab CPUs back in 2019.