News Intel's 1500W TDP for Falcon Shores AI processor confirmed — next-gen AI chip consumes more power than Nvidia's B200

If we have a hypothetical Intel GPU at 1.5 kW, and it's 4x as fast as an AMD GPU that burns 0.75 kW, then you'd be better off running that one Intel chip instead of four of the AMD chips, assuming you need the performance.
Except that for 95% of desktop consumers, even that hypothetical AMD 0.75 kW chip is 5x overkill.
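To put rough numbers on the quoted comparison -- a minimal sketch, with the 4x speedup and both power figures taken straight from the hypothetical above (they are not real Falcon Shores or AMD numbers):

```python
# Back-of-the-envelope check of the hypothetical above (all figures are the
# quote's illustrative assumptions, not measured values).
intel_power_kw = 1.5   # one hypothetical Intel GPU
amd_power_kw = 0.75    # one hypothetical AMD GPU
speedup = 4            # the Intel chip is assumed to be 4x as fast

# Power needed to match the Intel chip's throughput using AMD chips:
amd_equivalent_kw = speedup * amd_power_kw   # 4 x 0.75 = 3.0 kW

print(f"One Intel GPU: {intel_power_kw} kW")
print(f"Four AMD GPUs for the same throughput: {amd_equivalent_kw} kW")
print(f"The AMD setup needs {amd_equivalent_kw / intel_power_kw:.0f}x the power")
```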

The real question is -- how many graphics cards have you seen (since the GTX 10x0 series) that occupy a single PCI-Express slot nowadays?

That's right -- they almost don't make those anymore, because they are chasing power for HPC and datacenters, and that power has seeped into the consumer segment all the way down to its very bottom (i.e. the low end).

What I am saying is that 18 years ago you could have gotten a single-slot, passively cooled NVIDIA 7300 GT if you wanted a quiet, low-power PC for your living room. What's the thinnest, lowest of low-end cards that you can get today? What's its TDP? And more importantly, what's its price?

Yes, it has vastly better capabilities and speed and everything, but how many people actually NEED that?

Apple is the only adult in the room when it comes to power consumption and designing sensible (capable but not overpowered) machines for the masses.
 
They have to pay for the power they use (not just electricity, but space, cooling, and whatnot), so they will want to make the best use of it. In that sense they do self-regulate, because they can only spend so much money on so much performance before it doesn't make sense anymore.
Companies pay drastically lower electricity prices than consumers.

In my country, companies get flat-rate billing, while citizens like me get penalized for using more electricity. You use more than 500 kWh a month? x2 price per kWh. More than 1,500? x4 for you, lad. More than (gasp!) 2,000 kWh a month? x8 per kWh. Oh, and did I mention that during the day the price is 2x the night price?
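A minimal sketch of how that kind of tiering adds up. The base rate is a placeholder, and I'm assuming each multiplier applies only to the usage above its threshold, which may not match the actual scheme; only the x2/x4/x8 multipliers and the 500/1,500/2,000 kWh thresholds come from the description above:

```python
# Rough sketch of the tiered residential billing described above.
# BASE_RATE and the marginal-tier assumption are for illustration only.
BASE_RATE = 0.10  # price per kWh in the lowest tier (placeholder value)

def monthly_bill(kwh: float) -> float:
    """Bill tier by tier: x1 up to 500 kWh, x2 up to 1,500, x4 up to 2,000, x8 beyond."""
    tiers = [(500, 1), (1500, 2), (2000, 4), (float("inf"), 8)]
    total, prev_cap = 0.0, 0.0
    for cap, multiplier in tiers:
        if kwh <= prev_cap:
            break
        billed = min(kwh, cap) - prev_cap   # kWh falling into this tier
        total += billed * BASE_RATE * multiplier
        prev_cap = cap
    return total

for usage in (400, 1000, 1800, 2500):
    print(f"{usage:>5} kWh -> {monthly_bill(usage):7.2f}")
```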

So what I meant by imposing limits on power consumption is penalizing it in some way. One way would be to make them pay more when they use more, but we all know that's only for peasants, not for the rich -- the rich are always encouraged to spend more with all sorts of discounts, in effect making them even richer, while the poor get no discounts or are even outright penalized, like I just described above.

I admit I don't know how to accomplish such regulation, but something has to be done or soon desktop PCs will need a 3 kW power supply and a 3-phase power outlet.
 
Companies pay drastically lower electricity prices than consumers.
Yup.
Factories, farms, data centers, etc. get special treatment compared to regular people.

Is the region in a drought? Citizens will get fined for overuse of water, but there are carve-outs for factories, farms, and data centers.
Dumping untreated waste water into a body of water? It's okay if you're a factory. If you're a farm, it's fine as long as it's accidental.
Does the region have power supply issues? Citizens will face a 100x price increase, but factories and data centers get special treatment.
Power outage? The first to come back online are factories and data centers.
Rolling blackouts? Exemption for factories and data centers.
 
The real question is -- how many graphics cards have you seen (since the GTX 10x0 series) that occupy a single PCI-Express slot nowadays?
Aren't integrated graphics in the more powerful desktop chips basically filling this role? (I'm a laptop person but this seems to be the trend from afar.)
 
Power draw and efficiency are completely different things.

Consumer desktop stuff, whether we are talking about CPUs or GPUs, is incredibly efficient nowadays. Just don't run them at absurd power limits and voila. For example, a 4090 at 220 W is faster than a 3090 at 520 W.
They are more power-efficient in the sense that, for the same performance, the 4090 draws significantly less power. But that is not how it works in reality. Most people buying a 4090 will run it at whatever power limit is set by default. Similarly in this case, a 1500 W solution will likely be running close to that limit because it's going to be under load most of the time. And I don't believe people buying enterprise or data-center hardware will think about undervolting or tweaking the power limit. As expected, this whole AI push will bump into a hard wall because of power limitations.
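For what it's worth, capping the limit is trivial on the consumer side -- a minimal sketch, assuming an NVIDIA card with recent drivers and nvidia-smi on the PATH (setting the limit typically needs root, and it generally resets on reboot):

```python
# Sketch of the "just lower the power limit" idea, e.g. capping a 4090 at 220 W.
import subprocess

def query_power() -> str:
    """Report current draw and the configured board power limit."""
    return subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout

def set_power_limit(watts: int) -> None:
    """Cap the board power limit (needs root; typically resets on reboot)."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

print(query_power())
# set_power_limit(220)   # uncomment to apply the 220 W cap discussed above
```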
 
Except that for 95% of desktop consumers, even that hypothetical AMD 0.75 kW chip is 5x overkill.
Not sure why you're talking about "desktop consumers", when Falcon Shores is a server chip.

The real question is -- how many graphics cards have you seen (since the GTX 10x0 series) that occupy a single PCI-Express slot nowadays?
Nvidia workstation cards (formerly Quadro-branded) are often single-slot, until you get into the upper-end models. However, they tend to be cut back in weird ways, compared to the consumer cards built using the same GPU.


That's right -- they almost don't make those anymore, because they are chasing power for HPC and datacenters, and that power has seeped into the consumer segment all the way down to its very bottom (i.e. the low end).
Huh? Their lower-power datacenter GPUs are also single slot.


What I am saying is that 18 years ago you could have gotten a single-slot, passively cooled NVIDIA 7300 GT if you wanted a quiet, low-power PC for your living room. What's the thinnest, lowest of low-end cards that you can get today? What's its TDP? And more importantly, what's its price?
What's funny is you don't even know the real reason you're mad.

...but go ahead and blame the industry.

Yes, it has vastly better capabilities and speed and everything, but how many people actually NEED that?
You're talking to someone who mainly uses iGPUs. The rise of the iGPU is why there are so few low-end GPUs out there.

BTW, a guy at work had a Dell compact desktop and was wondering what he could do to get better gaming performance. I pointed him at a couple AMD RX 6400 graphics cards that were the most powerful thing that would fit his machine, at the time, and he got one. He was pleased with the performance improvement, but complained about how noisy the fan was. Those might've been single-slot... I forget.

Apple is the only adult in the room when it comes to power consumption and designing sensible (capable but not overpowered) machines for the masses.
At work we have Dell compact desktops (maybe that's what inspired my co-worker to buy one for home?) with i9 CPUs and they're okay. You can find plenty of mini-PCs and NUCs, too. A trend that seems to be growing in popularity is to build mini-PCs around upper-end laptop CPUs, some of which have a fairly decent iGPU.
 
Companies pay drastically lower electricity prices than consumers.
Not where I live. Here, commercial rates are much higher than residential rates.

In my country, companies get flat-rate billing, while citizens like me get penalized for using more electricity. You use more than 500 kWh a month? x2 price per kWh. More than 1,500? x4 for you, lad. More than (gasp!) 2,000 kWh a month? x8 per kWh. Oh, and did I mention that during the day the price is 2x the night price?
Seems a bit drastic. Is there a shortage?

something has to be done or soon desktop PCs will need a 3 kW power supply and a 3-phase power outlet.
It's really just high-end CPUs and GPUs, I think. Both Intel and AMD have decent 65 W options, with Intel even providing T-series CPUs that feature a 35 W PL1. The machines we use at work have 65 W i9s, although I set mine to use a PL1 of 80 W, which its cooling system can easily sustain in the climate-controlled room where I have it.

At home, I just built a mini-PC with an N97 (PL1 = 12 W) to use for the summer months, since it gets too hot to turn on my main Linux workstation in the upstairs room where I do most of my computing.
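On Linux, the PL1 tweak mentioned above can be done through the RAPL powercap interface -- a minimal sketch, assuming the intel_rapl driver is loaded (the sysfs path can vary per system, writing requires root, and BIOS or vendor tools are alternatives):

```python
# Read or adjust PL1 (the long-term package power limit) via Linux RAPL sysfs.
from pathlib import Path

PKG = Path("/sys/class/powercap/intel-rapl:0")   # package 0, typical location
PL1 = PKG / "constraint_0_power_limit_uw"        # long-term limit, in microwatts

def read_pl1_watts() -> float:
    return int(PL1.read_text()) / 1_000_000

def write_pl1_watts(watts: float) -> None:
    PL1.write_text(str(int(watts * 1_000_000)))   # needs root

print(f"Current PL1: {read_pl1_watts():.0f} W")
# write_pl1_watts(80)   # e.g. raise PL1 to 80 W, as described above
```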
 
The news I saw on Falcon Shores indicated the launch device was only going to be liquid-cooled. With any luck, this means they'll deliver the performance with a more focused product.
My guess is they decided to "pull out all the stops" on simplifying the product and getting their development team back on track. They can't afford another disaster like Ponte Vecchio.
I'd like to think all of the self-imposed failings with Ponte Vecchio mean whatever comes next will at least be smoother. I still can't get over the lack of built-in testing, which forced them to toss assembled chips.
Similarly in this case, a 1500 W solution will likely be running close to that limit because it's going to be under load most of the time. And I don't believe people buying enterprise or data-center hardware will think about undervolting or tweaking the power limit.
It depends on the scale of the implementation, because so many of these systems are becoming dense enough that they're running into rack power limits. Many systems are being optimized far more than ever before.
Yup. That's what I was going to say. The heterogeneous version was either delayed or canceled.
Definitely canceled.
What I am saying is that 18 years ago you could have gotten a single-slot, passively cooled NVIDIA 7300 GT if you wanted a quiet, low-power PC for your living room. What's the thinnest, lowest of low-end cards that you can get today? What's its TDP? And more importantly, what's its price?
It's called integrated graphics, but there are some A310/A380 cards at reasonable prices.
 
Power outage? The first to come back online are factories and data centers.
They tend to be located in or near power corridors, which works out pretty nicely since a lot of people don't like to live near high-voltage lines for some reason. Those transmission lines are always the ones that get fixed first, because they're used by the greatest number of people.

Rolling blackouts? Exemption for factories and data centers.
Geez, what country do you live in? Obviously, you're not going to plunge a datacenter into a blackout... but then, I'd expect most datacenters wouldn't be located in such electricity-scarce regions.
 
And I don't believe people buying enterprise or data-center hardware will think about undervolting or tweaking the power limit. As expected, this whole AI push will bump into a hard wall because of power limitations.
That hardware runs much slower by default; there are no gaming mobos for datacenters...
they don't have insane overclocking settings enabled by default.
Its PL2 is 92 W. That works out to 6.6 W per core, when boosting.
But you are not forced to boost to make the CPU work; it will work at its 35 W TDP as well, and you can even lock it down and prevent it from boosting altogether.
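For reference, a quick check of the per-core budgets being discussed -- a sketch assuming a 14-core part, since the exact CPU isn't named here:

```python
# Per-core power budgets implied by the figures above (14 cores is an assumption).
pl1_w, pl2_w, cores = 35, 92, 14
print(f"Sustained (PL1): {pl1_w} W / {cores} cores = {pl1_w / cores:.1f} W per core")
print(f"Boosting (PL2):  {pl2_w} W / {cores} cores = {pl2_w / cores:.1f} W per core")
```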
 
But you are not forced to boost to make the CPU work; it will work at its 35 W TDP as well, and you can even lock it down and prevent it from boosting altogether.
You're not forced to use the default Power Limits, either. I'm just saying what the manufacturer-recommended behavior is, which presumably is what most OEMs will implement.
 
Aren't integrated graphics in the more powerful desktop chips basically filling this role? (I'm a laptop person but this seems to be the trend from afar.)
Not really. They have become quite capable, but they still lag behind discrete graphics cards when it comes to gaming performance.
Not sure why you're talking about "desktop consumers", when Falcon Shores is a server chip.
I thought I explained that; did you skip over it?

I said that the power and cooling requirements for CPUs and GPUs over the last couple of years seem to be seeping from HPC / datacenter into the desktop market because chip manufacturers sell basically the same HPC / datacenter chips just a bit gimped for desktop use -- making more lithography masks to produce chips that target considerably different requirements is way more expensive and hence avoided.

TL;DR -- They are designing CPUs and GPUs for the maximum speed and power they can get away with (HPC / datacenter), not for power efficiency, ease of cooling, and a small footprint (desktop computing).
Nvidia workstation cards (formerly Quadro-branded) are often single-slot, until you get into the upper-end models. However, they tend to be cut back in weird ways, compared to the consumer cards built using the same GPU.
Those cards work only with Quadro drivers; they aren't exactly for gaming use. They are also priced exorbitantly compared to their 2.5- to 3.5-slot desktop counterparts.
Huh? Their lower-power datacenter GPUs are also single slot.
Are you seriously expecting consumers to buy datacenter GPUs?!?
What's funny is you don't even know the real reason you're mad.
I am mad because they are not producing different products for different segments but instead use a "one size fits all" design approach because that's cheaper for them, even though it damages society in terms of both e-waste (larger chips + larger heatsinks + larger cases == more raw material to extract, process, and eventually toss in a landfill) and wasted electricity (more CO2 == more climate damage). It's one of the most literal examples of socializing the costs of doing business.
...but go ahead and blame the industry.
Yeah, they are blameless. They just want more money; who can blame them for that?
You're talking to someone who mainly uses iGPUs. The rise of the iGPU is why there are so few low-end GPUs out there.
Those iGPUs are even worse -- they take up CPU die space, making CPUs more expensive. They also eat into the CPU's power and cooling budget while not being capable of even proper 1080p gaming.
At work we have Dell compact desktops (maybe that's what inspired my co-worker to buy one for home?) with i9 CPUs and they're okay. You can find plenty of mini-PCs and NUCs, too. A trend that seems to be growing in popularity is to build mini-PCs around upper-end laptop CPUs, some of which have a fairly decent iGPU.
What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine -- well-balanced power and performance. Compact desktops aren't too bad, but they suck at graphics if you want 1080p gaming.
 
I am mad because they are not producing different products for different segments but instead use a "one size fits all" design approach because that's cheaper for them, even though it damages society
I honestly wonder if monopolizing a business category would allow for a more diverse product range. The monopoly, having no competition, could afford to make more niche products, knowing it would get 100% of the sales in each niche.

Theoretically yes, but in practice undoubtedly not. Instead I think the only solution is to make it easier for small companies, sole proprietorships, and plain users to assemble working, high-quality goods from parts. But of course you need consistent sources of good parts for that too.

My wife has been running into this issue for years with respect to clothing, as so many manufacturers apparently use elastic/spandex sources that either come smelling of chemicals, or worse start smelling of chemicals only after washing. Making your own clothes is a process, and at most she'll strip out the elastic and sew in a replacement (before washing, so as not to contaminate the rest of the fabric with the chemical smell), but even that takes quite a bit of effort.

To get back on topic, at the very least a chiplet approach means that a honking 1.5-kilowatt server chip can be straightforwardly fragmented into a bunch of smaller consumer-grade chips. I think this is likely, as AI training will mostly be done in server farms, and the typical consumer needs a lot less power.
 
I honestly wonder if monopolizing a business category would allow for a more diverse product range.
NVIDIA already has this monopoly in the GPU space.

I do not see diversity there, just scaled-down HPC chips.
My wife has been running into this issue for years with respect to clothing, as so many manufacturers apparently use elastic/spandex sources that either come smelling of chemicals, or worse start smelling of chemicals only after washing.
Don't get me started on clothes. You couldn't buy a pair of pure cotton sports socks if your life depended on it.

/rant on

If I were a conspiracy theorist I'd say that clothing manufacturers have been bought by the plastic industry as they seem to be dumping ever-increasing amounts of plastic waste into our clothes AND WE ARE PAYING FOR THE PRIVILEGE OF WEARING PLASTIC TRASH ON OUR SKIN.

/rant off
 
/rant on

If I were a conspiracy theorist I'd say that clothing manufacturers have been bought by the plastic industry as they seem to be dumping ever-increasing amounts of plastic waste into our clothes AND WE ARE PAYING FOR THE PRIVILEGE OF WEARING PLASTIC TRASH ON OUR SKIN.

/rant off
Now made with recycled polyester!
https://www.theguardian.com/fashion...recycled-polyester-is-not-a-silver-bullet-yet
when a synthetic garment (eg recycled polyester, polyester, nylon or Lycra) is put through a washing machine, it sheds plastic microfibres that end up in our oceans, rivers and soil. The number of plastic microfibres entering the ocean by 2050 could accumulate to an excess of 22m tonnes and, alarmingly, a study conducted in 2015 found them in 67% of all seafood at fish markets in California.
https://www.sciencedirect.com/science/article/pii/S0045653522036682
However, circular use of plastics and textiles could lead to the accumulation of a variety of contaminants in the recycled product.
...
Various contaminants can end up in recycled plastic. Phthalates are formed during waste collection while flame retardants and heavy metals are introduced during the recycling process. Contaminants linked to textile recycling include; detergents, resistant coatings, flame retardants, plastics coatings, antibacterial and anti-mould agents, pesticides, dyes, volatile organic compounds and nanomaterials. However, information is limited and further research is required.
My cotton/poly Dickies would wear holes in the crotch area after about two years. I'm now about two years into wearing 100% cotton Carhartt pants and so far no wear in the crotch, and only a broken or worn belt loop or two from using them to hitch up my pants.

Polyester is probably cheaper, but it's also a way to greenwash clothes by making them part of the "circular recycled economy". Still, it's far better environmentally to have a set of virgin-cotton clothes that just doesn't fall apart than to use recycled clothes which do.

There is some good news from textile recycling, though: https://news.cornell.edu/stories/2023/07/blamed-fouling-environment-polyester-may-help-save-it
 
I am mad because they are not producing different products for different segments but instead use a "one size fits all" design approach because that's cheaper for them, even though it damages society in terms of both e-waste (larger chips + larger heatsinks + larger cases == more raw material to extract, process, and eventually toss in a landfill) and wasted electricity (more CO2 == more climate damage). It's one of the most literal examples of socializing the costs of doing business.
You're irrationally mad about "one size fits all" when the issue you have is caused not so much by that as by the endless chase for ever-more performance. Intel and AMD currently have the most efficient architectures they've ever had, but run them well outside their efficiency curves to maximize performance. Ada is the most efficient graphics architecture yet as well, but the demand for performance means increased power.
What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine -- well-balanced power and performance. Compact desktops aren't too bad, but they suck at graphics if you want 1080p gaming.
It's a mobile part slapped into a small box. Congrats, it's one size fits all, except it passes your purity test of being efficiency-oriented.

Compact desktops are perfectly fine at gaming, assuming they're using a mobile chip that actually has enough GPU cores. The primary advantage Apple has in this space is how much room they use for the GPU in their SoC, as well as the DRAM being part of the package (and the extra bandwidth on chips above the base model).
 
the power and cooling requirements for CPUs and GPUs over the last couple of years seem to be seeping from HPC / datacenter into the desktop market because chip manufacturers sell basically the same HPC / datacenter chips just a bit gimped for desktop use
This is largely incorrect. With CPUs, Intel uses different dies with cores that are modified from their client variant. This is most readily observed in things like the size of L2 cache and the presence of AVX-512. With Golden Cove, although the client core had AVX-512, the server version adds a second, FMA-capable port. The server version of Golden Cove also features AMX.

In AMD's case, it's true they use the same chiplets for desktop, server, and high-end laptops. However, as I mentioned, the per-core power usage of server CPUs containing those chiplets is much less than we see on the desktop products where AMD uses them.

Regarding GPUs, Nvidia's datacenter GPUs that utilize the same dies as their client cards all feature substantially less power consumption than the consumer/gaming versions! For instance, the L4 uses the same AD104 chip as the RTX 4070 range, whose cards use between 200 and 285 W, yet the L4 is rated at only 72 W. To take another example, the L40 uses the AD102 die also found in the RTX 4090; that card rather famously features a 450 W TDP, while the L40 is rated at only 300 W.
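Put as ratios, using only the TDP figures quoted above (a rough sketch; the exact board power of each 4070-class SKU varies):

```python
# TDP comparison using the figures cited in the post above.
pairs = {
    "AD104": {"datacenter (L4)": 72, "consumer (RTX 4070 class)": 200},
    "AD102": {"datacenter (L40)": 300, "consumer (RTX 4090)": 450},
}
for die, tdps in pairs.items():
    (dc_name, dc_w), (cl_name, cl_w) = tdps.items()
    print(f"{die}: {dc_name} at {dc_w} W vs {cl_name} at {cl_w} W "
          f"-> the datacenter part gets {dc_w / cl_w:.0%} of the consumer TDP")
```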

TL;DR -- They are designing CPUs and GPUs for the maximum speed and power they can get away with (HPC / datacenter), not for power efficiency, ease of cooling, and a small footprint (desktop computing).
Maybe you should actually try reading sometime, because you got this one dead wrong. The facts are easily verifiable.

Those cards work only with Quadro drivers,
Source? As I said, Nvidia dropped Quadro branding years ago.

they aren't exactly for gaming use. They are also priced exorbitantly compared to their 2.5- to 3.5-slot desktop counterparts.
You asked a question and got an answer. Now you're mad because you don't like the answer? You never stipulated these constraints. Ask a better question, next time.

Anyway, the point I made by citing them is that it's technically possible to sell modern graphics cards (that are certainly very capable at gaming, if not top performers) that fit one slot. It just turns out the benefits of making larger coolers are too juicy to pass up. Then, unsophisticated users might have started to associate cooler size with performance level, and thus we even get entry-level cards using two slots and sometimes with ridiculous triple-fan coolers.

I am mad because they are not producing different products for different segments but instead use a "one size fits all" design approach because that's cheaper for them
What's funny about this is that it sure seems like they took gaming GPUs and repackaged them to use in servers, rather than the other way around, like you seem to believe.

Those iGPUs are even worse -- they take up CPU die space, making CPUs more expensive.
The iGPUs in Ryzen 7000 chiplet products are of pretty negligible size.

They also eat into the CPU's power and cooling budget while not being capable of even proper 1080p gaming.
iGPUs are fine for non-gaming purposes, which is what I use them for. I like not needing a dGPU in such systems.

At my job, we've even used OpenVINO to run modest inferencing workloads on them, in addition to using them for transcoding. For all those things, they're very power-efficient and cost-effective.
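For the curious, targeting the iGPU with OpenVINO is only a few lines -- a minimal sketch, assuming the OpenVINO Python package and its GPU plugin are installed; the model path and input shape here are placeholders:

```python
import numpy as np
import openvino as ov

core = ov.Core()
print("Available devices:", core.available_devices)      # e.g. ['CPU', 'GPU']

model = core.read_model("model.xml")                      # placeholder IR model
compiled = core.compile_model(model, device_name="GPU")   # run on the iGPU

dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)      # placeholder input shape
output = compiled(dummy)[compiled.output(0)]              # one synchronous inference
print("Output shape:", output.shape)
```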

What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine
Oh, and you want to complain about iGPUs?
 