News Intel Lands US Energy Department Funding for 2kW Cooling Tech

I’m sure that “Cooling Operations Optimized for Leaps in Energy, Reliability, and Carbon Hyperefficiency for Information Processing Systems” will totally offset the DOUBLING of power draw by these future chips... Total clowns.
 
I’m sure that “Cooling Operations Optimized for Leaps in Energy, Reliability, and Carbon Hyperefficiency for Information Processing Systems” will totally offset the DOUBLING of power draw by these future chips... Total clowns.
Either one server will double in power, or they will use two servers for the same total power, but the number of CPUs/servers will increase either way.
Nobody wants things to be slower; everybody wants everything to be faster, and that means more processors.
This tech is supposed to reduce total power usage by decreasing the amount of power needed for cooling.
This is important, as Intel claims that cooling currently accounts for up to 40% of total data center energy use. Intel says the teams assisting in this project are aiming to improve the two-phase system that is in development. Its ambitious goal is to improve the 0.025 degrees C/watt capability of the existing system by 2.5X or more.
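
If I'm reading that °C/watt figure as a thermal resistance (degrees of temperature rise per watt of heat removed), here's a rough back-of-the-envelope for what a 2.5X improvement buys on a hypothetical 2 kW part; the chip power comes from the headline, the rest is my own sketch:

```python
# Rough sketch: treat the quoted °C/watt numbers as thermal resistance,
# i.e. temperature rise above the coolant per watt of heat removed.
# The 2 kW chip power is taken from the headline; everything else is assumed.

def temp_rise(power_w: float, resistance_c_per_w: float) -> float:
    """Temperature delta (in °C) between chip and coolant at a given heat load."""
    return power_w * resistance_c_per_w

chip_power_w = 2000   # headline target: ~2 kW per processor
current_r = 0.025     # existing two-phase system, °C per watt
target_r = 0.01       # stated goal: 2.5X better or more

print(temp_rise(chip_power_w, current_r))  # 50.0 °C rise above the coolant today
print(temp_rise(chip_power_w, target_r))   # 20.0 °C rise at the target

# A smaller rise means the coolant can run warmer for the same chip temperature,
# and warmer coolant needs less (or no) chiller work to dump the heat outdoors,
# which is where the claimed energy savings would come from.
```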
 
Either one server will double in power, or they will use two servers for the same total power, but the number of CPUs/servers will increase either way.
Nobody wants things to be slower; everybody wants everything to be faster, and that means more processors.
This tech is supposed to reduce total power usage by decreasing the amount of power needed for cooling.
Servers aren’t going to double in power. They’re going to double in power draw.

“Nobody wants things to be slower; everybody wants everything to be faster, and that means more processors.”

I don’t have a response to that that won’t get me banned. But in my more patient years, I’d have argued for the typical increases in IPC, memory access, smarter caches, etc.
 
Also, they wouldn't be enough on their own; the industry will always need more actual cores alongside whatever else.
Almost everything eventually reaches a "good enough" or "practical limit" point. We are seeing that in the consumer space with the ongoing slowdown in PC sales and the start of a decline in mobile device sales. Datacenter will likely follow a decade or two behind.
 
Datacenter will likely follow a decade or two behind.
Because there will be fewer people every year, and they will want their videos (or whatever) delivered much slower and in ever lower resolutions?
If we ever reach a point where the population will stay the same and every single person will already have a device and no newer higher resolution comes out and and and...
I don't see them slowing down for a long while unless they hit some physical limit.
 
If we ever reach a point where the population will stay the same and every single person will already have a device and no newer higher resolution comes out and and and...
I don't see them slowing down for a long while unless they hit some physical limit.
What higher resolutions? From a comfortable sitting distance away from the screen, there isn't that much of a quality increase in motion video going from 1080p to 4k. Only a tiny fraction of the population will ever bother with more than UHD; the practical limit of human vision has been passed already.
 
What higher resolutions? From a comfortable sitting distance away from the screen, there isn't that much of a quality increase in motion video going from 1080p to 4k. Only a tiny fraction of the population will ever bother with more than UHD; the practical limit of human vision has been passed already.
Just wait until we get the Total Recall "my whole wall is a TV screen" future...
8k is just the beginning.
This will be in our living rooms in the future; we might not live that long, but then again, prices drop really fast all the time.
 
Just wait until we get the Total Recall "my whole wall is a TV screen" future...
8k is just the beginning.
If you sit far enough away to comfortably see the whole image, 4k is about as good as the human eye can resolve, no matter how large the screen is, since you end up having to stand/sit farther from the screen roughly in proportion to its size. That's especially true once you throw motion at it, where you cannot fixate on minute pixel details before they get replaced by something else.
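
A quick sanity check on that, using the usual ~1 arcminute (~60 pixels per degree) figure for 20/20 acuity; the screen size and viewing distance are just example numbers I picked:

```python
import math

# Back-of-the-envelope acuity check. The ~60 pixels-per-degree figure for
# 20/20 vision and the example screen/distance numbers are assumptions.

def pixels_per_degree(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    """Average pixels per degree of visual angle across the screen width."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# A 65" 16:9 TV is roughly 57" wide; assume an 8-foot (96") couch distance.
print(pixels_per_degree(3840, 57, 96))   # ~116 ppd for 4k, well past ~60 ppd acuity
print(pixels_per_degree(1920, 57, 96))   # ~58 ppd for 1080p, right at the limit
```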
 
Integr8d said:
I’m sure that “Cooling Operations Optimized for Leaps in Energy, Reliability, and Carbon Hyperefficiency for Information Processing Systems” will totally offset the DOUBLING of power draw by these future chips... Total clowns.
The press release is ridiculous; it's not about total power at all, it's about power density. And this won't do anything to reduce the cooling costs of a datacenter, which would be mostly the air conditioning for the building.

Also, this makes no sense:
from 0.025 °C/watt to less than 0.01 °C/watt, or 2.5 times (or more) improvement in efficiency.
Does that mean that right now it costs 1000 watts to get a 25 °C delta, and they will "improve" it by using 1000 watts to get a 10 °C delta?
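
Spelling out the arithmetic behind my question, assuming °C/watt means temperature rise per watt (which may not be what Intel actually means), with 1000 W as my own example heat load:

```python
# My reading of the quoted figures: °C/watt = degrees of temperature rise
# per watt of heat. The 1000 W heat load is just an example number.
heat_load_w = 1000

delta_now = heat_load_w * 0.025   # 25 °C rise with the existing system
delta_goal = heat_load_w * 0.01   # 10 °C rise at the stated goal

print(delta_now, delta_goal)      # 25.0 10.0
```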
 
If you sit far enough away to comfortably see the whole image, 4k is about as good as the human eye can resolve, no matter how large the screen is, since you end up having to stand/sit farther from the screen roughly in proportion to its size. That's especially true once you throw motion at it, where you cannot fixate on minute pixel details before they get replaced by something else.
There is a subset of users like myself using large panel displays for monitors, both for real estate with productivity and for games. Point being, at my desk I can still see pixels playing FPS with keyboard and mouse. So I'd benefit from 8K, but after that I can't see myself needing any more pixels, as they are just barely visible now and I have to be 3 feet from my 43-inch monitor. The thing is, the number of users like myself is so niche that it won't make much sense for panel makers to keep pushing these higher pixel counts. The R&D just won't pay off.

Long story short, you're so on the money here. We have reached the point of diminishing returns, and when 8K drops mainstream those returns will have died altogether in the last niches like mine.
 
Long story short, you're so on the money here. We have reached the point of diminishing returns, and when 8K drops mainstream those returns will have died altogether in the last niches like mine.
I'm not foreseeing 8k becoming mainstream within the next 20+ years. Even UHD is likely another ~10 years away from displacing FHD as the dominant mainstream monitor resolution. Most PC/laptop users don't need/want it yet, at least not at any sort of premium.

I have a 50" UHD TV that I use as a secondary monitor but still use a 24" 1200p monitor as my primary monitor for pretty much everything. I tried using the TV as primary for a while but simply couldn't find any arrangement where I could comfortably use a screen that large this close. 32" is probably the most I may be able to comfortably use as a desk monitor.
 
If you sit far enough away to comfortably see the whole image, 4k is about as good as the human eye can resolve, no matter how large the screen is, since you end up having to stand/sit farther from the screen roughly in proportion to its size. That's especially true once you throw motion at it, where you cannot fixate on minute pixel details before they get replaced by something else.
I definitely agree, but I remember when people said 32-bit was the best graphics can get. Others said we'd never see a CPU faster than 1 GHz in our lifetime. I don't know much, but some people say never and then it happens. Framerates are another example. Some say 30 fps is the fastest the eye can see, but 300 fps has an impact on making moving images look smoother? Or something like that? There might be a benefit to 8k that I can't think of now.
 
Almost everything eventually reaches a "good enough" or "practical limit" point. We are seeing that in the consumer space with the ongoing slowdown in PC sales and the start of a decline in mobile device sales. Datacenter will likely follow a decade or two behind.
I look at it another way: perhaps it's precisely the decoupling between edge & cloud that has slowed the increase in demand on edge-device performance.

Another point to consider is that large parts of the world are still catching up in connectivity and adoption of computing devices and services. That will naturally fuel growth in datacenters, even if the amount of data (per capita) and the kinds of processing didn't change.

It also seems like you're completely ignoring AI, which is fueling innovation and demand for compute, at both the edge and in the cloud. And, as we've seen many times before, the cheaper it gets, the more places it will become economically viable to use, thus keeping the demand cycle going.

What higher resolutions? From a comfortable sitting distance away from the screen, there isn't that much of a quality increase in motion video going from 1080p to 4k. Only a tiny fraction of the population will ever bother with more than UHD; the practical limit of human vision has been passed already.
Take driverless cars, for instance. You thought video was only for people to film on their phones or consume on their TVs, but cars can easily produce half a dozen 4k+ datastreams each. They don't all need to be streamed to the cloud or analyzed, but then neither do videos from phones. It's just an example of how technology finds new applications over time.
 
If you sit far enough away to comfortably see the whole image, 4k is about as good as the human eye can resolve, no matter how large the screen is, since you end up having to stand/sit farther from the screen roughly in proportion to its size. That's especially true once you throw motion at it, where you cannot fixate on minute pixel details before they get replaced by something else.
I will agree with you on this point. For me to comfortably use my tiny, pixel-optimized fonts on a 4k monitor, it needs to be about 35" - 40". These are the same fonts I've used ever since I started using LCDs. Sure, I could use bigger fonts, but then I'd be wasting pixels.

Anyway, the problem with such a large monitor is that my current 32" 4k work monitor is already almost too big, physically. The larger it gets, the more I have to turn my head. Day after long day, that gets tiring and tedious for my aging neck. Not only that, but a larger (flat) screen requires more refocusing of the eyes, as you move between the center & corner. So, I'll agree with you that 4k is close enough to the natural limit that I don't plan on going further.

It's also wasteful of GPU resources to render higher resolutions, which is why we see hacks like variable-resolution, TAA, DLSS, etc.
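
For a sense of the scale involved, here are the raw pixel counts (a rough proxy only; real rendering cost doesn't scale perfectly linearly with pixels):

```python
# Raw pixels per frame: a first approximation of how much extra shading
# work each resolution step asks of the GPU.
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (UHD/4k)": (3840, 2160),
    "4320p (8k)": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x the pixels of 1080p")
```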
 
this won't do anything to reduce the cooling costs of a datacenter, which would be mostly the air conditioning for the building.
Why do you need to cool the entire building? Liquid cooling lets you route the heat to where it can be more efficiently disposed of. Not only that, but phase-change makes it more efficient than directly air-cooling the CPUs.
 
I definitely agree, but I remember when people said 32-bit was the best graphics can get. Others said we'd never see a CPU faster than 1 GHz in our lifetime.
These were both naive statements, even at the time, and one misunderstands perceptual acuity while the other mispredicts technology trends.

The professional video industry has defined standards for 10 bits per channel (and higher) since the mid-'90s. Back then, I remember hearing about how film telecines used 12-bit log, and one professional CGI package used 16-bit linear for film work. Nobody who ever said "32-bit was the best graphics can get" had studied color science or human visual perception.
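
To put numbers on why 8 bits per channel isn't the end of the road (just code-value counts; viewing-condition caveats glossed over):

```python
# Distinct code values per color channel at each bit depth. With only 256
# steps, smooth gradients (a sky, a studio backdrop) can show visible
# banding, which is part of why pro video moved to 10+ bits long ago.
for bits in (8, 10, 12, 16):
    print(f"{bits}-bit: {2**bits} levels per channel")
# 8-bit: 256, 10-bit: 1024, 12-bit: 4096, 16-bit: 65536
```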

Framerates are another example. Some say 30 fps is the fastest the eye can see
More ignorant statements by people who had no firm basis for making them. Not only that, but it also matters what kind of display technology you're talking about.
 
It also seems like you're completely ignoring AI, which is fueling innovation and demand for compute, at both the edge and in the cloud. And, as we've seen many times before, the cheaper it gets, the more places it will become economically viable to use, thus keeping the demand cycle going.
Just because one may be able to apply AI to a given subset of problems doesn't necessarily mean you should. I bet there will be no shortage of situations where AI isn't sufficiently consistent, accurate, and reliable to be trusted. Trust AIs too much and you may get rewarded with more spectacular failures than any handful of humans could unintentionally produce, since you may have exactly the same catastrophic failure modes repeated as many times as there are AIs running the same or similar enough models.
 
Just because one may be able to apply AI to a given subset of problems doesn't necessarily mean you should. I bet there will be no shortage of situations where AI isn't sufficiently consistent, accurate, and reliable to be trusted. Trust AIs too much and you may get rewarded with more spectacular failures than any handful of humans could unintentionally produce, since you may have exactly the same catastrophic failure modes repeated as many times as there are AIs running the same or similar enough models.
As recently as the mid-'90s, I saw an opinion piece in a reputable publication (though I forget which) making the case that computers hadn't been an economic net positive. It's easy to be a doubting Thomas when you lack good data.
 
These were both naive statements, even at the time, and one misunderstands perceptual acuity while the other mispredicts technology trends.

The professional video industry has defined standards for 10 bits per channel (and higher) since the mid-'90s. Back then, I remember hearing about how film telecines used 12-bit log, and one professional CGI package used 16-bit linear for film work. Nobody who ever said "32-bit was the best graphics can get" had studied color science or human visual perception.


More ignorant statements by people who had no firm basis for making them. Not only that, but it also matters what kind of display technology you're talking about.
So what you're saying is that it's very complicated to try to make generalized statements? Even the #-bit labeling for graphics was misunderstood by the general public. Like the NES was 8-bit, the SNES was 16-bit, the PS1 was 32-bit, the N64 64-bit. But in reality these numbers weren't relevant?
 
As recently as the mid-'90s, I saw an opinion piece in a reputable publication (though I forget which) making the case that computers hadn't been an economic net positive. It's easy to be a doubting Thomas when you lack good data.
I'd have a hard time believing anyone worth their salt could make that statement in the '90s and not get laughed out of the door, since computers of all sorts were already everywhere in the '90s and the benefits were already quite clear. That statement would have been more at home in the early-to-mid '80s.

Doubting AIs does not require data: the very nature of how AIs are trained by throwing data at the learning model and hoping for the best leaves the door wide open to unforeseen outcomes.
 
the very nature of how AIs are trained by throwing data at the learning model and hoping for the best leaves the door wide open to unforeseen outcomes.
This argument carried water until deep learning started outperforming classical approaches in many fields, about a decade ago. It doesn't have to be perfect. It just has to be better than the best alternatives.

Also, deep learning is not alone in its potential for misapplication. In the preceding era, many badly-conceived and buggy conventional implementations were deployed. The same practitioners likely to misapply deep learning would probably fare even worse, without it.

In favor of deep learning, there are better tools for training than
 
Integr8d said:

The press release is ridiculous; it's not about total power at all, it's about power density. And this won't do anything to reduce the cooling costs of a datacenter, which would be mostly the air conditioning for the building.
Fans are used to cool things, but they themselves use power, generate heat, and add no benefit to the ultimate goal of compute power. The fewer elements used just to move heat from one place to another, the better, and this system is doing that.