> If we have a hypothetical Intel GPU at 1.5 kW, and it's 4x as fast as an AMD GPU that burns 0.75 kW, then you'd be better off running that one Intel chip instead of four of the AMD chips, assuming you need the performance.

Except that for 95% of desktop consumers, even that hypothetical AMD 0.75 kW chip is 5x overkill.
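The trade-off in the quoted scenario is just arithmetic; a quick sanity check using the post's hypothetical numbers (none of these figures are real products):

```python
# Hypothetical figures from the post: one 1.5 kW chip that is 4x as fast
# as a 0.75 kW chip. Compare total power at equal throughput.
intel_power_kw = 1.5
amd_power_kw = 0.75
speedup = 4  # the Intel chip does the work of four AMD chips

# Power needed to match the single fast chip's throughput with slow chips:
amd_total_kw = amd_power_kw * speedup
print(amd_total_kw)                   # 3.0 kW for four AMD chips
print(intel_power_kw)                 # 1.5 kW for one Intel chip
print(amd_total_kw / intel_power_kw)  # 2.0 -- per unit of work, the big chip wins
```

So the higher absolute power draw can still be the more efficient option, which is the point the quoted post is making.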
> They have to pay for the power they use (not just electricity but space, cooling, whatnot), so they will want to make the best use of it. In that sense they do self-regulate, because they can only spend so much money on so much performance before it doesn't make sense anymore.

Companies have drastically lower electricity prices than consumers.
> Companies have drastically lower electricity prices than consumers.

Yup.
The real question is -- how many graphics cards have you seen (since 10x0 GTX series) which occupy a single PCI-Express slot nowadays?
> The real question is -- how many graphics cards have you seen (since the 10x0 GTX series) which occupy a single PCI-Express slot nowadays?

Aren't integrated graphics in the more powerful desktop chips basically filling this role? (I'm a laptop person, but this seems to be the trend from afar.)
> Power draw and efficiency are completely different things.

They are more power efficient in a sense: for the same performance, the 4090 draws significantly less power. But that is not how it works in reality. Most people buying a 4090 will run it at whatever power limit is set by default. Similarly in this case, a 1500 W solution will likely be running close to that limit because it's going to be under load most of the time. And I don't believe people buying enterprise or data center hardware will think about undervolting or tweaking the power limit. As expected, all this AI push will bump into a hard wall because of power limitations.
Consumer desktop parts, whether we are talking about CPUs or GPUs, are incredibly efficient nowadays. Just don't run them at absurd power limits and voila. For example, a 4090 at 220 W is faster than a 3090 at 520 W.
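The 4090-vs-3090 claim is really a performance-per-watt comparison; a toy calculation shows how the metric works (the relative performance numbers below are made up for illustration -- the post only says the power-limited 4090 is "faster"):

```python
def perf_per_watt(relative_perf: float, power_w: float) -> float:
    """Performance per watt; higher is better."""
    return relative_perf / power_w

# Illustrative assumption: the 220 W 4090 delivers 1.05x the performance
# of the 3090 at its stock 520 W limit.
rtx4090_limited = perf_per_watt(1.05, 220)
rtx3090_stock = perf_per_watt(1.00, 520)
print(rtx4090_limited / rtx3090_stock)  # ~2.5x better efficiency
```

Even small performance leads at less than half the power translate into large efficiency gaps, which is why power limits matter more than architecture alone.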
> Except that for 95% of desktop consumers even that hypothetical AMD 0.75 kW chip is a 5x overkill.

Not sure why you're talking about "desktop consumers", when Falcon Shores is a server chip.
> The real question is -- how many graphics cards have you seen (since the 10x0 GTX series) which occupy a single PCI-Express slot nowadays?

Nvidia workstation cards (formerly Quadro-branded) are often single-slot, until you get into the upper-end models. However, they tend to be cut back in weird ways compared to the consumer cards built using the same GPU.
> That's right -- they almost don't make those anymore because they are chasing power for HPC and datacenters, and that power has seeped into the consumer segment all the way to its very bottom (i.e. the low end).

Huh? Their lower-power datacenter GPUs are also single slot.
> What I am saying is that 18 years ago you could have gotten a single-slot, passively cooled NVIDIA 7300 GT if you wanted a quiet low-power PC for your living room. What's the thinnest, lowest of low-end cards that you can get today? What's its TDP? And more importantly, what's its price?

What's funny is you don't even know the real reason you're mad.
> Yes, it has vastly better capabilities and speed and everything, but how many people actually NEED that?

You're talking to someone who mainly uses iGPUs. The rise of the iGPU is why there are so few low-end GPUs out there.
> Apple is the only adult in the room when it comes to power consumption and designing sensible (capable but not overpowered) machines for the masses.

At work we have Dell compact desktops (maybe that's what inspired my co-worker to buy one for home?) with i9 CPUs and they're okay. You can find plenty of mini-PCs and NUCs, too. A trend that seems to be growing in popularity is to build mini-PCs around upper-end laptop CPUs, some of which have a fairly decent iGPU.
> Companies have drastically lower electricity prices than consumers.

Not where I live. Here, commercial rates are much higher than residential rates.
> In my country, companies get flat-rate billing, while citizens like me are penalized for using more electricity. You use more than 500 kWh a month? 2x price per kWh. More than 1,500? 4x for you, lad. More than (gasp!) 2,000 kWh a month? 8x per kWh. Oh, and did I mention that during the day the price is 2x the night price?

Seems a bit drastic. Is there a shortage?
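The tariff described in the quote can be sketched as a simple calculator. A minimal sketch, assuming (since the post doesn't say) that each multiplier applies only to the usage within its tier, and ignoring the day/night split; the base rate is a placeholder:

```python
BASE_RATE = 1.0  # placeholder price per kWh in some currency unit
# (upper bound of tier in kWh, price multiplier for usage in that tier)
TIERS = [(500, 1), (1500, 2), (2000, 4), (float("inf"), 8)]

def monthly_cost(kwh: float) -> float:
    """Bill a month's usage across the marginal tiers."""
    cost, lower = 0.0, 0.0
    for upper, mult in TIERS:
        if kwh <= lower:
            break
        used = min(kwh, upper) - lower  # usage falling inside this tier
        cost += used * BASE_RATE * mult
        lower = upper
    return cost

print(monthly_cost(400))   # 400.0  -- everything at the base rate
print(monthly_cost(1600))  # 500*1 + 1000*2 + 100*4 = 2900.0
```

Under this reading, a heavy user's marginal kWh costs 8x the base rate, which explains the "penalized" framing.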
> ...something has to be done or soon desktop PCs will need a 3 kW power supply and a 3-phase power outlet.

It's really just high-end CPUs and GPUs, I think. Both Intel and AMD have decent 65 W options, with Intel even providing the T-series CPUs that feature a 35 W PL1. The machines we use at work have 65 W i9's, although I set mine to use a PL1 of 80 W, which its cooling system can easily sustain in the climate-controlled room where I have it.
> My guess is they decided to "pull out all the stops" for simplifying the product and getting their development team back on track. They can't afford another disaster like Ponte Vecchio.

I'd like to think all of the self-imposed failings with Ponte Vecchio mean whatever comes next will at least be smoother. I still can't get over the lack of built-in testing, which caused them to have to toss assembled chips.
> Similarly in this case, a 1500 W solution will likely be running close to that limit because it's going to be under load most of the time. And I don't believe people buying enterprise or data center hardware will think about undervolting or tweaking the power limit.

It depends on the scale of the implementation, because so many of these systems are becoming dense enough that they're running into rack power limits. Many systems are being way more optimized than ever before.
> Yup. That's what I was going to say. The heterogeneous version was either delayed or canceled.

Definitely canceled.
> What I am saying is that 18 years ago you could have gotten a single-slot, passively cooled NVIDIA 7300 GT if you wanted a quiet low-power PC for your living room. What's the thinnest, lowest of low-end cards that you can get today? What's its TDP? And more importantly, what's its price?

It's called integrated graphics, but there are some A310/A380 cards at reasonable prices.
> Power outage? The first to come back online are factories and data centers.

They tend to be located in or near power corridors, which works out pretty nicely since a lot of people don't like to live near high-voltage lines for some reason. Those transmission lines are always the ones that get fixed first, because they're used by the greatest number of people.
> Rolling blackouts? Exemption for factories and data centers.

Geez, what country do you live in? Obviously, you're not going to plunge a datacenter into blackout... but then, I'd expect most datacenters wouldn't be located in such electricity-scarce regions.
> Second, some logic: the 13600T has 2.5 W per core; 1500 W on a single chip? And people think it's too much, lol.

Its PL2 is 92 W. That works out to 6.6 W per core, when boosting.
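The per-core figures in this exchange are just division (the i5-13600T has 6 P-cores and 8 E-cores, 14 total):

```python
cores = 14  # i5-13600T: 6 P-cores + 8 E-cores
pl2_watts = 92  # short-duration boost limit
base_watts = 35  # the T-series base power the quote used

print(round(pl2_watts / cores, 1))   # 6.6 W per core when boosting
print(round(base_watts / cores, 1))  # 2.5 W per core at base power
```

This also shows why a single per-core number is misleading: the same chip is 2.5 W/core or 6.6 W/core depending on which power limit it is running against.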
> Cerebras uses 15 kW for a single chip (hack)

It's different when your "chip" is an entire wafer. IIRC, those systems are indeed pretty energy-efficient.
> ...And I don't believe people buying enterprise or data center hardware will think about undervolting or tweaking the power limit. As expected, all this AI push will bump into a hard wall because of power limitations.

That hardware is running much slower by default; there are no gaming mobos for datacenters.
> Its PL2 is 92 W. That works out to 6.6 W per core, when boosting.

But you are not forced to boost to make the CPU work; it will work at its 35 W TDP as well, and you can even lock it down and prevent it from boosting altogether.
> But you are not forced to boost to make the CPU work; it will work at its 35 W TDP as well, and you can even lock it down and prevent it from boosting altogether.

You're not forced to use the default power limits, either. I'm just saying what the manufacturer-recommended behavior is, which presumably is what most OEMs will implement.
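For what it's worth, on Linux the configured limits both posters are discussing are visible (and, with root, writable) through the powercap sysfs interface. A minimal read-only sketch, assuming the usual intel-rapl zone layout; the zone index can differ per system:

```python
from pathlib import Path

# Typical sysfs layout of the Linux "intel-rapl" powercap driver; the
# package zone path is an assumption and may differ on your machine.
PKG = Path("/sys/class/powercap/intel-rapl:0")

def uw_to_watts(uw: int) -> float:
    """The powercap interface reports limits in microwatts."""
    return uw / 1_000_000

def read_limit_watts(constraint: int) -> float:
    """Read PL1 (constraint 0) or PL2 (constraint 1) in watts."""
    raw = (PKG / f"constraint_{constraint}_power_limit_uw").read_text()
    return uw_to_watts(int(raw))

if PKG.exists():  # only on Linux systems with the intel-rapl driver
    print("PL1:", read_limit_watts(0), "W")
    print("PL2:", read_limit_watts(1), "W")
```

Writing a new value to the same files (as root) is how tools lower PL1 below the OEM default, which is the "not forced to use the default power limits" point above.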
> Aren't integrated graphics in the more powerful desktop chips basically filling this role? (I'm a laptop person, but this seems to be the trend from afar.)

Not really. They have become quite capable, but they still lag behind discrete graphics cards when it comes to gaming performance.
> Not sure why you're talking about "desktop consumers", when Falcon Shores is a server chip.

I thought I explained that; have you skipped over it?
> Nvidia workstation cards (formerly Quadro-branded) are often single-slot, until you get into the upper-end models. However, they tend to be cut back in weird ways compared to the consumer cards built using the same GPU.

Those cards work only with Quadro drivers; they aren't exactly for gaming use. They are also priced exorbitantly compared to their 2.5-3.5-slot desktop counterparts.
> Huh? Their lower-power datacenter GPUs are also single slot.

Are you seriously expecting consumers to buy datacenter GPUs?!?
> What's funny is you don't even know the real reason you're mad.

I am mad because they are not producing different products for different segments, but instead use the "one size fits all" design approach because that's cheaper for them, even if it does damage to society in terms of both more e-waste (larger chips + larger heatsinks + larger cases == more raw material to extract, process, and toss in a landfill in the end) and wasted electricity (more CO2 == more climate damage). It's one of the most literal examples of socializing the costs of doing business.
> ...but go ahead and blame the industry.

Yeah, they are blameless. They just want more money; who can blame them for that?
> You're talking to someone who mainly uses iGPUs. The rise of the iGPU is why there are so few low-end GPUs out there.

Those iGPUs are even worse -- they take up CPU die space, making CPUs more expensive. They also eat into the CPU's power and cooling budget, while they aren't capable of even proper 1080p gaming.
> At work we have Dell compact desktops (maybe that's what inspired my co-worker to buy one for home?) with i9 CPUs and they're okay. You can find plenty of mini-PCs and NUCs, too. A trend that seems to be growing in popularity is to build mini-PCs around upper-end laptop CPUs, some of which have a fairly decent iGPU.

What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine -- well-balanced power and performance. Compact desktops aren't too bad, but they suck at graphics if you want 1080p gaming.
> I am mad because they are not producing different products for different segments but instead use the "one size fits all" design approach because that's cheaper for them even if it does damage to society.

I honestly wonder if monopolizing a business category would allow for a more diverse product range, in that the monopoly, having no competition, could afford to make more niche products, knowing that it would get 100% of the sales for each niche.
> I honestly wonder if monopolizing a business category would allow for a more diverse product range.

NVIDIA already has this monopoly in the GPU space.
> My wife has been running into this issue for years with respect to clothing, as so many manufacturers apparently use elastic/spandex sources that either come smelling of chemicals, or worse, start smelling of chemicals only after washing.

Don't get me started on clothes. You can't buy pure cotton sports socks if your life depended on it.
Now made with recycled polyester!

/rant on
If I were a conspiracy theorist, I'd say that clothing manufacturers have been bought by the plastic industry, as they seem to be dumping ever-increasing amounts of plastic waste into our clothes AND WE ARE PAYING FOR THE PRIVILEGE OF WEARING PLASTIC TRASH ON OUR SKIN.
/rant off
> when a synthetic garment (e.g. recycled polyester, polyester, nylon or Lycra) is put through a washing machine, it sheds plastic microfibres that end up in our oceans, rivers and soil. The number of plastic microfibres entering the ocean by 2050 could accumulate to an excess of 22m tonnes and, alarmingly, a study conducted in 2015 found them in 67% of all seafood at fish markets in California.

https://www.sciencedirect.com/science/article/pii/S0045653522036682

...However, circular use of plastics and textiles could lead to the accumulation of a variety of contaminants in the recycled product.
> Various contaminants can end up in recycled plastic. Phthalates are formed during waste collection, while flame retardants and heavy metals are introduced during the recycling process. Contaminants linked to textile recycling include: detergents, resistant coatings, flame retardants, plastics coatings, antibacterial and anti-mould agents, pesticides, dyes, volatile organic compounds and nanomaterials. However, information is limited and further research is required.

My cotton/poly Dickies would wear holes in the crotch area after about two years. I'm now about two years into wearing 100% cotton Carhartt pants, and so far there's no wear in the crotch, and only a broken or worn belt loop or two from using them to hitch up my pants.
> Don't get me started on clothes. You can't buy pure cotton sports socks if your life depended on it.

They aren't sports socks per se, but most or all of them are 100% cotton. You might have luck here: https://www.cottonique.com/collections/socks-booties
> I am mad because they are not producing different products for different segments but instead use the "one size fits all" design approach because that's cheaper for them even if it does damage to society in terms of both more e-waste (larger chips + larger heatsinks + larger cases == more raw material to extract, process and toss in a landfill in the end) and wasted electricity (more CO2 == more climate damage). It's one of the most literal examples of socializing the costs of doing business.

You're irrationally mad about "one size fits all" when it's not even that so much as the endless chasing of increasing performance causing the issue you have. Intel and AMD currently have the most efficient architectures they've ever had, but run them well outside their efficiency curves to maximize performance. Ada is the most efficient graphics architecture yet as well, but the demand for performance means increased power.
> What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine -- well-balanced power and performance. Compact desktops aren't too bad, but they suck at graphics if you want 1080p gaming.

It's a mobile part slapped into a small box. Congrats, it's "one size fits all", except it passes your purity test of being efficiency-oriented.
> ...the power and cooling requirements for CPUs and GPUs over the last couple of years seem to be seeping from HPC / datacenter into the desktop market because chip manufacturers sell basically the same HPC / datacenter chips just a bit gimped for desktop use.

This is largely incorrect. With CPUs, Intel uses different dies with cores that are modified from their client variant. This is most readily observed in things like the size of the L2 cache and the presence of AVX-512. With Golden Cove, although the client core had AVX-512, the server version adds a second, FMA-capable port. The server version of Golden Cove also features AMX.
> TL;DR -- They are designing CPUs and GPUs for (HPC / datacenter) maximum speed and power they can get away with, not for (desktop computing) power efficiency, ease of cooling, and small footprint.

Maybe you should actually try reading sometime, because you got this one dead wrong. The facts are easily verifiable.
> Those cards work only with Quadro drivers,

Source? As I said, Nvidia dropped the Quadro branding years ago.
> they aren't exactly for gaming use. They are also priced exorbitantly compared to their 2.5-3.5-slot desktop counterparts.

You asked a question and got an answer. Now you're mad because you don't like the answer? You never stipulated these constraints. Ask a better question next time.
> I am mad because they are not producing different products for different segments but instead use the "one size fits all" design approach because that's cheaper for them.

What's funny about this is that it sure seems like they took gaming GPUs and repackaged them to use in servers, rather than the other way around, like you seem to believe.
> Those iGPUs are even worse -- they take up CPU die space, making CPUs more expensive.

The iGPUs in Ryzen 7000 chiplet products are of pretty negligible size.
> They also eat into the CPU's power and cooling budget, while they aren't capable of even proper 1080p gaming.

iGPUs are fine for non-gaming purposes, which is what I use them for. I like not needing a dGPU in such systems.
> What I was saying is that, for example, the Mac Mini M2 is a sensibly designed machine.

Oh, and you want to complain about iGPUs?