News Intel Reportedly Kills AVX-512 on Alder Lake CPUs

WCCFTech's article speculates that the main reason is that the forthcoming non-K CPUs, which lack the efficiency cores, could otherwise be used in workstations and lower-end servers instead of more expensive parts.

Intel, kings of artificial segmentation.
 

Tech0000

Reputable
Jan 30, 2021
If it's only a CPU microcode update Intel is using to disable AVX-512, then it will be an excellent opportunity to mod every new BIOS that comes out (using UBU and the other excellent tools available over at the Win-Raid forum) with older working microcodes (with AVX-512 enabled). The cat is out of the bag and Intel cannot put it back in.
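Whether a given BIOS/microcode combination still exposes AVX-512 can be checked from a running Linux system by looking for the avx512f flag in /proc/cpuinfo. A minimal sketch (Linux-only; the has_avx512 helper is hypothetical, not from any post in this thread):

```python
# Minimal sketch: check whether the kernel still reports AVX-512 support
# after a BIOS/microcode change. Linux-only; relies on /proc/cpuinfo.

def has_avx512(cpuinfo_text: str) -> bool:
    """Return True if the first 'flags' line in /proc/cpuinfo lists avx512f
    (the AVX-512 Foundation bit that all other AVX-512 subsets require)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return "avx512f" in line.split(":", 1)[1].split()
    return False

if __name__ == "__main__":
    try:
        with open("/proc/cpuinfo") as f:
            print("AVX-512:", "enabled" if has_avx512(f.read()) else "disabled")
    except FileNotFoundError:
        print("/proc/cpuinfo not available on this platform")
```

Note this only reflects what the currently loaded microcode exposes; a later BIOS or OS microcode update can change the answer.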

The whole idea of disabling AVX-512, now that Intel finally got it to work as it should (power-efficient and performance-effective), is bizarre.

Are they thinking of artificially segmenting AVX-512 into higher-end SKUs to create more differentiation between HEDT/Xeon and Core i#? Nuts!

This will only piss off enthusiasts and professionals and drive them towards AMD. Competition is a great thing!
 

VforV

Respectable
BANNED
Oct 9, 2019
Why do they do this?

Because a scummy company will be scummy.

I love how both Nvidia and Intel prove me right about them over and over again. It's like a contest of who is the scummiest, Nvidia or Intel... well, Intel is not far off, but the crown goes to Nvidia for sure.

P.S. I'm giving Intel Arc the benefit of the doubt; since it's their first serious entry into the discrete GPU market, I think they will be good bois - at least this time around, for the first generation - to make a good impression. But I would not put it past them to be scummy from the 2nd gen onward with their GPUs too...
Their CPUs, though, they can shove where the sun don't shine, and Nvidia can do the same with their GPUs.
 
Nov 23, 2021
Intel doing Intel things.

Decreasing performance after the post-launch benchmark craze, for whatever reason, to leave room for newer generations to stand out more in future benchmarks seems like a thing to look out for...
 

JayNor

Honorable
May 31, 2019
Since Intel made the decision to exclude avx512 on Alder Lake, they likely also excluded testing avx512. That alone would be reason to tell the OEMs not to support it.

Yeah, there is also the possibility they removed the avx512 from the non-K or laptop chip layouts, since they are also making room for the larger GPU. They might also just power it down, if they are trying to extend laptop battery life as a priority.

I've seen a rumor that Intel is working on a 128 E-core Sierra Forest-AP CPU that will use a beefed-up Gracemont successor that adds avx512. Perhaps Intel just wants to wait until that generation of Atom cores is implemented so they can enable avx512 across all cores.

It will be interesting to see how AMD handles these issues with zen4 chips. Will they enable avx512 on laptop and risk dealing with extra thermal issues? They've excluded their pcie4 feature on the laptop chips, so it would be consistent for them to continue to limit laptop chips to avx2 operation.

But, we're forgetting probably the real reason for disabling avx512, which was that Linus didn't like it.
 
Nov 23, 2021
Since Intel made the decision to exclude avx512 on Alder Lake
It's not about excluding, it's about force disabling it.

It's blatant Intel benchmark fixing and market segmentation.

They don't want to support it? Push a toggle with a disclaimer. Outright disabling? Shame on you, Intel.
 
Nov 23, 2021
None of us know the entire story. Maybe they never tested it or maybe they even know there is a flaw.
As I said, toggle and disclaimer. Lots of performance can be gained even with a software workaround.
Maybe they plan on removing it from the silicon in the future and don't want the confusion of different versions.
Wow. Dude. Are you an Intel shill or smth? If a carmaker pushed an update removing 7th gear from operation in your gearbox "because in the newer iteration there will be no 7th gear", that's a class-action suit in the making, not "they don't want the confusion of different versions".

Intel pushes a ton of SKUs anyway and is perfectly fine with a nonsensical 12345xyz134 naming scheme, so I'm sure there's plenty of naming space to differentiate.

Even if it is market segmentation... that has been going on for 30 years. Remember the 386SX and 486SX CPUs. Or early Celerons.
So? Any reason to cut them slack? They are cutting features post-launch. Post-benchmarking. Post-evaluation. Outside of many customers' scope of comprehension. It's a hostile, anti-consumer move and must be called out as such.

Heck, all of us expected AVX-512 to be fused off in Alder Lake until about two months ago.
Of course, but not in a "yeah, that makes sense" fashion - rather a "yeah, as expected from Intel" one.
 

PCWarrior

Distinguished
May 20, 2013
OK let’s set the record straight.
  1. To use AVX-512 on Alderlake you had to disable the E-cores. Although you get a nice speedup with AVX-512 when it is being utilised, using the 8 E-cores on the 12900K almost achieves the same result. On y-cruncher the stock configuration (8P+8E cores) completes the benchmark in 28 seconds while the 8P cores with AVX-512 take 24.64 seconds. So yeah, you lose a 13.6% speedup, but it’s not really such a big loss.
  2. The functionality of AVX-512 on Alderlake is not being tested/validated by Intel. So the results you get by using AVX-512 on Alderlake may be incorrect anyway. Nobody serious about using AVX-512 should trust these results. Also, as it is not being tested by Intel, we don’t know whether the AVX-512 units on Alderlake chips are actually fully functional on 100% of the chips. Especially on lower-end SKUs. They are probably fine on an i9 12900K, but what about a 12400F?
  3. Intel never advertised or promised AVX-512 support on Alderlake, and they were very specific that AVX-512 is not supported whenever they were explicitly asked about it. They never posted benchmarks with AVX-512 either. Just because some motherboard vendors went rogue and made enabling AVX-512 possible on their boards, it doesn’t mean that it is supported or promised or advertised by Intel.
  4. Support for AVX-512 on motherboards of specific vendors actually created an unfair competitive advantage for the motherboard vendors that went rogue (ASUS/Gigabyte/ASRock) versus those (MSI/EVGA/Biostar/Supermicro) who didn’t. So those who did the right thing and followed Intel's rules found themselves at a competitive disadvantage. Why should Intel reward rogue motherboard vendors? Same story as what once happened with overclocking non-K and Xeon chips by completely unlocking BCLK overclocking.
  5. I remember people were very critical of motherboard vendors not following Intel's spec/guidelines and disabling power limits by default. Their argument was that it might affect system stability. But here we are, with the same people defending an even worse practice that is totally outside what Intel tests/validates and which might cause system crashes, incorrect results, or security issues.
  6. AVX-512 draws a lot of power. That is not a problem for overengineered Z690 boards, but what about the lower-end B-series or H-series motherboards? Especially the cheap ones.
  7. Artificial market segmentation by disabling a feature happens all the time. AMD and Nvidia do it as well. How do you think a Ryzen 5 or a Ryzen 3 or an RX 6800 or an RTX 3080 is made? By disabling fully functional cores that are already there on the full chip. Perhaps in 5-10% of the chips these cores are not working properly, but in 90-95% they are perfectly working cores that are just fused off to artificially create a lower-end product. And how about the Quadro/A-series/Radeon Pro cards? Which (apart from ECC) are exactly the same as the gaming cards but are sold for many times the price, just so you get the drivers that speed up some professional software?
  8. The AMD fanboys have of course found an opportunity to attack Intel. Ignoring that AVX-512 doesn’t exist at all on AMD CPUs and that they themselves have often criticised its usefulness and downplayed its importance. But now, lo and behold, how dare Intel “remove” a feature.
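The arithmetic in point 1 is easy to verify; a quick sketch using the quoted y-cruncher times (28 s stock vs 24.64 s with AVX-512):

```python
# Quick check of the y-cruncher numbers quoted in point 1:
# stock 8P+8E finishes in 28 s, 8P-only with AVX-512 in 24.64 s.
stock_s = 28.0        # 8P + 8E, no AVX-512
avx512_s = 24.64      # 8P only, AVX-512 enabled

speedup_pct = (stock_s - avx512_s) / avx512_s * 100
print(f"AVX-512 config is {speedup_pct:.1f}% faster")  # -> 13.6% faster
```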
 
Nov 23, 2021
OK let’s set the record straight.

Ad. 1 - That's up to the customer to decide.
Ad. 2 - That's up to the customer to decide.
Ad. 3 - That's up to the customer to validate for their use case and decide.
Ad. 4 - Wow. Just wow. "Unfair competitive advantage".
Ad. 5 - As I said multiple times, turn it off by default, and have the toggle display a disclaimer.
Ad. 6 - That's up to the customer to decide.
Ad. 7 - It's post-launch, dude. Noticed the difference?
Ad. 8 - Considering internet demographics, there's a high chance I've been using Intel CPUs exclusively for longer than you've been alive. Don't call me an AMD fanboy.

You paint a pretty nice picture of what you consider fine. There's only one thing I keep pointing out - you DON'T remove features from stuff you have ALREADY sold. No matter how you paint it, no matter how you spin it.

The moment it's sold, you can only fix it, make it better, or - if required by law or it's otherwise a security risk - offer a recall and reimburse. Forcing customers to avoid security updates when they want to keep using the product in a valid (even if not advertised or intended) way is indefensible.
 

PCWarrior

Distinguished
May 20, 2013
That's up to the customer to decide. [...] You DON'T remove features from stuff you have ALREADY sold. [...]
  1. Sigh. What don’t you understand about “this is not a feature that Intel has ever advertised or promised”? And AVX-512 support is NOT advertised by the motherboard vendors either (not even those that went rogue and made its enabling possible). For all intents and purposes, Intel is removing absolutely nothing post-launch.
  2. The customer has decided to buy a chip that does NOT support AVX-512 and at no point have they received a promise that they could use this feature. And the few tech outlets (e.g. Anandtech) that discovered that the feature can be enabled on some motherboards, and went on to post an article/video about it, explained to people that Intel’s position is that this feature is not supported/validated on Alderlake and that it is very likely to be disabled in the future with a BIOS update.
  3. It is up to the user to decide whether to update their BIOS/OS.
  4. Not using a product as intended means that you make a conscious decision to waive your right for support by the manufacturer. Much like if you delid your CPU and damage it in the process (or gets damaged during use) you cannot expect to get it replaced/fixed by warranty.
  5. Not sure what you find so incredulous about the unfair competitive advantage point. I am sure some enthusiasts would be buying ASUS and Gigabyte motherboards for their rogue AVX-512 support instead of MSI. Much like when ASRock allowed BCLK overclocking on non-K chips. Intel doesn’t want to piss off their board partners and OEMs, especially those who stick to the rules as they should.
  6. The customer does not really have the means to properly validate a feature. They may validate one sample, but that proves nothing. And of course, you cannot trust that they would even put in the effort to do so. The last thing you want is a bunch of ignorant morons badmouthing a feature based on their “bad experience”, creating clickbait headlines all over the internet, when their experience turns out to be on unsupported hardware. Even then some would have the cheek to accuse the manufacturer of supposedly not communicating it well to them.
  7. When something is validated, it is also validated for warranty purposes. Intel cannot guarantee that running AVX-512 on Alderlake (even if the AVX-512 units are functioning and the results are correct) won’t lead to the chip failing prematurely before the 3-year warranty period. The last thing you want is a bunch of people damaging their hardware and then casting doubt over a brand’s reliability.
  8. I didn’t call you specifically an AMD fanboy. My comment was general and referred to actions and comments posted by AMD fanboys. You took offence at it because you identified yourself as such by your actions.
 
Nov 23, 2021
You're pretty invested in whitewashing anti-consumer measures. Are you an Intel shareholder or smth?

No wall of text will change the fact that degrading the performance/feature set post-launch can only earn scorn. It doesn't matter what it said on the box. It was already sold. It was tested, people made their decisions, and they paid their money already.

That's such blatant bait-and-switch that it's beyond me how a sane person could defend it.
 

daviangel

Distinguished
Jul 17, 2006
Sigh. What don’t you understand about “this is not a feature that Intel has ever advertised or promised”? [...] For all intents and purposes, Intel is removing absolutely nothing post-launch. [...]
Yeah, it's already disabled at the instruction-set level, so you're not losing anything:
"To simplify the programming model and provide flexibility, the following design decisions were made on the instruction set level:
● All core types have the same instruction set.
● AVX512 is disabled on P-cores and not available on E-cores."
https://www.intel.com/content/www/u...h-gen-intel-core-processor-gamedev-guide.html

Deprecated Technologies
The processor has deprecated the following technologies and they are no longer supported:
  • Intel® Memory Protection Extensions (Intel® MPX)
  • Branch Monitoring Counters
  • Hardware Lock Elision (HLE), part of Intel® TSX-NI
  • Intel® Software Guard Extensions (Intel® SGX)
  • Intel® TSX-NI
  • Power Aware Interrupt Routing (PAIR)
Processor Lines that support Intel's Performance Hybrid Architecture do not support the following:
  • Intel® Advanced Vector Extensions 512 Bit

This info was all in the documentation well before you could even buy the hardware, so it's not like they were hiding anything or switched anything at the last minute.
Personally, more annoying is the loss of Intel SGX, since that means no more playing 4K Blu-ray discs in Windows - but then again, how many people are doing that?
https://www.cyberlink.com/support/faq-content.do?id=26690
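For reference, most of the technologies in the quoted deprecation list map to Linux /proc/cpuinfo flag names, so their presence or absence can be checked in one pass. A minimal sketch (Linux-only; the flag names are the kernel's, and features without a cpuinfo flag, such as PAIR and the Branch Monitoring Counters, are omitted):

```python
# Map the technologies deprecated on Alder Lake (per the quoted Intel doc)
# to their Linux /proc/cpuinfo flag names, then report which are present.
DEPRECATED = {
    "mpx": "Intel MPX",
    "hle": "TSX Hardware Lock Elision",
    "rtm": "TSX Restricted Transactional Memory",
    "sgx": "Intel SGX",
    "avx512f": "AVX-512 Foundation",
}

def report(cpuinfo_text: str) -> dict:
    """Return {flag: bool} for each deprecated feature's cpuinfo flag."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
            break
    return {f: f in flags for f in DEPRECATED}
```

On an Alder Lake system with stock microcode, all five flags should come back False.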
 