News Intel is looking into CPU overhead associated with Arc GPUs on older chips

I'm not trying to rip on old AMD CPUs so much as trying to show that HWU's arguments against Arc are so cherry-picked as to be nonsense that can be used to prove other points. Sure, the 2600 did worse in games than the i3-10100 (quad-core Skylake arch), but I'm sure it averages far better in games with a B580 than the worst 5 games in the worst scenarios HWU can reasonably pick.

At the other end of the spectrum, on the GPU-limited side, GN found little effective difference when switching CPUs with GPUs on the level of the B580.

I think most people are more likely to run into GPU limitations when using an entry-level GPU, and if they have a CPU that is barely compatible with W11 they are probably aware that it will hold back their system relative to the fastest CPUs out there in some situations. HWU is trying to pretend that these users will see HWU's results averaged across all the games they play, the way they play them, and that is a lie.

The elephant in the room IMO is that DLSS, XeSS and FSR4 are so much better than FSR 1/2/3 in image quality that it is like comparing FXAA to SSAA, while DLSS, XeSS and FSR4 can be compared to native, and the industry standard in tech reporting is to just brush this disparity under the rug. With Nvidia there are hundreds of supported games: https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/ , with XeSS 171: https://game.intel.com/us/xess-enabled-games/ , and with FSR4 33:
https://www.pcguide.com/news/fsr-4-games-list-a-complete-list-of-all-the-latest-supported-titles/ modern, popular games where every dGPU that can't use these has an effective 25% performance disadvantage. In these games a B580 performs like a 7700 XT or 6800, for example.
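To put a rough number on that 25% figure, here is a back-of-the-envelope sketch. The uplift percentages are illustrative assumptions, not benchmark data: if a modern upscaler gives a card roughly a third more fps at comparable image quality, the card that can't use it ends up roughly a quarter behind.

Code:
# Rough sketch: how an fps uplift from a modern upscaler (DLSS/XeSS/FSR4)
# translates into an effective deficit for a card that cannot use one.
# The uplift values below are illustrative assumptions, not measurements.

def effective_deficit(upscaler_uplift: float) -> float:
    """Fraction by which the non-upscaling card falls behind at matched image quality."""
    return 1.0 - 1.0 / (1.0 + upscaler_uplift)

for uplift in (0.25, 0.33, 0.50):
    print(f"+{uplift:.0%} fps from upscaling -> "
          f"~{effective_deficit(uplift):.0%} effective deficit without it")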

Almost anybody who has an Nvidia or Intel GPU and is at the point of choosing graphics settings for the best image/fps is going to use these, and the whole tech reporting industry has been ignoring this in benchmark rankings, pretending these settings don't exist. Smaller outfits that are basically one guy are not going to tip over the apple cart, and a lot of the big guys want no part in controversy. But some big outfits do sell controversy, like HWU and GN, and why they have been effectively lying to us by omission for years is starting to get suspect. I'm curious how they will handle this now that AMD is starting to get a fraction of the benefits that Nvidia and Intel already have here. Will they drop the RX 6000 and 7000 series cards to their appropriate performance levels? I doubt it; they will probably just bring it up as a selling point for why the RX 9000 series is better than everything else.

I'm not either, just further pointing out that SAM is badly implemented, which makes that CPU hobble worse.
 
There, fixed it for you.
Unless somebody is watching HUB or GN or whatever they won't even notice; they will still have a good time with the card.
Oh believe me, they will notice that their GPU seems to age faster than they are used to GPUs aging, then they might realize what's going on when they investigate... and most likely they will be angry at people like you who try to give cover for manufacturers this way.

It's rather difficult to have a "good time" when the game behaves erratically for occult reasons.

Someone with a low-end card/CPU isn't really in a position to play every AAA game right away, or ever, anyway; they have cheap equipment for a reason.
Actually they are, just not at the highest settings. AAA requirements are overstated for the most part.

With a 1050 Ti you can play almost everything released up to last year at the lowest settings; it won't look great, but it runs fine. It's only with UE5 games that you have significant trouble, and a lot of that has to do with how infamously unoptimised these games tend to be because of how heavily the devs lean on upscaling.

I actually got into an argument at one time about Starfield where the "genius" counter to my critique was "it won't look good on your PC so you won't enjoy it... so any criticism you have will be irrelevant"... I don't understand that kind of superficial thinking, but to each his own I guess.

A common problem is VRAM requirements, but that too never used to be a problem, so people don't know to look out for them, and almost no requirements blurbs even mention them anyway... so you get to the point where games crash because they ran out of VRAM... and this never used to be an issue; they would just run slower.

It's really odd to me how some people want to make AAA+ some kind of badge of social status rather than a badge of how much money was spent in production. "I spent so much more money so I can play AAAAA games!".... well that's great for you but how does that make you special? This is basically the PC version of overcompensating Ferrari drivers.... the drivers think the world of themselves and the rest of the world doesn't actually care.

It seems like your argument is that the tier of GPU is linked to which games will run... not how well they will run. This has never been the case, and if one day it is, a LOT of people will simply lose interest in gaming past a certain fidelity point because it will have become perpetually unaffordable. This is not 4K gaming, nor is it VR, where such requirements make sense... this is just AAA, and it does not HAVE to look the best it can look, it just has to run playably.

The problems Intel is facing have nothing to do with gaming addicts getting high and feeling good pretending to be other people and getting sucked into false realities... it's about dishonesty and stumbling costing the consumer money that could have been better spent. No one has a problem with people harming themselves with unwise purchases... the problem is when they flippantly try to make other people do it.
 
Intel GPUs aren't really like the standard GPUs we're used to, like AMD and Nvidia; they're a bit weird in that regard. They need SAM to work properly.
Yes, but the common consumer won't realize this... or even know what SAM is. Then when you try to explain to them why the GPU won't work for them, at best you will confuse them and they will react negatively in some way, either to you or to Intel.

It's as if people never learn from the past. Intel is playing with fire here and it had better play better, because it went from invincible in the CPU arena to untrusted, and its GPUs are not as good, so leaving Nvidia out of it, a lot of people that used to trust them have moved completely over to AMD, and with every blunder more are doing so.

The only thing Intel GPUs are a good choice for at the moment is a combo of a high-end current-gen CPU with full Win11 requirements met and ReBAR enabled, plus a video workload use case. For everything else it's just not where it should be. They simply assume too much and continually pay the price for their hubris. Their drivers for older games also still have problems, even though with a few hacks they have substantially improved... I think if they keep improving the way they have, after 10 years this won't matter anymore, but at the current moment it does. The question is... can they hold on for that long given their blundering and internal problems that sometimes briefly surface.
 
Oh believe me, they will notice that their GPU seems to age faster than they are used to GPUs aging, then they might realize what's going on when they investigate... and most likely they will be angry at people like you who try to give cover for manufacturers this way.
How is it going to age faster?!
The games that don't play well don't play well already, so from the start.
There is a possibility that it might be fixed in drivers at some point; there is no possibility that drivers are going to make it worse in the long run.
Actually they are, just not at the highest settings. AAA requirements are overstated for the most part.

With a 1050 Ti you can play almost everything released up to last year at the lowest settings; it won't look great, but it runs fine. It's only with UE5 games that you have significant trouble, and a lot of that has to do with how infamously unoptimised these games tend to be because of how heavily the devs lean on upscaling.

I actually got into an argument at one time about Starfield where the "genius" counter to my critique was "it won't look good on your PC so you won't enjoy it... so any criticism you have will be irrelevant"... I don't understand that kind of superficial thinking, but to each his own I guess.

A common problem is VRAM requirements, but that too never used to be a problem, so people don't know to look out for them, and almost no requirements blurbs even mention them anyway... so you get to the point where games crash because they ran out of VRAM... and this never used to be an issue; they would just run slower.

It's really odd to me how some people want to make AAA+ some kind of badge of social status rather than a badge of how much money was spent in production. "I spent so much more money so I can play AAAAA games!".... well that's great for you but how does that make you special? This is basically the PC version of overcompensating Ferrari drivers.... the drivers think the world of themselves and the rest of the world doesn't actually care.

It seems like your argument is that the tier of GPU is linked to which games will run... not how well they will run. This has never been the case, and if one day it is, a LOT of people will simply lose interest in gaming past a certain fidelity point because it will have become perpetually unaffordable. This is not 4K gaming, nor is it VR, where such requirements make sense... this is just AAA, and it does not HAVE to look the best it can look, it just has to run playably.

The problems Intel is facing have nothing to do with gaming addicts getting high and feeling good pretending to be other people and getting sucked into false realities... it's about dishonesty and stumbling costing the consumer money that could have been better spent. No one has a problem with people harming themselves with unwise purchases... the problem is when they flippantly try to make other people do it.
I was just saying that people don't have the money to buy all of the AAA titles... If you can only afford an Intel card and/or a very old CPU, and that Intel card/CPU doesn't play some games well, you are just not going to buy those games and go on living your life.

And I had a 1050 Ti until early last year; "fine" isn't the way I would word it. It would play anything, but the experience was often bad even when lowering to 720p.
 
Yes, but the common consumer won't realize this... or even know what SAM is. Then when you try to explain to them why the GPU won't work for them, at best you will confuse them and they will react negatively in some way, either to you or to Intel.

It's as if people never learn from the past. Intel is playing with fire here and it had better play better, because it went from invincible in the CPU arena to untrusted, and its GPUs are not as good, so leaving Nvidia out of it, a lot of people that used to trust them have moved completely over to AMD, and with every blunder more are doing so.

The only thing Intel GPUs are a good choice for at the moment is a combo of a high-end current-gen CPU with full Win11 requirements met and ReBAR enabled, plus a video workload use case. For everything else it's just not where it should be. They simply assume too much and continually pay the price for their hubris. Their drivers for older games also still have problems, even though with a few hacks they have substantially improved... I think if they keep improving the way they have, after 10 years this won't matter anymore, but at the current moment it does. The question is... can they hold on for that long given their blundering and internal problems that sometimes briefly surface.

While I agree with some points in it, it should just be stated on the selling pages, and on the box, that ReBAR is required, period. I have never seen anything about ReBAR being required on sites unless they have updated it. As you said, the average Joe isn't going to look too hard into it, but that's because you have CEOs that have nothing to do with tech in charge of a tech company; it's a complete brain-dead corporate move.

I own 2 of these cards and they work as intended though.
 
How is it going to age faster?!
The games that don't play well don't play well already, so from the start.
There is a possibility that it might be fixed in drivers at some point; there is no possibility that drivers are going to make it worse in the long run.
Uhm... no one buys GPUs that way. Any GPU you buy already plays everything available (at the highest settings) at the time of purchase and will do so for at least 5 years afterwards. Low-end users don't play above 1080p... all that will happen is they will have to progressively lower their settings as the years go on... and there will be a discernible rate to that. People that upgrade often don't do this... low-end users don't upgrade often.

I was just saying that people don't have the money to buy all of the AAA titles... If you can only afford an Intel card and/or a very old CPU, and that Intel card/CPU doesn't play some games well, you are just not going to buy those games and go on living your life.
Ah... but are low-end users prone to paying full price, or even paying at all, for the games they play? You might want to recheck some assumptions.

And I had a 1050 Ti until early last year; "fine" isn't the way I would word it. It would play anything, but the experience was often bad even when lowering to 720p.
I doubt anything at 720p will look good when the game is made with 2K-4K in mind... the worst experience I had was because of bad upscaling being reversed into downscaling... this also happened almost exclusively in Unreal Engine games. A bizarre kind of drunken blurriness I had never before experienced, even with the first 3D games; upscaling changed the paradigm in unexpected ways, which is also causing increased consumer hostility while no one is noticing.
 
While I agree with some points in it, it should just be stated on the selling pages, and on the box, that ReBAR is required, period. I have never seen anything about ReBAR being required on sites unless they have updated it. As you said, the average Joe isn't going to look too hard into it, but that's because you have CEOs that have nothing to do with tech in charge of a tech company; it's a complete brain-dead corporate move.

I own 2 of these cards and they work as intended though.
The only time I actually saw any direct reference to ReBAR in any official media I can remember was when there was that slight kerfuffle with Starfield for a week or so. Then it was mentioned in a handful of GPU reviews... and then everyone just started ignoring it again.
 
Intel GPUs aren't really like the standard GPUs we're used to, like AMD and Nvidia; they're a bit weird in that regard. They need SAM to work properly.
AMD cards used to get a 3% performance boost using SAM, and HWU had a bunch of info on it because there was a comment-section uproar when they didn't have it enabled in some of their testing.
The only time I actually saw any direct reference to ReBAR in any official media I can remember was when there was that slight kerfuffle with Starfield for a week or so. Then it was mentioned in a handful of GPU reviews... and then everyone just started ignoring it again.
I think I hear about ReBAR being needed in every single video about Arc GPUs. It's probably hard to find a video where they don't mention it somewhere. Also, back in the earlier days of PCIe, when it was going from gen 1 to gen 2 and 3, the PCIe version mattered for regular gaming and SLI/CFX. And the number of lanes also mattered. A couple of years back AMD released a GPU with 4 PCIe lanes and it did very poorly in PCIe gen 3 systems, which were very common at the time.

(PCIe) hardware standards advance and hardware uses those advancements. This is how it has always been with PCs.
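To put numbers on the 4-lane example above, here is a quick sketch of theoretical per-direction PCIe link bandwidth, using the standard per-lane figures after encoding overhead (real-world throughput is somewhat lower). It shows why a x4 card hurts so much more on gen 3 than on gen 4.

Code:
# Approximate one-direction PCIe bandwidth in GB/s per lane, after encoding
# overhead (8b/10b for gen 1-2, 128b/130b for gen 3 and later).
PER_LANE_GBPS = {1: 0.25, 2: 0.50, 3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Theoretical one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A 4-lane card on gen 3 vs gen 4, compared with a typical gen 3 x16 card:
for gen, lanes in ((3, 4), (4, 4), (3, 16)):
    print(f"PCIe gen {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")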
 
The problem with this reasoning is that I cannot remember any previous time where a GPU had hard, CPU-specific minimum requirements for anything.

This kind of thing is relatively new for the end user so it tends to cast a large shadow every time it pops up.

On the other hand, going by any statistic, most people only use features like ReBAR or upscaling if they are automatically enabled, and most people don't need them because they run 1080p. GN and HWU have fallen into a bubble of high-end hardware, and their struggle tends to be understanding the common user; because they themselves are so used to "the best", they get frustrated that it's not "the average".

However... what they do kind of understand here is the problems that low-end users have started to face. The low-end user is much more likely to run into problems these days because they are more likely to have older CPUs coupled with what we are being gaslit into believing are low-end GPUs... rather than hobbled mid-range GPUs, because the high end is rebranded luxury enthusiast gear and mostly actually experimental.

What this means is that on a lot of PCs, Intel GPUs will have lower performance and the end user won't understand why unless it's specifically pointed out to him. Most people don't upgrade their CPUs unless they for some reason need to upgrade their motherboards... I myself have never done a CPU or even a RAM upgrade; it's always been a complete platform one.

And herein lies the catch people have been falling into... it's not just PCIe speed that effectively matters for GPUs anymore; it's gotten much more complicated.
DX12 needs W10 and that has CPU requirements. A whole lot of features just won't work if you can only run DX11. FSR4 needs W11 for some of its supported games to run, and that has CPU requirements.

You probably can't remember because nobody makes a fuss over it because it is normal, except in the case of Intel GPUs. They are different, not part of the old club.
 
DX12 needs W10 and that has CPU requirements. A whole lot of features just won't work if you can only run DX11. FSR4 needs W11 for some of its supported games to run, and that has CPU requirements.

You probably can't remember because nobody makes a fuss over it because it is normal, except in the case of Intel GPUs. They are different, not part of the old club.
That's a requirement for software, not for hardware... and it's only directly applicable to Windows.

Not what I was referring to at all. You can easily fix software requirements... not so much hardware requirements. You can even, with a bit of effort, completely ignore Win11 requirements and still have a working Win11 system... and it actually gets faster in the process sometimes. You cannot do the same thing with hardware.
 
Uhm... no one buys GPUs that way. Any GPU you buy already plays everything available (at the highest settings) at the time of purchase and will do so for at least 5 years afterwards. Low-end users don't play above 1080p... all that will happen is they will have to progressively lower their settings as the years go on... and there will be a discernible rate to that. People that upgrade often don't do this... low-end users don't upgrade often.
The only games Intel GPUs are much worse at are the 5 that HUB shows, when you have an ancient CPU like the Ryzen 2600.
For everything else Intel GPUs are going to age exactly like any other GPU.
And someone with a 2600 is much more likely to upgrade their CPU during the lifetime of the GPU.
Ah... but are low-end users prone to paying full price, or even paying at all, for the games they play? You might want to recheck some assumptions.
That's my point; they are either going to wait for a deep discount, which will at least increase the chances of fixes having been released or of them having upgraded the CPU, or they will not pay for games, at which point their biggest worries are going to be loss of performance from mining viruses and not from having an Intel GPU.
That's a requirement for software, not for hardware... and it's only directly applicable to Windows.

Not what I was referring to at all. You can easily fix software requirements... not so much hardware requirements. You can even, with a bit of effort, completely ignore Win11 requirements and still have a working Win11 system... and it actually gets faster in the process sometimes. You cannot do the same thing with hardware.
At the moment we have no idea if this is a hardware issue; since it only happens in 5 games, it could easily be an issue with those games.
 
The only games Intel GPUs are much worse at are the 5 that HUB shows, when you have an ancient CPU like the Ryzen 2600.
For everything else Intel GPUs are going to age exactly like any other GPU.
And someone with a 2600 is much more likely to upgrade their CPU during the lifetime of the GPU.
If you actually think it's so old it needs to be replaced just because of its age... you are not understanding the problem. You also probably think most people upgrade CPUs every 2-3 years... have you ever considered you might live in a bubble? Most people actually upgrade both CPUs and GPUs on roughly the same schedule... they don't upgrade their CPUs more than they do their GPUs; often the CPUs stay much longer than the GPUs, and it's closer to 7-10 years apart.

It's only the mid-to-high-end speed chasers that upgrade multiple CPUs in the same motherboard. There are still a LOT of people with older CPUs, including myself with an Intel 8700... so if I got an Intel GPU now I would probably have this problem. Two years into the future I might need to start thinking about an upgrade, but currently my 8-year-old CPU and mobo, still on PCIe 3, are doing everything they need to... if it ain't broke, don't fix it if the fixing costs money.

And guess what, with my 7700 XT AMD GPU I won't be having the problems I likely would have had if I had gone the Intel route.

That's my point; they are either going to wait for a deep discount, which will at least increase the chances of fixes having been released or of them having upgraded the CPU, or they will not pay for games, at which point their biggest worries are going to be loss of performance from mining viruses and not from having an Intel GPU.
See above.... CPU upgrades are generally not as frequent as you assume and often less frequent than GPU upgrades.

At the moment we have no idea if this is a hardware issue; since it only happens in 5 games, it could easily be an issue with those games.
When it comes to performance issues like this, where there is smoke there is fire... a lot more games could be affected, and the excuse would be "they are old and our drivers need to mature more". My guess is it has to do with CPU feature sets, something like AVX2 not being supported, which has caught people out in the last 5 years, for example... Intel is at best assuming that whatever is needed is present, so they could have been surprised by this. I doubt it would be software related if it goes away just by having a faster, newer CPU.

Now it's possible that there is a driver fix for this.... but given that it was not fixed quickly I doubt it's an easy thing to do.

It is unfortunately in their best interest to just ignore it and hope it goes away as people upgrade, because I think they generally assume a 2-year upgrade cycle for consumers, the same as happens in enterprise. But then those damn dirty reviewers noticed and they did not get away with it. Intel forgets people are a lot less forgiving if it hurts their wallets... then they remember like elephants... even if they don't remember what exactly went wrong, they remember how badly it hurt.
 
If you actually think it's so old it needs to be replaced just because of its age... you are not understanding the problem. You also probably think most people upgrade CPUs every 2-3 years... have you ever considered you might live in a bubble? Most people actually upgrade both CPUs and GPUs on roughly the same schedule... they don't upgrade their CPUs more than they do their GPUs; often the CPUs stay much longer than the GPUs, and it's closer to 7-10 years apart.
Take a really wild guess at when the 2600X came out... (April 2018)
Edit: year correction
 
That's a requirement for software, not for hardware... and it's only directly applicable to Windows.

Not what I was referring to at all. You can easily fix software requirements... not so much hardware requirements. You can even, with a bit of effort, completely ignore Win11 requirements and still have a working Win11 system... and it actually gets faster in the process sometimes. You cannot do the same thing with hardware.
ReBAR is theoretically available in all PCIe gen 3 systems; they generally just need firmware to enable it, which is more difficult than bypassing W11 requirements, but not harder than bypassing W10 requirements AFAIK. There are far more games that will be completely unplayable on CPUs that don't support W10 than systems that don't support ReBAR with an Arc GPU, for the vast majority of bargain users. If someone is broke they don't have the time to change hardware requirements for software and firmware and write their own Linux patches that crash and lock up a lot of the time. The number of bargain users that have ReBAR far outnumbers the number of proficient Linux users.

And the ReBAR limitation is not a lock from using the hardware. You can still use an Arc card with older systems; it just suffers more performance loss than AMD, which suffers more performance loss than Nvidia. If you have an old Phenom II X6 that was released in 2010, you haven't been able to play the latest games at all on AMD or Nvidia hardware for the last 10 years. That CPU was cut off from the newest games on AMD and Nvidia GPUs 5 years after it was released, while on Arc GPUs 7-year-old CPUs are just suffering from reduced performance.
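For anyone who wants to check whether ReBAR is actually active rather than just theoretically available, here is a minimal sketch. It is Linux-only, assumes root and a pciutils build recent enough to decode the resizable BAR capability, and simply parses lspci output; on Windows, GPU-Z reports the same information.

Code:
# Minimal sketch: list GPUs and the current sizes of their resizable BARs by
# parsing `lspci -vv` output (Linux, run as root; assumes a recent pciutils).
# A BAR stuck at 256MB generally means ReBAR is not enabled in firmware.
import re
import subprocess

def rebar_report() -> None:
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    for device in out.split("\n\n"):
        lines = device.splitlines()
        if not lines:
            continue
        first_line = lines[0]
        if "VGA compatible controller" not in first_line and "3D controller" not in first_line:
            continue
        sizes = re.findall(r"BAR \d+: current size: (\w+)", device)
        if sizes:
            print(f"{first_line}\n  resizable BAR current size(s): {', '.join(sizes)}")
        else:
            print(f"{first_line}\n  no resizable BAR capability reported")

if __name__ == "__main__":
    rebar_report()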
 
ReBAR is theoretically available in all PCIe gen 3 systems [snip]
Perhaps I should rephrase. It's not ReBAR itself or Win11 itself that is the issue... it's that you need to disable CSM to enable it. And the whole UEFI booting thing is great in theory, but in practice, especially on older mobos, it can be a major PITA, so a lot of people just never use it... especially Secure Boot, which caused a LOT of confusion early on with how complicated it made installing more than one OS on the same PC at a time.
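As an aside, whether a machine actually booted in UEFI mode (i.e. with CSM off, which is what ReBAR needs) is easy to check. Below is a minimal sketch for the Linux side; on Windows, msinfo32 shows the same thing under "BIOS Mode".

Code:
# Minimal sketch: detect whether this Linux system was booted by UEFI firmware
# (CSM disabled) rather than legacy BIOS. The kernel only creates
# /sys/firmware/efi when it was started via UEFI.
from pathlib import Path

def booted_via_uefi() -> bool:
    return Path("/sys/firmware/efi").exists()

if __name__ == "__main__":
    if booted_via_uefi():
        print("UEFI boot (CSM off): ReBAR can be enabled if the board firmware supports it.")
    else:
        print("Legacy/CSM boot: ReBAR will not be available until the install is converted to UEFI booting.")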

Personally I have never tried UEFI again and don't feel like reinstalling my Windows and two Linux installs in case the conversion from CSM goes wrong... so I never enabled ReBAR and to date have never actually needed its sometimes-there, sometimes-not 10% improvement. Also, the number of things that can potentially go wrong with certificate management dissuades me in itself. I have had too much experience with failure to boot on past Windows versions, as well as Windows making Linux installs unbootable, to willingly place my head in the beartrap of allowing M$ to control my booting.

So it's possible it's better on newer mobos and I just won't know, but at the same time... something in me fundamentally rebels at even the thought of M$ controlling whether or not my OS can boot, even when that OS isn't made by M$.

So it will take a great deal of pressure to make me bend the knee to the requirements of Win11, which are actually not for Windows per se but for certain unnecessary features that they want to force me to use. A case of "come over here and make me".

Point is, a lot of people either don't have the expected hardware anyway or won't use it as desired even if they do... in the case of my CPU, everything other than its age weighs in its favour... and that floating, moving "minimum" requirement is perpetually subject to change based on the arbitrary whims of PR departments. I can actually install Win11 as required, but even if I play ball there is no guarantee that I will stay "supported".

There is a niche for Intel GPUs... and most people are not in it. Trying to deflect from this does no one any favours; it just looks dishonest. Now, Intel could develop themselves out of this niche... but they have not YET done this, and it's understandable why... but don't expect people not to react negatively to attempts to screw them over with dishonesty.
 
Perhaps I should rephrase. It's not ReBAR itself or Win11 itself that is the issue... it's that you need to disable CSM to enable it. And the whole UEFI booting thing is great in theory, but in practice, especially on older mobos, it can be a major PITA, so a lot of people just never use it... especially Secure Boot, which caused a LOT of confusion early on with how complicated it made installing more than one OS on the same PC at a time.

Personally I have never tried UEFI again and don't feel like reinstalling my Windows and two Linux installs in case the conversion from CSM goes wrong... so I never enabled ReBAR and to date have never actually needed its sometimes-there, sometimes-not 10% improvement. Also, the number of things that can potentially go wrong with certificate management dissuades me in itself. I have had too much experience with failure to boot on past Windows versions, as well as Windows making Linux installs unbootable, to willingly place my head in the beartrap of allowing M$ to control my booting.

So it's possible it's better on newer mobos and I just won't know, but at the same time... something in me fundamentally rebels at even the thought of M$ controlling whether or not my OS can boot, even when that OS isn't made by M$.

So it will take a great deal of pressure to make me bend the knee to the requirements of Win11, which are actually not for Windows per se but for certain unnecessary features that they want to force me to use. A case of "come over here and make me".

Point is, a lot of people either don't have the expected hardware anyway or won't use it as desired even if they do... in the case of my CPU, everything other than its age weighs in its favour... and that floating, moving "minimum" requirement is perpetually subject to change based on the arbitrary whims of PR departments. I can actually install Win11 as required, but even if I play ball there is no guarantee that I will stay "supported".

There is a niche for Intel GPUs... and most people are not in it. Trying to deflect from this does no one any favours; it just looks dishonest. Now, Intel could develop themselves out of this niche... but they have not YET done this, and it's understandable why... but don't expect people not to react negatively to attempts to screw them over with dishonesty.
I was just looking into it and W10 has dropped support for Haswell and earlier processors. Maybe similar time frames for AMD, and W10 is ending this year, making security an issue for normies using anything earlier than 8th-gen Intel or the 2600 for AMD. (I wonder if fTPM was enabled on the 2600 in the HWU video. That could really trash performance relative to the industry norm of having it disabled.)

So basically by the end of this year you will need a processor that probably has access to ReBAR just to have Windows security updates.

The ReBAR limitation for Intel will no longer be a mainstream limitation.

I understand you not liking Windows obsoleting processors. My daughter is running my old 4770K with 2400 C10 RAM rig and doesn't want an upgrade because it is close to as fast, for her purposes, as any modern processor. I just replaced the 1.4-liter 4670K mini PC in my office with a 13600KF, primarily because I isolate all of my bank dealings to that PC and I wanted it secure. The 4670K was fast enough for office work. My 10-year-old travel tablets still browse the web, play videos, and display books and Bluetooth audio just fine, but are no longer secure. My garage 1.2-liter music/video streamer PC has a 4790S, also obsolete for Windows. My 7700HQ/1050 Ti 17.3" laptop, which I picked up so my daughter could game while my mom watched her but which now resides in my kitchen on the fridge, mostly for recipes, is getting phased out.

A whole lot of good stuff is getting obsoleted from Windows. Maybe that old stuff will be switching to Linux, but probably not the tablets. But I'm not going to win a fight against Windows. One Windows update or the next they will probably just silently stop security updates, and getting some accounts hacked will probably cost me more than the PCs are worth.

But back to the graphics cards: seeing as how W10 is going away, and ReBAR support is practically a W11 requirement, as is full FSR4 support, if you are buying a graphics card today, would you care about issues that are going away in 6 months? The number of people that are going to have the oldest supported CPUs is relatively small, and anybody upgrading something older because of Windows will probably be getting at least a 12400F or 7400F, which are far too performant to have anywhere near the drop-off in any game relative to the 9800X3D with B580-class graphics cards.

Changes are coming to PC gaming.
 
A whole lot of good stuff is getting obsoleted from Windows. Maybe that old stuff will be switching to Linux, but probably not the tablets. But I'm not going to win a fight against Windows. One Windows update or the next they will probably just silently stop security updates, and getting some accounts hacked will probably cost me more than the PCs are worth.

But back to the graphics cards: seeing as how W10 is going away, and ReBAR support is practically a W11 requirement, as is full FSR4 support, if you are buying a graphics card today, would you care about issues that are going away in 6 months? The number of people that are going to have the oldest supported CPUs is relatively small, and anybody upgrading something older because of Windows will probably be getting at least a 12400F or 7400F, which are far too performant to have anywhere near the drop-off in any game relative to the 9800X3D with B580-class graphics cards.

Changes are coming to PC gaming.
It is as yet unclear what will actually happen; I think a lot more people will stick with Win10 than stuck with Win7... if only out of spite.

There are just too many drawbacks to Win11, to the point where only the tech illiterate will use it as-is instead of running something like ReviOS or switching to Linux. A common scenario is dual-booting Linux and a somewhat isolated Win10 for the handful of software that cannot run on Linux.

There is also the difference between theoretical ReBAR capability and it actually being enabled. FSR4... is still so new and so expensive that unless they successfully backport it via 3.5, as they seem to be trying, it will take quite a while for it to actually become the norm.
 
It is as yet unclear what will actually happen; I think a lot more people will stick with Win10 than stuck with Win7... if only out of spite.

There are just too many drawbacks to Win11, to the point where only the tech illiterate will use it as-is instead of running something like ReviOS or switching to Linux. A common scenario is dual-booting Linux and a somewhat isolated Win10 for the handful of software that cannot run on Linux.

There is also the difference between theoretical ReBAR capability and it actually being enabled. FSR4... is still so new and so expensive that unless they successfully backport it via 3.5, as they seem to be trying, it will take quite a while for it to actually become the norm.
Most tech normies can't see much difference between W11 and W10. I just talked to a guy less than 2 days ago and he let me know that he had upgraded his PC because his credit card got hacked and he heard it may have been from his outdated PC. He didn't have a lot of details to give other than that his PC was pretty old, so I'm guessing W7. The replacement PC is new so it is W11, but he didn't know. It's probably the same jump for him going from W7 to W10 or W11. And the guy definitely goes the name-brand prebuilt route.

Most of the world doesn't pay enough attention to care about the differences between 10 and 11. And they will likely upgrade if they see enough news stories about W10 no longer being secure, by default, come fall.

You can do whatever you want, as you evidently have enough interest and ability to set up whichever obscure OS and boot configuration makes you happy, but that doesn't mean the plebs will find it worth their while to do the same. I'm going to shop for whichever Linux is most convenient and also secure for my aging systems, but most won't put up with that mess. And I checked with my daughter this morning and she is definitely resisting me upgrading her PC. (And I really want to set her up with a 14400F/DDR4 setup because those are dirt cheap right now for how long they will last.)
 
Most tech normies can't see much difference between W11 and W10.
True enough, but I think there might be enough antipathy to keep Win10 at at least 40-50% share for quite a while... this will be worse than it was with both XP and Win7 because they have grown so much more arrogant.