Discussion: The main difference with PCs now compared to how they used to be.

rambo919
Sep 21, 2023
I think I figured out what's changed with the HW market, and it both annoys and almost excites me.

I had come to believe that you could actually have an all-purpose PC if you curbed your expectations, because that's what I got when I did my last upgrade in 2019-2020.

Intel 8700, 32GB RAM, 1050ti. By the standards of the time it could do everything acceptably: VMs, 1080p gaming, video transcoding, etc.

In the past this was not quite as clear cut: you needed specific CPUs for video tasks, specific RAM for VMs, specific GPUs for image tasks... whatever else. But even then the differences were mostly not so stark.

Now though everything is back to where it was and then some: specific CPUs for gaming, specific CPUs and/or GPUs for video tasks, specific GPUs for non-RTX gaming, specific GPUs for RTX gaming, and RAM is just bonkers complicated. Some CPUs are bad for the latest games, some CPUs are bad for older games. Then there is the complicated matrix of specific GPUs for each grade of output resolution, which must be matched to CPUs by some other arcane matrix, because there are so many more types of CPU than there used to be. You used to be able to get away with OEM PSUs; now they are basically obsolete, even though branded models can cost three or more times what OEM models do. And then finally there is the power overhead that just keeps on climbing, as power costs keep on climbing too, and most homes do not have power infrastructure geared to that level of constant draw.
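To put rough numbers on that last point, here is a back-of-envelope sketch; the wattage, hours and tariff are all assumptions, not measurements:

Code:
# Back-of-envelope yearly cost of a PC's draw - every input is an assumption.
draw_watts = 450       # assumed average wall draw while gaming
hours_per_day = 4      # assumed usage
price_per_kwh = 0.30   # assumed electricity tariff, $/kWh

kwh_per_year = draw_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.0f}/year")
# ~657 kWh/year -> ~$197/year; double the draw and the bill doubles with it.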

The problem now, in short... hyper over-specialization, and EVERYTHING is more expensive than it used to be. It would all be manageable if it were not so expensive and, frankly, completely out of anyone's control. Now throw into the mix the insanity from M$, which lives in an elitist bubble with a new AI religion, while over in Linuxland everyone is joining one-world cults of one form or another... it's getting strange and confusing out there.

I think this is the case, and realizing it will clear up a lot of confusion regarding what HW to get. On the other hand, if you actually want an all-rounder, you can only have one if you are wealthy. People used to be able to build one on a relative budget; now they are angry that they cannot do it anymore, but they don't know exactly why they are angry, so it comes out as impotent, incoherent rage.

The trade-offs themselves don't really help. Even if you go for the cheapest HW for your main use case... most likely, when the final bang-for-buck calculation is done, it's still more expensive than an all-rounder used to be. Most people are just not geared for this level of systems integration, and there is a lot of market instability, which only further aggravates everyone... something is going to have to give; this cannot keep up indefinitely.
 
This reads like you came up with a theory and told yourself a story to make it work.

If an 8700, 32GB of RAM and a 1050ti were a multipurpose system then, the modern equivalent from AMD/Intel/Nvidia would be just as capable in every scenario now, if not more so, across the board.

Unless the goal is well over 100FPS at the kind of settings a Nvidia 50 or 60 (maybe even 70) series card was never capable of, there's no specialization. None. You buy a mid-range CPU, you buy a GPU if that matters to you... Practically any RAM will work, so long as we're talking about DDR4-3200 or better. Literally any DDR5. Buy something on the QVL list for your board and stop thinking about it, same as always. 99.999% of people who use computers will never know what CAS latency is, and even fewer will ever care. Unless you're running a prebuilt with one stick of RAM or you've got too many Chrome tabs open on the regular, 32GB of /something/ is all nearly anyone really "needs". Power user, or can't be bothered to close Chrome tabs? Double it, you're done. Yes, pricing is going back up now, but not long ago, at least here in the US, that would run you about $100.

PSUs haven't changed significantly unless you're on the bleeding edge, same as everything else. Most people weren't buying 1kW+ PSUs back in the 2000s to run tri-SLI 8800GTXs, and most people will be fine with half that now, same as always... and you'll need the best you can get, to the tune of $200-300, if you're buying a top end card now. Yes, multi-GPU is gone, you can just get more power in a single card. Nobody's making you buy it, though. If you were ever happy with a 50 series card, stuff like raytracing never mattered to you. Sure, it wasn't "RTX ON", but I mean its equivalent at the time - HairWorks used to be the thing that tanked frame rates and people debated if it was needed or really added anything. Cloth simulation. PhysX. Heck, for a brief moment there if you were a really hardcore bleeding edge PC gamer you had a second, usually weaker GPU to run additional monitors and handle PhysX calculations.
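Back-of-envelope, the sizing rule looks like this; the component wattages below are assumptions for a hypothetical mid-range build, not measured figures:

Code:
# Naive PSU sizing: sum the component draw, then add headroom - all assumed.
cpu_watts = 125   # assumed CPU package power under load
gpu_watts = 220   # assumed GPU board power
rest_watts = 75   # board, RAM, drives, fans (assumed)
headroom = 1.4    # ~40% margin for transient spikes / efficiency sweet spot

load = cpu_watts + gpu_watts + rest_watts
print(f"load ~{load} W, suggested PSU ~{load * headroom:.0f} W")
# load ~420 W, suggested PSU ~588 W -> a quality 550-650 W unit covers it.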

The degree of specialization you're describing just doesn't exist - the "gaming" CPUs aren't soooo much better that the "office" CPUs are suddenly worthless for gaming. Only the most hardcore enthusiast is going to tell you buying anything other than an X3D chip for gaming is crazy... And they're not the ones buying 50 series Nvidia cards. You don't need some kind of decision matrix to make CPU and motherboard choices either. If you're going Intel you get their current socket and either a budget chip or the fastest your priorities allow. If AMD and budget isn't your first and only real concern, AM5. Buy the board with the features you need/want. Same as always.

Buying "slow" RAM loses you a few FPS, it's almost never the difference between something being playable and not. A 500w PSU will still run a low-mid system like you're describing. Everything is vastly more power efficient than it has ever been. Literally everything.

Yes, pricing is terrible, but if you account for inflation (admittedly insane, but that's a "the world is broken" kind of problem) prices haven't changed that much outside the absolute top end of the market. Top end madness in my experience is cyclical. It'll probably calm down again soon.
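The inflation point is easy to sanity-check; the CPI figures below are rough assumptions:

Code:
# Adjust an old launch price for inflation - CPI values are rough assumptions.
price_2016 = 139   # assumed launch price of a 1050 Ti class card
cpi_2016 = 240.0   # approximate US CPI, 2016 (assumption)
cpi_2024 = 314.0   # approximate US CPI, 2024 (assumption)

print(f"${price_2016} in 2016 is ~${price_2016 * cpi_2024 / cpi_2016:.0f} today")
# ~$182 today: inflation explains roughly a third, not a 4x jump.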

Drives are less complicated, if anything. 2TB of decent NVMe is like $80 if you shop around a little, and that drive will be faster and easier to install than anything readily available that came before it. A really nice 4TB NVMe? Less than $200.
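Cost per terabyte makes the comparison concrete, using the prices quoted above:

Code:
# Price-per-TB for the two NVMe examples above.
for name, price, tb in (("2TB", 80, 2), ("4TB", 200, 4)):
    print(f"{name}: ${price / tb:.0f}/TB")
# 2TB: $40/TB, 4TB: $50/TB - bigger still carries a small premium.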

There may appear to be a massive shift over the last few years from your perspective, but on a longer timeline it just looks the way it has for the last 30 years.

Given the weird Microsoft mention I'm going to assume your hardware doesn't support Windows 11?
 
This reads like you came up with a theory and told yourself a story to make it work.

If an 8700, 32GB of RAM and a 1050ti were a multipurpose system then, the modern equivalent from AMD/Intel/Nvidia would be just as capable in every scenario now, if not more so, across the board.
Nope. X3D is only for gaming because it's worse at everything else. AMD is terrible for video loads on both the CPU and GPU side, which means you are forced to go with Intel. Just a quick example.

Unless the goal is well over 100FPS at the kind of settings a Nvidia 50 or 60 (maybe even 70) series card was never capable of, there's no specialization. None.
Except all you needed to do was turn down shadows and AA and everything worked at the highest settings... not the case any more. There is no modern equivalent to the 1050ti in terms of power, price and general longevity. 4GB was more than enough until last year; now if you get an 8GB model it won't last you more than two years, because of the new shift towards upscaling being assumed enabled by default. Sure, you can get a 7700XT, which is the closest equivalent, but it hammers you in terms of power draw and has worse video-load capability compared to its 40-series competitors... which means you essentially have to juggle gaming, video and power... which was not necessary before, when NVIDIA was the default and AMD was for the desperate. Now AMD is the attractive choice only if you want to focus purely on gaming and power draw is of no concern, basically forcing a lot of people to either go NVIDIA or not upgrade at all.

You buy a mid-range CPU, you buy a GPU if that matters to you... Practically any RAM will work, so long as we're talking about DDR4-3200 or better. Literally any DDR5. Buy something on the QVL list for your board and stop thinking about it, same as always. 99.999% of people who use computers will never know what CAS latency is, and even fewer will ever care. Unless you're running a prebuilt with one stick of RAM or you've got too many Chrome tabs open on the regular, 32GB of /something/ is all nearly anyone really "needs". Power user, or can't be bothered to close Chrome tabs? Double it, you're done. Yes, pricing is going back up now, but not long ago, at least here in the US, that would run you about $100.
This used to be true, but with the new tendencies at Intel, RAM speed is going to start mattering more, and on AMD the speed of the RAM is directly tied to CPU performance... meaning AMD initially seems like the cheaper choice, until you notice that with RAM it basically becomes pay-to-win.
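The raw bandwidth math shows the spread you end up paying for; a rough sketch, dual-channel assumed:

Code:
# Peak theoretical DDR5 bandwidth: MT/s x 8 bytes per channel x channels.
def peak_gb_s(mt_s, channels=2):
    return mt_s * 8 * channels / 1000  # GB/s

for kit in (4800, 5600, 6000, 6400):
    print(f"DDR5-{kit}: {peak_gb_s(kit):.1f} GB/s")
# 76.8 -> 102.4 GB/s: a ~33% spread between the cheap kits and the "win" kits.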

PSUs haven't changed significantly unless you're on the bleeding edge, same as everything else. Most people weren't buying 1kW+ PSUs back in the 2000s to run tri-SLI 8800GTXs, and most people will be fine with half that now, same as always... and you'll need the best you can get, to the tune of $200-300, if you're buying a top end card now. Yes, multi-GPU is gone, you can just get more power in a single card.
You are missing the point, or perhaps I was not clear enough... the problem is the massive power draw. Before, it was a major investment to get a 500W PSU, and later a massive one to get a 700W PSU... and with OEM units that was fine: if you needed 500W, you got a 700W PSU. But now, with GPUs having jumped in power draw across the board, it has become a liability not to use a branded PSU with them... and 700W-1kW branded PSUs are very expensive. If you need a new GPU you are basically forced to get a new PSU at the same time, given that you will now need a few hundred extra watts that for the last decade you did not need... and you have to shell out for the expensive PSUs.

Nobody's making you buy it, though. If you were ever happy with a 50 series card, stuff like raytracing never mattered to you. Sure, it wasn't "RTX ON", but I mean its equivalent at the time - HairWorks used to be the thing that tanked frame rates and people debated if it was needed or really added anything. Cloth simulation. PhysX. Heck, for a brief moment there if you were a really hardcore bleeding edge PC gamer you had a second, usually weaker GPU to run additional monitors and handle PhysX calculations.
The thing is, all those optional extras you could easily just disable... now, though... unless you enable them, some of the games are so blurry you wonder exactly who is drunk. I noticed it first with the RE4 Remake... it looks so bad you almost get a headache.

The degree of specialization you're describing just doesn't exist - the "gaming" CPUs aren't soooo much better that the "office" CPUs are suddenly worthless for gaming. Only the most hardcore enthusiast is going to tell you buying anything other than an X3D chip for gaming is crazy... And they're not the ones buying 50 series Nvidia cards. Buying "slow" RAM loses you a few FPS, it's almost never the difference between something being playable and not.
Wrong rule of thumb. You can use any CPU for gaming... but you cannot use a gaming CPU for everything else. And this is only true right now: with the E-core craze, older games are going to start performing more poorly as time goes on, and some are going to stop starting up entirely... it's fine while a game is currently supported well enough to be patched... but that never lasts.

A 500W PSU will still run a low-mid system like you're describing. Everything is vastly more power efficient than it has ever been. Literally everything.
Except that mid-level GPUs, especially when coupled with mid-to-high-level CPUs, are starting to need 700-750W. 500W being the standard is now a thing of the past. This is especially true for AMD hardware.

Yes, pricing is terrible, but if you account for inflation (admittedly insane, but that's a "the world is broken" kind of problem) prices haven't changed that much outside the absolute top end of the market. Top end madness in my experience is cyclical. It'll probably calm down again soon.
That might explain prices being up to a quarter higher than they were... but not up to four times as expensive as four years ago. And given the trends of the last few years, the prices are never going back down unless there is a massive market disruption.

Drives are less complicated, if anything. 2TB of decent NVMe is like $80 if you shop around a little, and that drive will be faster and easier to install than anything readily available that came before it. A really nice 4TB NVMe? Less than $200.
This might be true in the first world... but for the rest of us there are extra markups and import taxes that at least double the price. Now combine that with corporations simply assuming that everyone rebuys everything every couple of years... and you have a market and a provider that live in separate worlds. Only wealthy people upgrade just because they can, out of FOMO; most people only upgrade once they are forced to. If you can upgrade all the time for minor increases in performance... you are wealthy, no matter what you tell yourself.
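The way those markups stack is simple compounding; the rates below are illustrative assumptions, not any particular country's:

Code:
# How a first-world street price compounds elsewhere - rates are assumptions.
base_price = 80.0     # the $80 NVMe example
import_tax = 0.60     # assumed import duty
retail_markup = 0.30  # assumed local retailer margin

landed = base_price * (1 + import_tax) * (1 + retail_markup)
print(f"${base_price:.0f} becomes ~${landed:.0f} landed")  # ~$166, i.e. doubled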

Only wealthy societies fall victim to mindless consumerism... and only they assume everyone else does it as well.

Where is the over-specialization? If you're going Intel you get their current socket (I don't keep up, but I know it's on the way out). If AMD and budget isn't your first and only real concern, AM5. Buy the board with the features you need/want. Same as always.

This may appear to be a massive shift over the last few years from your perspective, but on a longer timeline it just looks the way it has for the last 30 years.
This is not over-specialization, this is merely planned obsolescence, where upgrading your CPU is only worth bothering with if you do it within a few years... I have never been able to upgrade my CPU when I actually needed to; I have always been forced to upgrade the board because I could not find parts for it anymore... so no, that is not even a factor at all.

The closest thing to that is the way both GPU providers did customers dirty by going x8, meaning only desperate people will buy their products if they still have PCIe 3 mobos, because these cards won't even run at full PCIe 3 speeds. But that's not over-specialization, that's just something we can speculate on... dunno if it's on purpose or if the morons just forgot that PCIe 3 mobos are still in service.
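The x8 complaint is just link-bandwidth math; the per-lane rates below are the standard approximate figures:

Code:
# Approximate usable bandwidth per PCIe lane, GB/s (128b/130b encoding).
per_lane = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, lane in per_lane.items():
    print(f"{gen}: x8 = {lane * 8:.1f} GB/s, x16 = {lane * 16:.1f} GB/s")
# An x8 card in a PCIe 3.0 slot gets ~7.9 GB/s - half what an x16 card sees.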

No, in terms of over-specialization the mistake is having too many different models with ill-defined tiers... instead of three or four GPUs targeting specific resolutions there are eight or more, just to see what sticks to the wall. When it comes to CPUs it's been a cluster-fk for a long time, but now it's worse than it ever was, because they are basically running uncontrolled experiments with their products to see what happens in the market.

And I did not even mention the high science that is monitor choice.

Choosing GPU and CPU combos never used to give anyone choice anxiety.
 
Still telling yourself that story.
 
Exactly. To address OP's point about 4GB of VRAM being enough until recently: absolutely not. My RX 550 was struggling at 1080p low settings. Now, sure, you could argue that the RX 550 doesn't have enough horsepower to utilize 4GB of VRAM, but even a GTX 1650 Super, which had the horsepower to utilize it, would struggle at more than 1080p low or medium settings in modern AAA games.
 
And yet last year was the first time a single game struggled with it for me... I can count the games that struggle with it on one hand, and excluding Cyberpunk all of them came out last year.

And by struggle I mean it looks terrible and/or runs badly even at low settings. I went basically overnight from turning down a setting here or there while staying on high-to-ultra for most things, to turning everything down and still struggling.

Maybe this is the disconnect between me and you two... I really don't need a game to look perfect; it just needs to run properly and not look terrible. Also, I completely skip all MP games, so I cannot speak for them.

Even with those permissive requirements though, to prove the point... I cannot find a single GPU in the same class on the market that would last me as long as the 1050ti did, given the way they have crippled them.

BTW... I can passably play Starfield with it, which gave me a laugh. Just put everything on max, set the resolution scaler to 75% and let the best output win... it looks fine, it just has a few minor slowdowns, a few times for a few seconds in an hour.
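For reference, a 75% scaler cuts the pixel count by almost half, which is why it buys so much performance; a quick sketch:

Code:
# Internal render resolution at a given scale factor.
w, h, scale = 1920, 1080, 0.75
iw, ih = int(w * scale), int(h * scale)
print(f"{iw}x{ih} = {iw * ih / (w * h):.0%} of native pixels")
# 1440x810 = 56% of native pixels -> roughly half the shading work.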
 
By that logic, buy a 4090, and run your games at 1080p low settings. It will last for years.
 
How on earth do you get that? No one in his right mind buys a 4090 unless he has pockets deep enough that he does not care about upgrading every generation; it's almost a completely different market, the above-high-end market. You are comparing Uno buyers to Ferrari buyers here and pretending it's a fair comparison.

The point is, I bought a graphics card that sat just between low and mid range... and it was completely viable for about six years at 1080p.

What new alternative at the same tier will be able to achieve the same or better result, even though they are literally four times the price?
 
I realize now that you said same class as your 1050 ti, sorry I missed that. My point was that a 4090 would last just as long at 1080p as your 1050 ti. Which games are you referring to?
 
Actually, a 4090 would last a few years longer, if you look at recent benchmarks of the 1080.

From memory, with the tests I ran:

Games that run slow but playable on low: RoboCop, Cyberpunk, Starfield
Games that run full speed but only on medium: Aliens: Dark Descent, JA3
Games that look like barf or are unplayable: RE4R (I prefer the original anyway, no weird charms nonsense, and it's moddable), Outer Worlds Fancy Edition

Of the lot I only stuck with JA3... 2023 was a very disappointing year for me... the only games I actually have trouble with are the ones where the devs try to force upscaling on you... the 10 series does not upscale well... and FSR is flaky.

Funny thing is, the first time I came up against VRAM problems was RE2R... and even with everything set to fit below 4GB it looks gorgeous, so dunno what everyone was complaining about... probably a lot of 1440p and 4K complainers...

But that brings us back to one of the main reasons GPUs become obsolete faster now, and the main excuse they have for hiking those juicy margins... upscaling at the lower end is a complete waste of resources because of the built-in artificial bandwidth and VRAM limitations. Of course they also did an obvious sleight of hand: they kept the budget but soon-to-be-useless tier but renamed it from 30 to 60, while the 50 is now the 70... and that is even more expensive, almost double the price of the 60 by the time it gets to market. And for some inexplicable reason AMD keeps playing by NVIDIA's rules instead of undercutting them.

The only GPU that comes close to replacing the 1050ti is the 7700XT, and it has the drawback of not being able to compete with NVIDIA in terms of encoding. Of course, with the recent development of suddenly having trouble, I started looking into upgrading... but nothing so far has given me enough justification to think I won't be wasting good money on expensive soon-to-be garbage. And again, especially given the extra load on the power bill and the need to also buy a new expensive PSU... it's still four times the price as a down payment, and more over time.
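The "more over time" part sketches out like this; every figure below is an illustrative assumption:

Code:
# Rough 5-year cost of the forced upgrade - all numbers are assumptions.
gpu_price = 450    # assumed replacement GPU
psu_price = 120    # assumed forced PSU replacement
extra_watts = 150  # assumed added draw over the old card
hours_per_year, years, kwh_price = 4 * 365, 5, 0.30

extra_kwh = extra_watts / 1000 * hours_per_year * years
total = gpu_price + psu_price + extra_kwh * kwh_price
print(f"~${total:.0f} over 5 years, ~${extra_kwh * kwh_price:.0f} of it on power")
# ~$898 total, with ~$328 of it going to the power bill alone.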

TBH... at this point... it's starting to look as if it just makes more sense to give up even trying and stick to lower-end games, of which there are plenty. And I highly doubt I am the only one starting to think like that.
 
I do agree somewhat that while upscaling can be useful, especially on older or lower-end hardware, it feels like at this point it is being used as a crutch so that devs don't have to optimize their games. Don't even get me started on how stupid the performance of AW2 is on a 1080ti. I don't understand why mesh shaders are a requirement when you can make a game look just as good without them.
 
Discrete GPUs are becoming more of a niche product. So if the cheapest viable GPU costs you $200 (though the A380 is available as well) in your $600-700 gaming PC, we are basically at the same rough price point for an entry-level desktop with a few years of operable life.

You could just drop an RX 6600 into a system with an i7-8700 (which was far from a budget CPU, and is still pretty decent) and call it a day.

You need only look at the latest APUs from AMD and Intel to see why lower-end GPUs aren't worth making.

Ryzen 7000/8000 series APUs with RDNA onboard, and the custom Z1-class APUs found in the Steam Deck and other handhelds.
Meteor Lake with Arc on board is fairly capable as well.

And both of these are first-generation products, without discrete memory. If they go back to packaging with HBM (Intel did it with an AMD Vega) you might see some seriously interesting chips in the near future.

PCPartPicker Part List

CPU: Intel Core i3-12100F 3.3 GHz Quad-Core Processor ($98.98 @ Amazon)
Motherboard: ASRock B760M-H/M.2 Micro ATX LGA1700 Motherboard ($89.99 @ Amazon)
Memory: G.Skill Ripjaws S5 32 GB (2 x 16 GB) DDR5-6000 CL36 Memory ($83.89 @ Amazon)
Storage: Crucial P3 1 TB M.2-2280 PCIe 3.0 X4 NVME Solid State Drive ($63.99 @ Amazon)
Video Card: PowerColor Fighter Radeon RX 6600 8 GB Video Card ($194.99 @ Amazon)
Case: Cooler Master MasterBox Q300L MicroATX Mini Tower Case ($39.99 @ Amazon)
Power Supply: MSI MAG A650BN 650 W 80+ Bronze Certified ATX Power Supply ($59.99 @ Amazon)
Total: $631.82
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2024-02-13 15:08 EST-0500
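For what it's worth, the list's total adds up:

Code:
# Sanity-check the part list total.
prices = [98.98, 89.99, 83.89, 63.99, 194.99, 39.99, 59.99]
print(f"${sum(prices):.2f}")  # $631.82, matching PCPartPicker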
 
I do agree somewhat that while upscaling can be useful, especially on older or lower-end hardware, it feels like at this point it is being used as a crutch so that devs don't have to optimize their games. Don't even get me started on how stupid the performance of AW2 is on a 1080ti. I don't understand why mesh shaders are a requirement when you can make a game look just as good without them.
They do offer an improvement.... this could be T&L all over again.
 
I do agree that there really isn't a stellar GPU on offer from this generation of Nvidia.
But I don't really agree with the rest. And certainly not about AMD CPUs. And I'm not a fan of AMD in general.

It's fine to reason and offer thoughts, but... as someone who spent 11 years not thinking or caring about PC components, then re-learned everything in order to build a new PC, I was very surprised how little has changed. Same story: set your budget, decide on some surefire picks, assemble the rest on paper until it all matches and works together and there's enough power. Maybe tweak a component here and there, adjust for brand availability or special sale prices, and then order the stuff and put it together.
This is for a general PC that does everything and is not top of the class in gaming or whatever.

From the 3570K to the 13600K, all that has changed is that SSDs have arrived, and with them NVMe drives. Also, a bit less important, but onboard graphics suck even less now (though I did buy a discrete GPU).
OK, RAM compatibility is maybe more finicky than it used to be.
CPU coolers definitely grew in size too...

But otherwise, same old song and dance.

Since everyone defines for themselves what a PC needs to be in order to make them happy, maybe it really is more difficult for you to reach your goal? I wouldn't say it's a general trend towards specialization though.
Usually that would also mean fewer sales overall, so I don't see manufacturers going this route. Even motherboards or CPUs or RAM kits trying to cater to specific needs will still do 99% of average PC computing just as well as their non-specialized counterparts.