truerock:
I can't say your points are not valid - but I think they would have been more compelling a year or two ago.
Consider this: if Apple were manufacturing the GeForce RTX 2080 Ti Founders Edition, it would not have HDMI or DisplayPort ports. Consider why that is.
A lot of people had very negative feelings about Apple introducing the Lightning connector for iPhones. When Apple dropped almost all ports from its notebooks and replaced them with USB-C ports, there was a lot of negative reaction. When Apple dropped the audio jack from its iPhones, there were a lot of very emotional reactions. Apple dropping the home button... Apple dropping fingerprint security... it all caused very negative reactions from a lot of people.
But I think it is beginning to sink in. Every time Apple drops something and moves on to the next thing, there seems to be less negative reaction - and now the industry seems to follow Apple's lead almost instantly, dropping older technology more quickly. When Apple started using "The Notch" there was maybe a week or two of negative reaction, and then instant adoption across the industry.
It's not so much that Apple is so very smart as that it has always recognized the low value of supporting old technology and the drag it puts on moving forward. I own lots of old technology - but I don't want it cluttering up my new technology. The fact that USB 3.2 is hamstrung by requiring USB 2.0 support is incredibly ridiculous, in my opinion. In fact, if Thunderbolt controllers drop to around $2 apiece in 1,000-unit quantities, I could see Apple dropping USB.
You're comparing apples (ha!) to oranges there though.
Apple dropping full-sized USB ports hasn't made Type-C devices the 'norm'; it's made third-party (and even first-party) dongles/adapters a common thing.
As for all the other changes (the notch, removing the fingerprint sensor, the home button), let's not kid ourselves that they're about moving technology forward. A lot of it has to do with cutting costs. The industry (generally speaking) will adopt similar cost-cutting measures if they're shown not to hurt the bottom line.
Even with that, it's not like everything worked without issue.
1. The notch - app developers aren't fully there yet. Cutting off part of the screen is still an issue in some implementations.
2. I think it was LTT who had a video about this on their iPhone. Face ID isn't everywhere, and a defect in the software (or screen?), under the right/wrong circumstances, can render the phone useless without a physical home button.
There's a crossover period where both older & newer connectivity/features should coexist.
So, on to the HDMI/DP/DVI vs USB-C debate... while yes, any given manufacturer could cut costs by omitting those connectors, I'm pretty confident in saying they wouldn't be doing so with no negative impact on sales. Forcing users to upgrade their monitors simply to use a new GPU would not go over well.
In time, I'm sure we'll see HDMI/DP/DVI (one or more of them) drop off consumer-level cards... I just don't see it happening any time soon.