News Control PC Requirements and What Ray Tracing Will Look Like

s997863

Distinguished
Aug 29, 2007
Reminds me of PhysX in Batman Arkham Asylum (papers flying), and DirectX10 in Crysis (sun rays): Insignificant effects at an unreasonable performance cost, all of which could easily be implemented much better using other hardware/software methods already available at the time.
The third screenshot doesn't even make sense: why can't you have the shadows from the grilles on the pipe? If the level is static (won't move or change), this could be done way back in Quake 2. The real use for ray-tracing is to have dynamic moving shadows, like the imps in Doom 3.
 
Reactions: bit_user

salgado18

Distinguished
Feb 12, 2007
This is what happens with proprietary tech: nice stuff has to be optional and irrelevant to the game, because other people that don't have the hardware must enjoy the game without them.

It's sad, really, because PhysX on hardware could enable so many cool gameplay opportunities, if games could depend on it. Also, more GPUs could be sold, second cards for physics effects could become normal, etc. But limiting it to CPU simulation with Radeon cards means devs can't make a game around it, limiting the appeal to... papers on the ground.

Nvidia can have all the top tech they want, but going the proprietary route means it's not just us consumers who lose; they lose too.
 
Aside from the first screenshot, I'm more surprised by the lack of differences.

Graphically, I don't think I'd mind playing it without RTX.

You can't see how much clearer her reflection is in the second image or how you can see a reflection of the ceiling in the 3rd image?

Granted being able to see the reflection of the ceiling in a puddle probably wouldn't help you telekinetically rend a foe in twain ... unless the enemy is hiding on the ceiling spiderman style, in which case we torch the room because spiders hate fire.

Playing without RTX is perfectly fine as long as you accept that puddles will always be a shade of gray.
Edit: Or a blurry bitmap of what the developer thinks you should see

#PuddlesMatters
 

Warsaw

Distinguished
Jul 8, 2008
I'm always about increased graphical fidelity in games or added effects that give it more realism. However, I must say with these screenshots from the game, there is little that impresses me. Yes, I would like the more realistic reflections of the puddles in my game. But at the same time, it's not showing me much that really draws my attention towards RTX.
 

bit_user

Polypheme
Ambassador
You can't see how much clearer her reflection is in the second image or how you can see a reflection of the ceiling in the 3rd image?
Heh, the two reflection examples you cite don't need RTX, and I feel the non-RTX reflection of her was intentionally distorted. It looks more realistic to me, anyway.

Granted being able to see the reflection of the ceiling in a puddle probably wouldn't help you telekinetically rend a foe in twain ... unless the enemy is hiding on the ceiling spiderman style, in which case we torch the room because spiders hate fire.
It would be poor game design for extra graphical options to make a difference in playability, especially when most people can't use them.
 

bit_user

Polypheme
Ambassador
I'm always about increased graphical fidelity in games or added effects that give it more realism. However, I must say with these screenshots from the game, there is little that impresses me. Yes, I would like the more realistic reflections of the puddles in my game. But at the same time, it's not showing me much that really draws my attention towards RTX.
If you zoom in, her face does look better. That's at least something.

I thought her skin looked different, but as I A/B the images, I now see that it's the ambient occlusion on her eyeballs and the soft shadows from her nose and hair. Altogether, it makes a big difference, since the skin was good to start with. With RTX, it looks to me almost like a real photograph.
 

The Net Avenger

Distinguished
Sep 12, 2014
Reminds me of PhysX in Batman Arkham Asylum (papers flying), and DirectX10 in Crysis (sun rays): Insignificant effects at an unreasonable performance cost, all of which could easily be implemented much better using other hardware/software methods already available at the time.
The third screenshot doesn't even make sense: why can't you have the shadows from the grilles on the pipe? If the level is static (won't move or change), this could be done way back in Quake 2. The real use for ray-tracing is to have dynamic moving shadows, like the imps in Doom 3.

But you cannot implement the same visuals using other techniques. Notice the first image, where OFF-SCREEN and dynamic content can be referenced and reflected in real time.

With more multiplayer games coming (reference the upcoming consoles), users that DO NOT have ray tracing enabled will be at a significant disadvantage, as they won't be able to see players' reflections around a corner or behind them.

The other misconception repeated here is that ray tracing will always have a performance cost. This isn't true even now. If you look up videos or blogs by game developers, they specifically explain how games would be faster IF THEY DIDN'T HAVE TO USE THE NON-RT 'tricks' to make scenes appear more realistic.

All the extra 'textures' and false light sources are consuming MORE performance today than if the developer could just leave them out of the game and instead flip on the most basic one-sample ray tracing.


The other things you cite, PhysX and DX10 visuals, are a part of gaming today; you are just describing their 'entry point'. PhysX is common today, and it is not limited to nVidia cards either.

These were only slower in false comparisons, or when a visual or PhysX layer was slapped on top of an older engine purely for visual quality, without utilizing any of the performance features those technologies were able to add.

---Off Topic Information
DX10 effects were 'common' when DX10 was introduced, as the XB360 uses a superset of DX10 that is technically similar to DX11. And the reason the XB360 was able to compete with and outperform the PS3 in graphics was the DX10/11 technologies, implemented at lower levels rather than as just an upper-layer visual effect. Even today, a game that uses a TRUE DX10/11-based engine is faster than the same game running on an older DX9 or earlier engine model.

DX11 was faster than DX9 - this is how Valve was able to fool people with their OpenGL vs DirectX and Linux vs Windows game performance con.

They used the OpenGL 4.x features copied from DX11, rebuilt the game, and then compared the results to the DX9 version running on Windows. When that same game was rebuilt for DX11, it was nearly 2x faster than DX9, and 1.5-2x faster than the OpenGL 4.x version that Valve was using to mislead people into believing OpenGL or Linux was faster, when neither was true. Even John Carmack commented on Valve's deception.
 

The Net Avenger

Distinguished
Sep 12, 2014
This is what happens with proprietary tech: nice stuff has to be optional and irrelevant to the game, because other people that don't have the hardware must enjoy the game without them.

It's sad, really, because PhysX on hardware could enable so many cool gameplay opportunities, if games could depend on it. Also, more GPUs could be sold, second cards for physics effects could become normal, etc. But limiting it to CPU simulation with Radeon cards means devs can't make a game around it, limiting the appeal to... papers on the ground.

Nvidia can have all the top tech they want, but going the proprietary route means it's not just us consumers who lose; they lose too.

You do realize that the RTX ray tracing features come from Microsoft - see DXR. Even the nVidia AI/ML technologies are based on Microsoft's WinML.

These are not proprietary, as DXR and WinML have been freely shared with the entire industry, both hardware and software. That means Intel and AMD have full access from Microsoft to the SAME hybrid ray tracing technologies that nVidia does. This also includes the software side, with Microsoft having given these technologies to everyone, even helping the OSS Vulkan developers implement DXR and ML technologies just like DX12 provides.

I know nVidia is the first company to implement DXR with specific hardware, but DXR works on any DX12 feature-level card, and AMD/Intel can ALSO build similar hardware to provide the same acceleration. This is also how nVidia was able to turn on RT on older cards: it is just DX12 DXR and doesn't need specialized hardware to work, even though special hardware does help the performance.

This is not nVidia proprietary technology.
 

bit_user

Polypheme
Ambassador
With more multiplayer games coming (reference the upcoming consoles), users that DO NOT have ray tracing enabled will be at a significant disadvantage, as they won't be able to see players' reflections around a corner or behind them.
If game designers choose to make level designs that create such disparities. And if people without RTX realize they're at a disadvantage, they're probably at least as likely to stop playing as to upgrade their video card. So, it's a real risk for game designers.

The other misconception repeated here is that ray tracing will always have a performance cost. This isn't true even now. If you look up videos or blogs by game developers, they specifically explain how games would be faster IF THEY DIDN'T HAVE TO USE THE NON-RT 'tricks' to make scenes appear more realistic.
Yeah, but it also has a lot to do with which RT effects they're using.

All the extra 'textures' and false light sources are consuming MORE performance today than if the developer could just leave them out of the game and instead flip on the most basic one-sample ray tracing.
Sure, they get rid of a couple extra rendering passes, but there are other tradeoffs. For instance, I don't imagine TAA works well with reflections. If you suddenly start using a lot more reflections because now they're easy and pretty good-looking, there's going to be a lot of aliasing in them.

the reason the XB360 was able to compete with and outperform the PS3 in graphics was the DX10/11 technologies, implemented at lower levels rather than as just an upper-layer visual effect.
Do you believe it's a consensus opinion that the XB360 out-performed the PS3 in graphics? As a latecomer to the PS3, I played many games that looked way better than what I've seen on the XB360, although I'll admit to not having seen many XB360 games. By the end of the PS3's reign, I was amazed at what people were able to squeeze out of it, though even a few of the early titles looked clean and impressive.

They used the OpenGL 4.x features copied from DX11, rebuilt the game, and then compared the results to the DX9 version running on Windows. When that same game was rebuilt for DX11, it was nearly 2x faster than DX9, and 1.5-2x faster than the OpenGL 4.x version that Valve was using to mislead people into believing OpenGL or Linux was faster, when neither was true. Even John Carmack commented on Valve's deception.
You can't just lump all OpenGL 4.x versions together. OpenGL 4.2 and 4.3 added instanced and indirect rendering enhancements. 4.3 added compute shaders. 4.5 added conditional rendering and direct state access. And there's also a question of how effectively something is ported to OpenGL (or Direct 3D) and whether it uses all the available tricks to avoid pipeline bubbles, etc.
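To make that concrete, here's a rough C++ sketch (purely illustrative; the function and buffer names are made up, not from any particular engine) of the kind of batching GL 4.3's multi-draw indirect enables, where the draw parameters sit in a GPU buffer and one call replaces a whole loop of glDrawElements:

```cpp
// Sketch only: assumes a GL 4.3+ context is already current (e.g. loaded via GLAD)
// and that 'vao' has the shared vertex/index buffers for all meshes bound.
#include <glad/glad.h>
#include <vector>

struct DrawElementsIndirectCommand {   // layout defined by the GL spec
    GLuint count;          // number of indices for this sub-draw
    GLuint instanceCount;  // how many instances of it
    GLuint firstIndex;     // offset into the shared index buffer
    GLuint baseVertex;     // offset into the shared vertex buffer
    GLuint baseInstance;   // used to fetch per-instance data
};

void drawSceneIndirect(GLuint vao, const std::vector<DrawElementsIndirectCommand>& cmds)
{
    // Upload all draw parameters once...
    GLuint indirectBuf = 0;
    glGenBuffers(1, &indirectBuf);
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuf);
    glBufferData(GL_DRAW_INDIRECT_BUFFER,
                 cmds.size() * sizeof(DrawElementsIndirectCommand),
                 cmds.data(), GL_STATIC_DRAW);

    // ...then issue one API call instead of cmds.size() separate glDrawElements calls.
    glBindVertexArray(vao);
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                nullptr, (GLsizei)cmds.size(), 0);

    glDeleteBuffers(1, &indirectBuf);
}
```

An engine stuck on GL 4.0/4.1 can't do this, so whether a port actually uses such features matters far more to performance than the "OpenGL vs. Direct 3D" label.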

BTW, it's not like the OpenGL ARB just sits around, going through the latest Direct 3D spec to find things they want to copy from it. In actual fact, it's comprised of the major hardware vendors + prominent software vendors who discuss features of upcoming hardware and the best way to make that usable by software. In the case of both OpenGL and Direct 3D, they're basically subject to what the hardware vendors decide to build, and have to wrap that. Although, I'm sure that in both cases, the software vendors feed a lot of wishes and requirements to help guide the hardware implementations.
 

bit_user

Polypheme
Ambassador
You do realize that the RTX ray tracing features come from Microsoft - see DXR. Even the nVidia AI/ML technologies are based on Microsoft's WinML.
Um, no. Nvidia approached MS about ray tracing, and the conjecture I heard is that MS jumped on it after seeing software developers starting to move away from Direct 3D, in favor of Vulkan. So, for Microsoft, DXR was a bid to win them back. And, from what I've heard, it worked.

And it's laughable that MS is driving Nvidia's deep learning tech, when their main market for that stuff is Linux-based and the main APIs and frameworks that people use are built directly atop CUDA.

These are not proprietary, as DXR and WinML have been freely shared with the entire industry, both hardware and software. That means Intel and AMD have full access from Microsoft to the SAME hybrid ray tracing technologies that nVidia does. This also includes the software side, with Microsoft having given these technologies to everyone, even helping the OSS Vulkan developers implement DXR and ML technologies just like DX12 provides.
You're ignoring the question of when other industry players get access to the new APIs. I have it on pretty good authority that AMD was left out in the cold, only getting late notice of DXR. In order to be competitive, they need the inside track, so they can have products ready at or around the time the new API is released.

This is contrary to how open standards work, where anyone can join a Khronos working group and everyone can participate in the meetings and has a vote on the decisions around new standards/revisions. So, no one gets an exclusive inside track.

I know nVidia is the first company to implement DXR with specific hardware, but DXR works on any DX12 feature-level card, and AMD/Intel can ALSO build similar hardware to provide the same acceleration. This is also how nVidia was able to turn on RT on older cards: it is just DX12 DXR and doesn't need specialized hardware to work, even though special hardware does help the performance.
You're completely missing the fact that software implementations are so much slower than hardware ray tracing that they're impractical to use for anything that was targeted at a hardware implementation. The main benefit of being able to implement it in software is just so software developers get early access, and can do some development & debugging on GPUs that don't have it in hardware.
 
Heh, the two reflection examples you cite don't need RTX, and I feel the non-RTX reflection of her was intentionally distorted. It looks more realistic to me, anyway.


It would be poor game design for extra graphical options to make a difference in playability, especially when most people can't use them.

Rejecting the benefits of real life optics / ray tracing in an effort to make games more fair is a losing battle.

The simple truth is that your eyes see more data/detail when the scene is ray-traced versus a scene that was rasterized.

This increase in data/detail allows you to make more informed/hopefully better decisions in the game.


The most absurd but still completely viable scenario would be a mirror placed at a 45 degree angle on a corner.

This would allow you to see around the corner with ray-tracing.

This is a very specific scenario, but the mirror doesn't need to be at 45 degrees ... you don't need to see the complete enemy; just enough to know someone or something is there is enough.

And for that matter, if you were going to keep the game fair for those without ray tracing, you would not be able to put a mirror anywhere near a corner, on the off chance that a person with ray tracing might be able to "exploit" the laws of gaming optics.
 

bit_user

Polypheme
Ambassador
Rejecting the benefits of real life optics / ray tracing in an effort to make games more fair is a losing battle.
That's a legitimate opinion, but it doesn't mean game designers will opt for realism over fairness. At least, not until ray tracing becomes more common.

The most absurd but still completely viable scenario would be a mirror placed at a 45 degree angle on a corner.

This would allow you to see around the corner with ray-tracing.
In point of fact, it's not as if there's no other way to render such reflections. If such mirrors had a strategic significance, within the game, they could still correctly render them, at some additional cost. Where it gets really difficult is to do accurate, non-planar reflections.
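For what it's worth, here's the classic non-RT trick in rough C++ form (my own sketch, not anything from Control): reflect the camera across the mirror plane, render the scene again into a texture, and paste that onto the mirror. The helper name and the GLM dependency are just assumptions for illustration.

```cpp
// Sketch only: the render-to-texture plumbing (FBO, clip plane, mirror quad)
// is summarized in comments; only the reflection math is spelled out.
#include <glm/glm.hpp>

// Reflection matrix for the plane n·x + d = 0, with n normalized.
glm::mat4 planarReflection(const glm::vec3& n, float d)
{
    glm::mat4 R(1.0f);                        // GLM is column-major: R[col][row]
    R[0][0] = 1.0f - 2.0f * n.x * n.x;  R[1][0] = -2.0f * n.x * n.y;
    R[2][0] = -2.0f * n.x * n.z;        R[3][0] = -2.0f * n.x * d;
    R[0][1] = -2.0f * n.y * n.x;        R[1][1] = 1.0f - 2.0f * n.y * n.y;
    R[2][1] = -2.0f * n.y * n.z;        R[3][1] = -2.0f * n.y * d;
    R[0][2] = -2.0f * n.z * n.x;        R[1][2] = -2.0f * n.z * n.y;
    R[2][2] = 1.0f - 2.0f * n.z * n.z;  R[3][2] = -2.0f * n.z * d;
    return R;
}

// Per mirror, per frame (hypothetical usage):
//   glm::mat4 mirroredView = view * planarReflection(mirrorNormal, mirrorD);
//   1. Render the scene again with mirroredView into an off-screen texture
//      (flip triangle winding, clip geometry behind the mirror plane).
//   2. Draw the mirror surface sampling that texture.
// That extra scene pass is the "additional cost": fine for one plot-critical
// mirror, but it doesn't scale to arbitrary puddles or non-planar surfaces.
```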

And for that matter, if you were going to keep the game fair for those without ray tracing, you would not be able to put a mirror anywhere near a corner, on the off chance that a person with ray tracing might be able to "exploit" the laws of gaming optics.
Exactly. That's the kind of compromise game designers now have to make, if they want to prioritize fairness over realism. Windows pose a bit of a challenge, in this regard. I wonder if any games would go so far as not to use ray-traced reflections in windows, even with RTX on, just to avoid advantaging RTX users. Or, will they just add the extra rendering pass, so everyone sees accurate reflections in them?

Of course, another potential advantage RTX could give is accurate shadows. Imagine seeing a room darken because someone blocks an indirect light source. The RTX user could see this and infer there must be someone standing in a doorway, even though he can't see them or their direct shadow. If the game uses RTX for global illumination, it's probably harder for game designers to avoid this scenario.
 

bit_user

Polypheme
Ambassador
Of course, another potential advantage RTX could give is accurate shadows. Imagine seeing a room darken because someone blocks an indirect light source. The RTX user could see this and infer there must be someone standing in a doorway, even though he can't see them or their direct shadow. If the game uses RTX for global illumination, it's probably harder for game designers to avoid this scenario.
At the end of this video, they demonstrate a technique called "light probes" to simulate dynamic global illumination without ray tracing. It's hard to tell whether it would exactly emulate the effect I was talking about, but it's conceivable.
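Very roughly, the idea is to store incoming light at a grid of probe positions and blend the nearest probes at shading time; if the probes get re-baked or updated at runtime, you get a coarse form of dynamic GI. A toy C++ sketch of just the lookup side, with made-up names and a plain RGB irradiance value per probe instead of real spherical harmonics:

```cpp
// Toy sketch: assumes a regular grid with at least 2 probes per axis and a
// query point inside the grid; a real engine would store SH or an octahedral map.
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

struct ProbeGrid {
    Vec3 origin;                   // world-space corner of the grid
    float spacing;                 // distance between neighboring probes
    int nx, ny, nz;                // probe counts along each axis
    std::vector<Vec3> irradiance;  // nx*ny*nz RGB estimates

    const Vec3& probe(int i, int j, int k) const {
        return irradiance[(k * ny + j) * nx + i];
    }

    // Trilinearly blend the 8 probes surrounding the shaded point p.
    Vec3 sample(const Vec3& p) const {
        float fx = (p.x - origin.x) / spacing;
        float fy = (p.y - origin.y) / spacing;
        float fz = (p.z - origin.z) / spacing;
        int i = std::clamp((int)fx, 0, nx - 2);
        int j = std::clamp((int)fy, 0, ny - 2);
        int k = std::clamp((int)fz, 0, nz - 2);
        float tx = fx - i, ty = fy - j, tz = fz - k;

        auto lerp = [](const Vec3& a, const Vec3& b, float t) {
            return Vec3{a.x + (b.x - a.x) * t,
                        a.y + (b.y - a.y) * t,
                        a.z + (b.z - a.z) * t};
        };
        Vec3 c0 = lerp(lerp(probe(i, j, k),     probe(i + 1, j, k),     tx),
                       lerp(probe(i, j + 1, k), probe(i + 1, j + 1, k), tx), ty);
        Vec3 c1 = lerp(lerp(probe(i, j, k + 1),     probe(i + 1, j, k + 1),     tx),
                       lerp(probe(i, j + 1, k + 1), probe(i + 1, j + 1, k + 1), tx), ty);
        return lerp(c0, c1, tz);
    }
};
```

If someone standing in a doorway dims the probes behind it between updates, a non-RTX player could in principle see the same darkening; whether the probes react quickly enough to a moving character is the open question.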

 

The Net Avenger

Distinguished
Sep 12, 2014
Um, no. Nvidia approached MS about ray tracing, and the conjecture I heard is that MS jumped on it after seeing software developers starting to move away from Direct 3D, in favor of Vulkan. So, for Microsoft, DXR was a bid to win them back. And, from what I've heard, it worked.

And it's laughable that MS is driving Nvidia's deep learning tech, when their main market for that stuff is Linux-based and the main APIs and frameworks that people use are built directly atop CUDA.


You're ignoring the question of when other industry players get access to the new APIs. I have it on pretty good authority that AMD was left out in the cold, only getting late notice of DXR. In order to be competitive, they need the inside track, so they can have products ready at or around the time the new API is released.

This is contrary to how open standards work, where anyone can join a Khronos working group and everyone can participate in the meetings and has a vote on the decisions around new standards/revisions. So, no one gets an exclusive inside track.


You're completely missing the fact that software implementations are so much slower than hardware ray tracing that they're impractical to use for anything that was targeted at a hardware implementation. The main benefit of being able to implement it in software is just so software developers get early access, and can do some development & debugging on GPUs that don't have it in hardware.

You think AMD was left out in the cold? By Microsoft? On a technology they were showing to Intel, AMD, and NVidia over 3 years ago?

A technology that Microsoft DEMANDS to be present in AMD hardware for the XBox 2020 release?

A technology that Microsoft has ALREADY given away to anyone that wants to go read how it works in hardware or software?

A technology Microsoft took to Khronos, working with them and with NVidia's hardware, so there would be a proper OSS implementation of the technology in Vulkan?

A technology that Microsoft is already running on AMD GPUs for XBox 2020...

A technology that Sony is already running, in Vulkan, on AMD GPUs....



Um, I don't know your AMD source, but I think they weren't truthful.


Additional things...

AMD didn't have the hardware ready - not just the WinML/DXR acceleration technologies, but their main cores were too far behind Turing. This is their GPU catch-up year, which should launch their GPUs like a rocket next year, possibly even doing the RT features faster and better than NVidia, as they have Microsoft's hardware team giving them back information on Microsoft's AMD GPU modifications. AMD was expected to have the features from the XBox One X GPU in this generation, and it didn't happen. By moving huge chunks of low level calls from software to the silicon in the XBox One X, the performance jump is impressive, with examples from developers where 1000 calls are now a single call.

AMD and Microsoft are weird allies, just as Microsoft and NVidia are. However, neither company has been restricted from Microsoft's technology, as the whole point of designing a software platform and the necessary hardware is to get as many companies as possible to implement it.

Giving away technology to support its platforms and industry-wide technologies is not something new to Microsoft. Even with DirectX, they have shared with everyone. Microsoft walked away from OpenGL because the OpenGL board didn't want to touch gaming and didn't want to touch 3D GPU technologies. However, they gave key features of DirectX to NVidia to modify, and to OpenGL to use and modify, as having more support in OpenGL also encouraged new hardware in these directions. (The shader languages we use today were created by Microsoft; NVidia created a variation for their hardware, and an exact copy of the Microsoft shader language was given to OpenGL to implement. Reference: user shader languages, late 90s.)


As a person that has wandered back and forth between designing OS technologies and the graphics and video industries since the late 80s, hybrid ray tracing is something to really look forward to, and it is brilliant.

This is the 'graphical' jump we need, from almost-real to 'real' graphics, as right now we are still basically getting games that have been hitting the limits of 2012-era technologies.


Even if it is just a flicker here, or a less shiny surface there, or a reflection from behind you, or a shimmer off the water with real caustics... These are tiny things, but they make a huge difference in realism.

Go look at the Doom RT demos and how real it looks in play at times with the RT effects, even with older textures and resolutions. Heck, look at 8-bit Minecraft with RT enabled; it looks more 'real' than anyone thought was possible from just a few lighting/reflection/shadow effects.


Take care, and thanks for the response. I apologize for resurrecting this thread/post, but I don't always get back here every week.
 
Last edited:

bit_user

Polypheme
Ambassador
You think AMD was left out in the cold? By Microsoft? On a technology they were showing to Intel, AMD, and NVidia over 3 years ago?

A technology that Microsoft DEMANDS to be present in AMD hardware for the XBox 2020 release?

A technology that Microsoft has ALREADY given away to anyone that wants to go read how it works in hardware or software?

A technology Microsoft took to Khronos, working with them and with NVidia's hardware, so there would be a proper OSS implementation of the technology in Vulkan?

A technology that Microsoft is already running on AMD GPUs for XBox 2020...

A technology that Sony is already running, in Vulkan, on AMD GPUs....


Um, I don't know your AMD source, but I think they weren't truthful.
Speaking of sources, can you actually back up those statements?

For one thing, I don't see anyone from Microsoft credited for contributing to Vulkan: https://github.com/KhronosGroup/Vulkan-Docs/blob/master/appendices/credits.txt

AMD was expected to have the features from the XBox One X GPU in this generation, and it didn't happen. By moving huge chunks of low level calls from software to the silicon in the XBox One X, the performance jump is impressive, with examples from developers where 1000 calls are now a single call.
Such as?

BTW, in graphics you tend to have cases like that, such as when moving tessellation onto the GPU or adopting instanced rendering. But, neither of those things is terribly new.
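As a concrete (hypothetical) illustration of that kind of collapse, instanced rendering turns a CPU-side loop of draw calls into one call plus a per-instance attribute; the function names here are made up:

```cpp
// Sketch only: assumes a GL 3.3+ context, and that the VAO's attribute 3 is an
// instanced vec3 offset (glVertexAttribDivisor(3, 1) was set at init time).
#include <glad/glad.h>
#include <vector>

// The old way: one API call (plus per-object uniform updates) per crate.
void drawCratesNaive(GLuint vao, GLsizei indexCount, int numCrates)
{
    glBindVertexArray(vao);
    for (int i = 0; i < numCrates; ++i) {
        // setPerObjectUniforms(i);   // position etc. (omitted)
        glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);
    }
}

// The batched way: upload all per-crate offsets once, then issue a single call.
void drawCratesInstanced(GLuint vao, GLsizei indexCount, GLuint offsetBuf,
                         const std::vector<float>& offsets /* xyz per crate */)
{
    glBindBuffer(GL_ARRAY_BUFFER, offsetBuf);
    glBufferData(GL_ARRAY_BUFFER, offsets.size() * sizeof(float),
                 offsets.data(), GL_DYNAMIC_DRAW);

    glBindVertexArray(vao);
    glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT,
                            nullptr, (GLsizei)(offsets.size() / 3));
}
```

Same pixels, a fraction of the driver overhead, which is the sort of win the "1000 calls become one" claims are about, whether it's done through the API or baked into console silicon.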

AMD and Microsoft are weird allies, just as Microsoft's relationship with NVidia.
You'd think that, but it didn't seem to keep AMD from getting shut out of the design of the DXR API. The structure of the DXR API was sketched out by Nvidia, and they did it in a way that was bad for AMD's hardware.

Then, I heard that software vendors pressured Khronos to adopt a similar API structure for ray tracing, to make it easier for their engines to be portable between Vulkan and D3D/DXR. So, the same "mistakes" were repeated there.

However, neither company has been restricted from Microsoft's technology, as the whole point of designing a software platform and the necessary hardware is to get as many companies as possible to implement it.
I agree with your perspective, but the truth doesn't always line up with the presumed incentives.

Microsoft walked away from OpenGL because the OpenGL board didn't want to touch gaming and didn't want to touch 3D GPU technologies.
Huh?

However, they gave key features of DirectX to NVidia to modify, and to OpenGL to use and modify, as having more support in OpenGL also encouraged new hardware in these directions.
What exactly do you mean by "gave key features"?

As a person that has wandered back and forth between designing OS technologies and the graphics and video industries since the late 80s, hybrid ray tracing is something to really look forward to, and it is brilliant.
Yeah, realtime ray tracing was always the holy grail. Computers can now officially do everything I always wished they could. I thought I'd be more enthusiastic about that... but it's kinda sad, not having anything else to look forward to. I don't really care about smarter AI or anything like that.

Take care, and thanks for the response. I apologize for resurrecting this thread/post, but I don't always get back here every week.
I think you have some interesting perspectives and I'd like to hear more. I'd rather someone take their time and reply when they can. So, thanks for getting back to this.
 

The Net Avenger

Distinguished
Sep 12, 2014
Speaking of sources, can you actually back up those statements?

For one thing, I don't see anyone from Microsoft credited for contributing to Vulkan: https://github.com/KhronosGroup/Vulkan-Docs/blob/master/appendices/credits.txt


Such as?

BTW, in graphics you tend to have cases like that, such as when moving tessellation onto the GPU or adopting instanced rendering. But, neither of those things is terribly new.


You'd think that, but it didn't seem to keep AMD from getting shut out of the design of the DXR API. The structure of the DXR API was sketched out by Nvidia, and they did it in a way that was bad for AMD's hardware.

Then, I heard that software vendors pressured Khronos to adopt a similar API structure for ray tracing, to make it easier for their engines to be portable between Vulkan and D3D/DXR. So, the same "mistakes" were repeated there.


I agree with your perspective, but the truth doesn't always line up with the presumed incentives.


Huh?


What exactly do you mean by "gave key features"?


Yeah, realtime ray tracing was always the holy grail. Computers can now officially do everything I always wished they could. I thought I'd be more enthusiastic about that... but it's kinda sad, not having anything else to look forward to. I don't really care about smarter AI or anything like that.


I think you have some interesting perspectives and I'd like to hear more. I'd rather someone take their time and reply when they can. So, thanks for getting back to this.

If my comment about Microsoft walking away from OpenGL in the 90s is a 'huh' response, there are a lot of things you might want to catch up on.

Short version...
Microsoft created WinG/DirectX after the OpenGL board struck down gaming and 3D GPU specific changes to OpenGL. Microsoft did not want to have to create a different framework, but had to, and walked away from OpenGL. OpenGL eventually 'got it' a few years later and started implementing features from DirectX.


I don't even know where to start with many of the other things you mentioned or assumed...

Like comparing what Microsoft did with the XBox One X GPU to shoving tessellation into the GPU?

FFS. Microsoft added tessellation to the XBox 360 GPU in 2005.


Anyway...
The stuff I mentioned is not hard to find, and I assumed it was more commonly known than it apparently is. However, anyone that is curious can find the references and information if they want. If you don't care enough to research it, it doesn't matter.
 

bit_user

Polypheme
Ambassador
If my comment about Microsoft walking away from OpenGL in the 90s is a 'huh' response, there are a lot of things you might want to catch up on.
Um, no. But this comment did.

Microsoft created WinG/DirectX after the OpenGL board struck down gaming and 3D GPU specific changes to OpenGL. Microsoft did not want to have to create a different framework, but had to, and walked away from OpenGL. OpenGL eventually 'got it' a few years later and started implementing features from DirectX.
That's interesting. So, around when did those things happen?

I am aware of a few facts:
  • MS wrote an OpenGL stack for Windows NT. I'm pretty sure they had it at launch, back in 1993.
  • WinG launched in 1994, on Win 3.1.
  • Microsoft bought RenderMorphics in February 1995, presumably to help create D3D.
  • DirectX launched with Windows 95 (superseding WinG), but didn't initially include Direct 3D.
  • MS added Direct3D as an update to Windows 95, because it wasn't ready at Win95's launch.
  • MS did not include OpenGL in Windows 95, due to licensing costs.
  • In 1997, MS partnered with SGI on the Fahrenheit Project.
  • MS eventually dropped their in-house OpenGL implementation, since hardware vendors each had their own and there was a lack of demand for a pure software renderer.
  • At some point (I thought it was late 2000's, but can't confirm), MS quit the OpenGL working group. I think this was after the transition to Khronos, which happened in 2006, but might've been a bit before.

As for the Fahrenheit Project, here's what Wikipedia says:

On December 17, 1997,[17] Microsoft and SGI initiated the Fahrenheit project, which was a joint effort with the goal of unifying the OpenGL and Direct3D interfaces (and adding a scene-graph API too). In 1998, Hewlett-Packard joined the project.[18] It initially showed some promise of bringing order to the world of interactive 3D computer graphics APIs, but on account of financial constraints at SGI, strategic reasons at Microsoft, and a general lack of industry support, it was abandoned in 1999.[19]

So, it's not obvious to me where in the timeline that sequence of events would fall.

I mean, the term GPU didn't even exist back in the '90s. Nvidia only started using it in '99. The first 3D-capable graphics chips & cards only launched in 1995 - 3D Labs and Nvidia. So, I don't really know what "3D GPU specific changes" they'd have been trying to make in 1994 or earlier.

I don't even know where to start with many of the other things you mentioned or assumed...
Just take them one at a time, please. "A long journey begins with the first step."

Like comparing what Microsoft did with the XBox One X GPU to shoving tessellation into the GPU?

FFS. Microsoft added tessellation to the XBox 360 GPU in 2005.
This is not helpful. I gave that as an example of how thousands of calls could be combined into one. I thought that was pretty clear. I am aware that tessellation is not new.

Anyway, please give more details on what you mentioned, as it sounds interesting. At least give me one link, and I'll educate myself from there.

Anyway...
The stuff I mentioned is not hard to find, and I assumed to be more commonly known than apparently it is. However, anyone that is curious can find the references and information if they want. If you don't care enough to research it, it doesn't matter.
If you make points you're unable or unwilling to back up, then don't expect people to take your posts seriously.