Um, no. Nvidia approached MS about ray tracing, and the conjecture I heard is that MS jumped on it after seeing software developers starting to move away from Direct3D in favor of Vulkan. So, for Microsoft, DXR was a bid to win them back. And, from what I've heard, it worked.
And it's laughable to claim that MS is driving Nvidia's deep learning tech, when Nvidia's main market for that stuff is Linux-based and the main APIs and frameworks people use are built directly atop CUDA.
You're ignoring the question of when other industry players get access to the new APIs. I have it on pretty good authority that AMD was left out in the cold, only getting late notice of DXR. In order to be competitive, they need the inside track, so they can have products ready at or around the time the new API is released.
This is contrary to how open standards work, where anyone can join a Khronos working group and everyone can participate in the meetings and has a vote on the decisions around new standards/revisions. So, no one gets an exclusive inside track.
You're completely missing the fact that software implementations are so much slower than hardware ray tracing that they're impractical for anything targeted at a hardware implementation. The main benefit of a software implementation is that developers get early access, and can do some development & debugging on GPUs that don't have it in hardware.
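To make that cost concrete, here's a toy, CPU-only sketch (plain Python, illustrative only, nothing to do with the actual DXR fallback layer): it fires one primary ray per pixel at a single sphere. This is the same intersection math that RT hardware accelerates, just executed serially in software, which is why it scales so badly at real resolutions and ray counts.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t, or return None.
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

def trace(width, height):
    # One primary ray per pixel, camera at the origin, sphere at z = -3.
    # Real renderers do this per pixel, per bounce, per sample -- in hardware.
    hits = 0
    for y in range(height):
        for x in range(width):
            u = (x + 0.5) / width * 2.0 - 1.0
            v = (y + 0.5) / height * 2.0 - 1.0
            if ray_sphere_hit((0, 0, 0), (u, v, -1.0), (0, 0, -3), 1.0) is not None:
                hits += 1
    return hits

print(trace(64, 64))
```

Even this single-sphere loop is millions of Python operations at 4K; multiply by bounces and samples per pixel and the gap to dedicated RT cores becomes obvious.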
You think AMD was left out in the cold? By Microsoft? On a technology they were showing to Intel, AMD, and NVidia over 3 years ago?
A technology that Microsoft DEMANDS to be present in AMD hardware for the XBox 2020 release?
A technology that Microsoft has ALREADY given away to anyone who wants to go read how it works in hardware or software?
A technology Microsoft took to Khronos, working with them and with NVidia's hardware, so there would be proper OSS support for the technology in Vulkan?
A technology that Microsoft is already running on AMD GPUs for XBox 2020...
A technology that Sony is already running, in Vulkan, on AMD GPUs....
Um, I don't know who your AMD source is, but I don't think they were being truthful.
Additional things...
AMD didn't have the hardware ready; it wasn't just the WinML/DXR acceleration technologies, their main cores were too far behind Turing. This is their GPU catch-up year, and next year should launch their GPUs like a rocket, possibly even doing the RT features faster and better than NVidia, as they have Microsoft's hardware team feeding back information on Microsoft's AMD GPU modifications. AMD was expected to have the features from the XBox One X GPU in this generation, and it didn't happen. By moving huge chunks of low-level calls from software into the silicon on the XBox One X, the performance jump is impressive, with developers reporting cases where 1000 calls become a single call.
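That draw-call consolidation point can be sketched with a toy cost model (the numbers below are made up for illustration, not measured XBox figures): fixed per-call overhead dominates when you issue a thousand small draws, and collapsing them into one batched/instanced call pays that overhead once.

```python
# Hypothetical costs, in microseconds, purely for illustration.
CALL_OVERHEAD_US = 10.0   # fixed driver/validation cost paid per API call
PER_OBJECT_US = 0.2       # GPU-side cost of actually drawing one object

def naive_draws(n):
    # One API call per object: the fixed overhead is paid n times.
    return n * (CALL_OVERHEAD_US + PER_OBJECT_US)

def batched_draw(n):
    # One instanced call covering all n objects: overhead paid once.
    return CALL_OVERHEAD_US + n * PER_OBJECT_US

print(naive_draws(1000), batched_draw(1000))
```

With these assumed numbers, 1000 individual calls cost ~10200 us of overhead-plus-work versus ~210 us for one batched call, which is the shape of the win you get when the consolidation happens in silicon instead of in the driver.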
AMD and Microsoft are weird allies, just as with Microsoft's relationship with NVidia. However, neither company has been restricted from Microsoft's technology, as the whole point of designing a software platform and the necessary hardware is to get as many companies as possible to implement it.
Giving away technology to support their platforms and industry-wide technologies is not something new for Microsoft. Even with DirectX, they have shared with everyone. Microsoft walked away from OpenGL because it didn't want to touch gaming or 3D GPU technologies. However, Microsoft gave key features of DirectX to NVidia to modify, and to OpenGL to use and modify, which also encouraged new hardware in these directions by giving OpenGL more support. (The shader languages we use today were created by Microsoft; NVidia created a variation for their hardware, and an exact copy of the Microsoft shader language was given to OpenGL to implement. Reference: user shader languages, late 90s.)
As a person who has wandered back and forth between designing OS technologies and the graphics and video industries since the late 80s, hybrid ray tracing is something to really look forward to, and it is brilliant.
This is the 'graphical' jump we need from almost-real to 'real' graphics, as right now we are still basically getting games that have been hitting the limits of 2012 technologies.
Even if it is just a flicker here, a less shiny surface there, a reflection from behind you, or a shimmer off the water with real caustics... These are tiny things, but they make a huge difference in realism.
Go look at the Doom RT demos and how real it looks in play at times with the RT effects, even with older textures and resolutions. Heck, look at blocky Minecraft with RT enabled; it looks more 'real' than anyone thought possible from just a few lighting/reflection/shadow effects.
Thanks for the response. I apologize for resurrecting this thread/post, but I don't always get back here every week.