AMD SVP Talks Radeon's Ray Tracing Response


bit_user

Polypheme
Ambassador

They have NVLink/NVLink2. With Volta, it gives them 150 GB/sec of bandwidth (each direction) vs. 100 GB/sec that AMD has said 7 nm Vega can support.

That said, I prefer AMD's PCIe-based approach to Nvidia's proprietary solution. But let's try to be realistic.


Hmm... Intel has EMIB and whatever Cascade Lake AP uses. Both Intel and Nvidia have shipped solutions using HBM/HMC. I don't know what else you'd need to see before believing they could go this direction, if they wanted to.


Are you fabricating this out of thin air? If not, then what's your source?

I disagree that Turing would be either faster or more energy-efficient as a multi-chip solution. The only upside of going that route would be cost.


Really? I thought their Kaby Lake-G processors used EMIB.
 

bit_user


Huh? If Turing can barely manage ray tracing at 1080p (according to rumors), what makes you think it was technically feasible to do before now - let alone 10 whole years ago?
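Some rough arithmetic shows why even 1080p is a stretch. The figures below are back-of-the-envelope assumptions, not measurements: 1 primary ray per pixel at 60 fps, compared against Nvidia's own marketing claim of roughly 10 Gigarays/s for high-end Turing.

```python
# Back-of-the-envelope: primary-ray budget at 1080p/60fps.
# The ~10 Gigarays/s figure is Nvidia's marketing claim for high-end
# Turing, not a measured number; real scenes also need shadow,
# reflection, and bounce rays, plus denoising.
width, height, fps = 1920, 1080, 60
rays_per_pixel = 1  # primary rays only; no bounces, no shadow rays

primary_rays_per_sec = width * height * fps * rays_per_pixel
print(f"{primary_rays_per_sec / 1e9:.2f} Gigarays/s")  # ~0.12
```

Primary rays alone fit comfortably in the claimed budget; it's the extra shadow/reflection rays per pixel and the incoherent memory access they cause that eat it up, which is consistent with 1080p being the practical ceiling.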
 

bit_user


Well, I think that's MS' rationale for introducing DXR. Also, there are equivalent Vulkan extensions being worked on.

My main concern is how broad Nvidia's patents are. Do they leave enough room for AMD to offer a competitive solution within those same APIs?
 

bit_user


Whoa. Just how far back are you going? Textures and shadows predated 3D hardware. The first-gen 3D gaming cards (and consoles!) supported textures. Half-decent shadows probably required multi-texturing, which came soon after.


Granted, shaders took a little while to happen, even though the idea goes way back.
 

bit_user


I doubt you could easily or cheaply separate it. A pure ray tracing chip would require a lot of memory bandwidth and fast communication with the main GPU, probably requiring it to be on the same card. And that card would probably cost even more than existing Turing GPUs.
 

No need for rumors anymore. DXR effects have been tested in Battlefield V by some tech review sites, and the results are most impressive...ly bad...

https://www.youtube.com/watch?v=SpZmH0_1gWQ

Want your new $800 RTX 2080 to perform like a $200 graphics card? Just set DXR effects to "Low".

4K, you say? How does a smooth 30fps average sound? Or with the effects on "High", you'll get closer to 20fps for that competitive edge. These marginally lower frame rates with RTX ON are truly a small price to pay for somewhat nicer reflections! And just think, once Tomb Raider gets RTX support added in, there will be raytraced shadows! Maybe with the next generation of cards there can be both at the same time!
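To put those framerates in frame-time terms, here's a quick conversion using the numbers quoted above (30 fps and 20 fps are the rough Battlefield V figures; exact results vary by scene and settings):

```python
# Convert the quoted framerates into per-frame budgets.
# 30/20 fps are the rough DXR figures cited above; 60 fps is
# the usual smoothness target for comparison.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available (or spent) per frame at a given fps."""
    return 1000.0 / fps

for label, fps in [("4K, DXR Low  (~30 fps)", 30),
                   ("4K, DXR High (~20 fps)", 20),
                   ("60 fps target", 60)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms/frame")
```

Dropping from 60 fps to 30 fps means each frame takes roughly 16.7 ms longer, which is the real cost the DXR effects impose, over and above everything else the renderer has to do.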