AMD SVP Talks Radeon's Ray Tracing Response

HAHA... this is a non-news article:

AMD: "we have a response to ray tracking, but were not going to tell you what that response is, how were going to provide it, or when it will be available." "However we are going to poopoo all over the idea of ray tracking to cover up the fact that we admittedly have nothing..."

Watch those stock prices fall.
 
I hope hardware ray tracing doesn't actually make it into AMD gaming cards. It is an entirely useless gimmick that's just gonna cost extra and remain unused by most games.
 
Gotta love the downvotes, but guess what, I TOLD YOU SO... AMD's stock is down 19.3 points since this morning's announcement, 8.25%!

Hey AMD, not having an answer to ray tracing and covering it up with a bunch of "we don't think the future is important" poopoo is bad news!
 
Dantte, you should really look at the stock market trend today; perhaps then you wouldn't come off as an Nvidia apologist. While in time it will be in cards, they jumped the gun on it. Basically they're saying: here it is, and if you want one of our high-end cards you have to pay for this feature, regardless of whether you have any games that will ever use it. They should have made a separate branch, like a 2080 RT or 2070 RT. Then those who wanted to jump into that tech could have, while the rest get the standard performance increase without the ridiculous increase in cost.
 
Dantte, I hope you enjoy your RTX card.... until it fails on you. Sincerely, a GTX 1080 owner who still thinks ray tracing is absolutely useless at this stage of development
 
Ray tracing is not a "breakthrough"; it's old tech. The reason it is not used in mainstream real-time rendering is hardware overhead. And guess what? Today's $1,000 Nvidia GPUs are still not fast enough to implement it practically, unlike when tessellation went mainstream. There is no reason for AMD to pursue ray tracing for at least 1-2 more architecture generations. Nvidia made a critical mathematical error in thinking this generation of their hardware was truly capable of using ray tracing practically. I don't really blame them; they were trying to be cutting edge in mainstream rendering but failed. Next gen should be better for sure, though.
 


Perhaps instead of calling it the 2080 RT they could call it the Titan V, as an example. Price it in a range where they can gouge developers who can actually use RT but don't necessarily need it in real time. They would love the product if it increased their productivity.

For some reason, whilst coming up with this unique idea, I'm feeling déjà vu.....
 
Waste of money, waste of time. Unless you are doing animation, there is no need for it. Look at God of War, Uncharted 4, and The Last of Us 2, and tell me ray tracing is absolutely required...

Let's be honest: it is not, and it is a propaganda tool.
 

I mean, I think it's too late now. Nvidia is going to pay- I mean, encourage developers to use real-time ray tracing in their titles. As a result, AMD will eventually be forced to burn a bunch of die space on a dedicated DXR-compliant RT block. It doesn't mean they need to rush headfirst into it, however. Even games that support some form of DXR don't require it, and won't for the foreseeable future. Frankly, even the RT hardware in the 2080 Ti isn't good enough... hybrid engines are only using it sparingly, and it still greatly hinders performance (compared to a purely rasterized title on the same GPU).
 
All hardware vendors are down today. There is a surplus of inventory at retailers. The market reacted accordingly.

Raytracing is actually a simpler method for achieving the same effects we get today. Deferred rendering needs tricks to produce the same results, making it much more complicated to implement. The trade-off, of course, is real-time performance. RTX isn't the first time I have heard of real-time ray tracing; it's been done for a decade. The drawback is the amount of resources it takes, significantly reducing what's left for other graphical elements. Dedicating hardware just to raytracing is dumb for a consumer card right now. It's a waste of resources for something that will probably be doable by the same compute cores the rest of the GPU is using.

When we do get real-time raytracing, it will probably use the same method Sony/Blender use with path tracing, as it would be the most efficient: shoot a ray out for each pixel and let it bounce until it reaches a light source, if one exists.
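To make that loop concrete, here's a minimal, self-contained C++ sketch of the one-ray-per-pixel idea. The toy scene (one spherical light, a mirror floor, ASCII output) and the hitsLight helper are made up purely for illustration; no shipping renderer works like this:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Does the ray from `o` along unit direction `d` hit the light sphere
// (radius 1, centered at (0, 0, -5))? Standard ray-sphere test; `b < 0`
// keeps only hits in front of the ray origin.
bool hitsLight(Vec3 o, Vec3 d) {
    Vec3 oc = {o.x, o.y, o.z + 5.0f};               // origin minus sphere center
    float b = oc.x * d.x + oc.y * d.y + oc.z * d.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - 1.0f;
    return b * b - c >= 0.0f && b < 0.0f;
}

int main() {
    const int W = 24, H = 12;                       // tiny ASCII framebuffer
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // One primary ray per pixel through a pinhole camera at the origin.
            float u = (x + 0.5f) / W * 2.0f - 1.0f;
            float v = 1.0f - (y + 0.5f) / H * 2.0f;
            float inv = 1.0f / std::sqrt(u * u + v * v + 1.0f);
            Vec3 o = {0.0f, 0.0f, 0.0f};
            Vec3 d = {u * inv, v * inv, -inv};

            char px = '.';
            if (hitsLight(o, d)) {
                px = '#';                           // ray reached the light directly
            } else if (d.y < 0.0f) {
                // One bounce off a mirror floor at y = -1, then retest:
                // "let it bounce to a light source if one exists".
                float t = (-1.0f - o.y) / d.y;
                Vec3 p = {o.x + t * d.x, -1.0f, o.z + t * d.z};
                Vec3 r = {d.x, -d.y, d.z};          // mirror reflection
                if (hitsLight(p, r)) px = '+';      // light reached via the bounce
            }
            std::putchar(px);
        }
        std::putchar('\n');
    }
    return 0;
}
```

A real path tracer would bounce many times off arbitrary geometry and accumulate surface colors along the way; the single mirror bounce here just shows the shape of the loop.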
 


Right. Because prior to Nvidia releasing their Turing cards, they were just as forthcoming as you're chiding AMD for being... no?
 

Nvidia actually supports this on their Pascal GPUs. According to their performance numbers a GTX 1080 Ti is only about 1/10th as fast as their RTX 2080 Ti (measured in gigarays/sec). That's the power of fixed-function hardware.

I do agree that it would be nice to find some kind of balance, like their Tensor cores, where the solution is just generic enough that you could possibly use it to accelerate structurally similar algorithms. Of course, that might already be the case but they just haven't yet publicized it (perhaps due to SDK support needing more time to mature).



This provides only some of the benefits available with Nvidia's current technology. Some of the most powerful and difficult to emulate effects supported by ray tracing are lighting-related. Indirect lighting and caustics, in particular. Also, soft shadows are easier to implement in ways that play well with other rendering techniques. Even motion blur and depth of field would require more than one ray per pixel.

About the only things you can get with one ray per pixel that are any better than traditional rasterizers are accurate reflections and refraction. However, as both are quite susceptible to aliasing, even then you'd want to supersample.
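For illustration, here's a minimal C++ sketch of that supersampling. The checker-pattern shade() is a hypothetical stand-in for a real per-ray evaluation (e.g. tracing a reflection); the jittered samples and the averaging are the point:

```cpp
#include <cstdio>
#include <random>

// Hypothetical per-ray result: a hard-edged checker pattern standing in
// for real geometry. Sampled once per pixel, it aliases into hard 0/1 steps.
float shade(float u, float v) {
    return ((int)(u * 13.0f) + (int)(v * 13.0f)) % 2 ? 1.0f : 0.0f;
}

// Average `samples` jittered rays inside pixel (x, y). Each sample
// perturbs the ray's position within the pixel footprint.
float renderPixel(int x, int y, int width, int height, int samples) {
    static std::mt19937 rng{42};
    std::uniform_real_distribution<float> jitter(0.0f, 1.0f);
    float sum = 0.0f;
    for (int s = 0; s < samples; ++s) {
        float u = (x + jitter(rng)) / width;
        float v = (y + jitter(rng)) / height;
        sum += shade(u, v);
    }
    return sum / samples;   // fractional coverage instead of a hard edge
}

int main() {
    // 1 sample per pixel vs. 16: an edge pixel goes from an arbitrary
    // 0-or-1 flip to a smooth fractional value.
    for (int spp : {1, 16})
        std::printf("spp=%2d  pixel(3,5) = %.3f\n",
                    spp, renderPixel(3, 5, 8, 8, spp));
    return 0;
}
```

The cost is linear in the sample count, which is exactly why effects needing many rays per pixel are so expensive on today's hardware.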
 
As with tensor cores, I think AMD was also caught off-guard by Nvidia's RT cores. If so, it will take at least two generations before they have a chip that entered the design pipeline early enough to include a response.
 


Nvidia is actually the one who will be playing catch-up... because they have no Infinity Fabric. We saw it during the presentation: the second generation of Infinity Fabric is now part of the GPU architecture.

Chiplets are the future, and Intel/Nvidia are not even at the starting line.

Turing could have been done with Infinity Fabric: easier, cheaper, and faster. Nvidia is about 3-4 years away from their first chiplet design; Intel, at least 2 years from a first EMIB implementation.

Ray tracing is absolutely nothing for the game industry. It is just a cool concept... for compute, however...

 
Wouldn't it be wiser if they just built an RT chip into the monitor, or made a PCI card with RT, so you wouldn't have to buy a whole RT video card?
 
Textures in 3D games were a gimmick too, way too cost-intensive in the beginning.
Shadows in 3D games were a gimmick too, way too cost-intensive in the beginning.
Shaders in 3D games were a gimmick too, way too cost-intensive in the beginning.

Raytracing will be the same way: it starts off as a brave first step that is way too cost-intensive to be fully used, but eventually that brave step will pay off. I hope AMD won't be too late to the game.
 
Ray tracing is the future and we should embrace it. However, AMD and Nvidia need to recognize that developers are the key to us consumers embracing ray tracing. It's important that the two of them come up with a solution that works on both their products so that developers see a larger install base to build towards. I don't think separate-but-equal solutions are the best play here, especially with Intel intending to disrupt the GPU market in the next few years.
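For what it's worth, that common target already exists in the form of DXR: a game asks Direct3D 12 whether the installed GPU supports raytracing instead of coding to a vendor. A minimal sketch of the standard feature check, assuming an already-created ID3D12Device (Windows/D3D12 only):

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask D3D12 whether the GPU, whoever made it, exposes DXR.
bool SupportsRaytracing(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // TIER_NOT_SUPPORTED: fall back to pure rasterization.
    // TIER_1_0 or above: the DXR path works, on AMD, Nvidia, or Intel alike.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

A title shipping a check like this can offer the same raytraced effects on any vendor's hardware that reports the tier, which is what grows the install base developers care about.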
 
Anyone who says ray tracing is a gimmick has no idea how 3D graphics work. This is a step we should have taken 10 years ago. I fully support Nvidia in this step and I can't wait to see the games that embrace it! AMD/ATI are late to the game again. I do hope they catch up, though.
 
Heh... AMD used to be ahead of its time, implementing features that become useful only much later, while Nvidia used to optimize for the current DX architecture. So it's Nvidia's turn to play the waiting game.
 
Ray tracing is the future, but it will take a while for it to become mainstream. It's hard to see it getting a lot of support unless the next-gen game consoles support it. I see a lot of console ports that can't even hit 60 FPS at higher resolutions.