kristoferstoll:
Give me a break... This author is living in LaLaLand if he actually believes any of this crap he wrote. AMD is so far behind Nvidia right now that there is NO HOPE that their next GPU could possibly compete with the RTX series. They would be lucky to beat the GTX 1080 Ti, tbh... And unless they have put serious development into ray tracing, they have no hope there either... Nvidia has brought us a decade ahead where ray tracing is concerned, and they did it with VERY custom hardware: Tensor Cores and RT cores... Where has anyone seen that AMD has anything like that in their pipeline? Nowhere, that's where. I get it, the RTX series is expensive... It also has very cutting-edge technology that is actually a deal for what it is... The Titan V was $3,000, and the RTX 2080 Ti is 4.2x faster at ray tracing and 10-15% faster in rasterized gaming... It's expensive, true. But look at what it has brought to the table. People forget very quickly that before RTX, ray tracing was still 10-15 years away...
Thanks for your feedback, Kris. I wanted to reply and set a few things straight for the record. For starters, I am not from LaLaLand - I hail from a famous shore town on the grizzled coast of New Jersey (I can literally see Dr. Weird's laboratory from the nearby beach). Although it can sometimes feel like La La Land (the movie with Ryan Gosling and Emma Stone dancing and singing everywhere) with all the local theater companies and high school children skipping and flash-mobbing about town singing musical numbers, it is a relatively sane place of origin.
To your points on the article: I do believe the things I wrote (despite my geographical location), and I tried to provide tangible context for my assertions in the form of linked articles on the various topics I touched on. If I wasn't clear in the article, I'd like to clarify that AMD's current flagship, the Radeon RX Vega 64, is a direct competitor to the GeForce GTX 1080 (check out our GPU Hierarchy chart). The RTX 2080 is competitive with the GTX 1080 Ti in rasterized game performance (check out our review). It is completely reasonable to hypothesize that, with a smaller lithography and an improved architecture, AMD could close the performance gap with its next-gen graphics products.
As far as ray tracing is concerned, AMD has been a prominent figure in the professional market with Radeon Rays (formerly AMD FireRays) for some time now (I can't source its original launch date, but anything old enough to be referred to as "formerly known as" has likely been around for a few years). It was designed for workstation content creators (ray tracing scenes for film and PC games), but with recent upgrades to the software (which I detailed in the article), it seems to be moving in a direction where it could be adapted for real-time ray tracing in PC games.
To say AMD has "no hope" with ray tracing, that ray tracing in games was otherwise "10-15 years away," or that AMD is "so far behind Nvidia" in performance is outright inaccurate. Just because AMD's top-end GPU stops at a certain performance and price point in Nvidia's product stack doesn't mean AMD is far behind; it simply means AMD isn't focused on those price points. That may change because of Nvidia's price hike for its top-end card (the RTX 2080 Ti costs almost double the previous-gen 1080 Ti), but AMD has traditionally been competitive in performance and price in every market segment below $500 (original MSRPs), albeit somewhat late to the party, playing catch-up to Nvidia.
To your point on Tensor and RT cores... you're not wrong. It is cutting-edge technology. No other company has brought a consumer gaming GPU to market with proprietary co-processors (a gross oversimplification of what they actually are, yes, but follow me here) that can perform ray tracing and deep learning algorithms in parallel with traditional rasterization. I am in no way taking away from that achievement when I say that ray tracing isn't a thing (yet). It simply needs to take off in other areas (more games with both the DLSS and RTX features; getting featured in a mainstream console would also help widen adoption) before the hardware is worth the purchase for a large portion of the PC gaming population. Ray tracing is no doubt cool, but it's just not a priority for the majority of PC gamers... yet.
I think the reason you haven't seen something like that in AMD's pipeline is that the company tends to keep its features open source.
Two quick comparisons:
1) Nvidia's G-Sync requires displays outfitted with a proprietary chip to function; AMD's variable refresh rate technology, FreeSync, is royalty-free and built on an open standard, and compatible monitors tend to be far less expensive because manufacturers don't have to buy a special co-processor to support it.
2) You need a Turing or Volta GPU for ray tracing Nvidia's way, and those cards are more expensive than previous GeForce offerings largely because of those proprietary features; Radeon Rays is open source and runs on the OpenCL 1.2 standard, so it doesn't need anything but the GPU's shaders (see the quick sketch below).
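To make that second point concrete: the core math of ray tracing is just arithmetic that ordinary shader cores can execute. Here's a minimal, hypothetical OpenCL C kernel (my own toy illustration, not Radeon Rays' actual code or API) that tests one primary ray per work-item against a single sphere using nothing but standard OpenCL 1.2 built-ins:

    /* Toy example only -- not Radeon Rays' real API. One work-item = one ray.
       Plain OpenCL 1.2: no RT cores, no Tensor cores, just shader math. */
    __kernel void trace_sphere(__global const float4 *origins,  /* xyz = ray origin */
                               __global const float4 *dirs,     /* xyz = unit ray direction */
                               const float4 sphere,             /* xyz = center, w = radius */
                               __global float *hit_t)           /* distance to hit, or -1 */
    {
        size_t i = get_global_id(0);

        /* Move the ray into the sphere's local frame. */
        float3 o = origins[i].xyz - sphere.xyz;
        float3 d = dirs[i].xyz;

        /* Solve |o + t*d|^2 = r^2, a quadratic in t (assumes |d| = 1). */
        float b    = dot(o, d);
        float c    = dot(o, o) - sphere.w * sphere.w;
        float disc = b * b - c;

        /* Negative discriminant = miss; otherwise report the nearer root. */
        hit_t[i] = (disc < 0.0f) ? -1.0f : (-b - sqrt(disc));
    }

A real tracer adds acceleration structures, shading, and bounces (the heavy lifting a library like Radeon Rays handles for you), but none of it strictly requires fixed-function hardware to run.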
I'm not saying either implementation of those features is right, wrong, or better than the other. Nvidia is clearly ahead of the pack for real-time ray tracing, but AMD has a few cards (both proverbial and literal graphics cards) up its sleeve should it decide to go that route as well. The possibility of ray tracing taking off is wonderful for PC gaming enthusiasts. But wouldn't it be even more wonderful if there were two companies in the game?
Now that RTX is available and we can see where it lands against the previous-generation cards in traditional gaming performance and pricing, it's not outlandish to conjecture that AMD, given the company's documented technologies and practices and what we know about its currently available GPUs (everything I detailed in the article), could compete in rasterized game performance at multiple price points with its next-gen graphics tech.
After that arrives, Intel will likely have something to say about discrete graphics cards, too. The cycle never ends.
Yes... my article depends almost entirely on the words "if," "could," and "possibly." I was quite clear on that. It all boils down to what AMD brings to market and how it competes. But the company wasn't doing a terrible job of competing on price and performance in the GPU market to begin with (power consumption and heat, we know, could definitely be better for Radeon cards); it just has less market share. Nvidia's focus on ray tracing (which contributes to the high price and only moderate gains for rasterized games) is an opportunity for AMD to catch up. All I did was lay out how and why it could happen.
Thanks for reading!