News AMD Navi Graphics Cards May Be Cheaper Than We Thought

TCA_ChinChin

Honorable
Feb 15, 2015
423
105
11,090
31
This makes things more competitive, but it's still not really a win for AMD. Turing has been out for a while now, and even if RT cores aren't really as useful as Nvidia makes them out to be, they are still an extra compared to Navi. At least it seems that AMD's power efficiency isn't 2 generations behind like it was, and there are finally more (AMD) choices for a potential customer. It's certainly different: the previous generation was an obvious price-to-performance leader compared to Nvidia, while this one tries to match.
 
Reactions: panathas

tennis2

Honorable
Nvidia lowers its price/performance points a week ahead of the AMD GPU launch.
AMD in turn lowers its price/performance points to match.

...shocker....

Nvidia has been without real competition with their Turing cards, so they're pushing prices as high as the market will tolerate. Once competition (imminently) appears, they can lower pricing without much pain. AMD, having less market share than Nvidia and being behind on power consumption, needs to always sit below Nvidia on the price/performance curve, or they won't sell any product. Therefore, AMD looks to Nvidia to set GPU prices, and has to conform to that structure at slightly lower pricing.
 

InvalidError

Titan
Moderator
The RX 5700 made no sense at the original launch prices. I knew AMD would be compelled to drop prices shortly after launch, so I wouldn't be surprised if it preemptively dropped launch prices to avoid backlash from screwing people over on pricing almost immediately at launch. Still not convinced $350 is low enough for the bottom SKU, probably going to get another $50 drop within months from launch.

AMD should spend less effort on repeatedly shuffling its GPU brands to confuse people about the product stacks and pricing structures (so it can attempt to raise prices some more without people raising hell over it), and more effort on actually getting people to buy its stuff.
 

alextheblue

Distinguished
Apr 3, 2001
3,078
106
20,970
2
Lower each one by another $75, and I'm all in! :) (My GTX1060 will keep on 'chugging' until I find a GTX1070-equivalent for $250!)
The 1660 Ti is already pretty close at $279. Trades blows with the 1070. But I don't know if it's worth making a jump like that - personally I'd wait to upgrade until you've saved up a little more, and hopefully the price war shuffles a 5700 or 2060 Super down to $300.
Still not convinced $350 is low enough for the bottom SKU
What are you basing that on? If it edges out the same-priced 2060 Super (which already is a MUCH better value than the old 2060) then I'd say the price is good. IF - as in, waiting for the reviewseses.
This makes things more competitive, but it's still not really a win for AMD.
In the business of selling widgets, what makes it a win or loss depends quite a bit on how many widgets you sell. Obviously that is going to be heavily influenced by price and performance, but my point is that whether or not this is a "win" for AMD is something that remains to be seen.
 
Reactions: TJ Hooker

InvalidError

Titan
Moderator
What are you basing that on? If it edges out the same-priced 2060 Super (which already is a MUCH better value than the old 2060) then I'd say the price is good. IF - as in, waiting for the reviewseses.
Simple marketing: Nvidia has a much stronger brand and much stronger features, which means for most average buyers, AMD is a no-go for similar features at similar prices. Add Navi's lack of ray-tracing to the equation and it becomes a tough sell.

Reviewers review products at launch only once. If the launch reviews say the pricing makes no sense for the performance and features, your launch is f'd and your sales will suffer. AMD messed up pricing on the RX570/580, and now it is struggling to get rid of them for $120-180 including free games or an Xbox pass bundle that makes the GPU itself nearly free if you care about the add-ons. AMD is practically writing those GPUs off at this point. If AMD messes up Navi's pricing, something similar may happen again.
 
Reactions: TCA_ChinChin

alextheblue

Distinguished
Simple marketing: Nvidia has a much stronger brand and much stronger features, which means for most average buyers, AMD is a no-go for similar features at similar prices. Add Navi's lack of ray-tracing to the equation and it becomes a tough sell.

Reviewers review products at launch only once. If the launch reviews say the pricing makes no sense for the performance and features, your launch is f'd and your sales will suffer. AMD messed up pricing on the RX570/580, and now it is struggling to get rid of them for $120-180 including free games or an Xbox pass bundle that makes the GPU itself nearly free if you care about the add-ons. AMD is practically writing those GPUs off at this point. If AMD messes up Navi's pricing, something similar may happen again.
Raytracing is an overblown feature at this stage. Games that use it heavily suffer a heavy performance hit, games that use it lightly might as well not bother (well, other than the Dev Cash they get). This is particularly true at the 2070 and under level. Even most of the fanboys seem underwhelmed by RT. The next generation or two should greatly improve capability, performance, and see more widespread use (maybe without even paying devs to use it). Hybrid rendering is definitely the future.

Nvidia's "stronger brand" isn't really as massive of a deal as you imply. Intel had an overwhelmingly strong brand pre-Ryzen (and they still do) - before Zen landed everybody and their mother recommended Intel everywhere but the lowest end. AMD's CPUs were even less popular than their GPUs are now. Ryzen's strong performance per dollar (despite not generally seizing the crown in the first two go-arounds) and positive reviews allowed them to turn it around. Much like Intel, Nvidia's name absolutely buys them a premium, but only to a point.

So that goes back to pricing vs performance... as we have no performance data, your opinion isn't grounded in hard data. If they offer better performance for the money, they'll get positive reviews, and they'll be OK. If they don't, they either cut price fast and promote hard, or suffer. But again, there's no way to know what the correct price IS, without first knowing performance across a broad swath of games.
 
Last edited:
Reactions: TJ Hooker

InvalidError

Titan
Moderator
Nvidia's "stronger brand" isn't really as massive of a deal as you imply.
It is when the other options have no compelling benefits. RTX may not perform particularly well on first-gen hardware, but it is still one more checkbox people can play with out of curiosity. Given the choice between two similarly-priced and similarly-performing products, I'd rather have the one with extra features to play with.

As far as prices are concerned, they are way too effin' high on both sides as far as I am concerned. I want the mid-range back to $200-250.
 

alextheblue

Distinguished
It is when the other options have no compelling benefits.
Price is a compelling benefit, when coupled with equal or better performance.
As far as prices are concerned, they are way too effin' high on both sides as far as I am concerned. I want the mid-range back to $200-250.
Agreed. The midrange has shifted upwards too much. Although I don't know who to blame more, the GPU manufacturers and card vendors, or the game devs. They keep pushing to new graphical heights, but most of the engines don't scale back DOWN very well. So if you're playing on lesser hardware, and you tweak settings down, many games both run and look worse than older titles. There are some exceptions, well-crafted games that have eyecandy options but still offer a decent-looking and fast baseline.

At any rate this trend is going to push me into a $300+ purchase, I'm pretty sure. As much as I hate it. I won't even spend that much on a display.
 

InvalidError

Titan
Moderator
Price is a compelling benefit, when coupled with equal or better performance
The RX570, RX580, Vega56 and Vega64 beg to disagree. If you fail to capture mindshare, it doesn't matter how much faster your product is compared to the competition at a given price point, it'll stay on the shelves even after getting discounted to oblivion.
 

GetSmart

Prominent
Jun 17, 2019
167
38
610
0
AMD's Navi is going to be a hard sell since it lacks many of the next-generation features that NVidia's new Turing GPUs have (and, to some extent, Intel's new Gen11 integrated GPUs have as well): tiled rendering, variable rate shading (known as coarse pixel shading on Intel's new Gen11) and of course ray tracing. Besides ray tracing, others like variable rate shading are already being implemented in games like Wolfenstein II: The New Colossus.
 
Last edited:
Still not convinced $350 is low enough for the bottom SKU, probably going to get another $50 drop within months from launch.
That implies the 5700 is going to be their lowest-end part, which seems rather unlikely. I suspect we'll be seeing launches for a 5600, 5500 and so on over the coming months to fill in the lower price-points. They're likely starting with the higher-end to give their lower-end cards a bit more time to clear out, just as Nvidia did last fall.

AMD messed up pricing on the RX570/580, and now it is struggling to get rid of them for $120-180 including free games or an Xbox pass bundle that makes the GPU itself nearly free if you care about the add-ons.
The launch prices of the RX 570 and 580 were fine. The cards were just a minor update to their RX 400 series counterparts, which had offered excellent value at the time they came out just 9 months prior. The 500 series might not have pushed performance-per-dollar any further, but I don't think anyone really expected much from a refresh so soon.

Unfortunately, the 500 series cards launched just as cryptocurrency mining was ramping up, and because they offered better compute performance than the competition, prices only moved upward after launch, and it took more than a year for them to get back down to where they should have been. The same happened with Vega. By the time mining subsided and prices worked their way back down to purchasable levels, those cards were already over a year old, and I think many just decided to hold out for the next generation. The glut of leftover cards was probably mostly down to the collapse of the mining market, and clearly affected Nvidia too, which is likely why it took them two and a half years to begin launching a new generation of cards themselves.

As for bundled games, I doubt those cost AMD all that much. The Xbox game pass in particular is only 3 months, and can be thought of as an extended trial of a service that is currently not all that expensive to begin with. I suspect they get bargain-bin pricing on the other pack-in games as well.

They keep pushing to new graphical heights, but most of the engines don't scale back DOWN very well.
Sure they do. Just about any current games can get decent frame rates with max settings at 1080p resolution on a $200-$250 graphics card, and with settings dialed back a bit, even on significantly less expensive hardware. 1080p is arguably still "mid-range", and if one wants higher resolutions or refresh rates, they will naturally be looking at somewhat higher-end hardware around the $300+ range to accomplish that well.

If anything, most games don't scale UP all that well. Those buying high-end graphics card get a sharper image and/or moderately smoother frame rates than those buying a $200 graphics card, and that's about it. Developers have been designing their games to perform well on mid-range hardware and consoles, so high refresh rates and resolutions are about the only thing differentiating the high-end. I suppose that could actually be a good argument in favor of raytraced lighting effects, as they can potentially provide a more substantial benefit to visuals than just rendering more pixels or frames, at least past a certain point. If the upcoming generation of consoles pushes games to improve their visuals, we may see games making greater use of available PC hardware, and more reason for "mid-range" cards to offer more performance.
 

InvalidError

Titan
Moderator
That implies the 5700 is going to be their lowest-end part, which seems rather unlikely. I suspect we'll be seeing launches for a 5600, 5500 and so on over the coming months to fill in the lower price-points.
Nvidia has cards at lower price points too. AMD launching lower-end GPUs with similar prices, similar performance, a similar brand-recognition deficit and a similar marketable-feature deficit won't help any more than it will for the 5700.
 
Nvidia has cards at lower price points too. AMD launching lower-end GPUs with similar prices, similar performance, a similar brand-recognition deficit and a similar marketable-feature deficit won't help any more than it will for the 5700.
I guess that depends on how they perform, which we should have plenty of data for very soon. The initial prices that were announced did not impress me, but the updated pricing is arguably not too bad, assuming the performance is there. If the 5700 performs similar to a 2060 Super in most games for $50 less, that could make it a compelling option. The same goes for the 5700 XT. When the cards were revealed, I said they should have been priced $50 lower to be competitive, and we got that with the 5700 XT, and are at least most of the way there with the 5700.

Sure, you miss out on hardware raytracing support, but that's still something of an unknown variable at this point, even more than nine months after the 20-series launched. There are still only three proper games that utilize the hardware, and the performance hit is quite large relative to the improvement in visuals. There are promises of some other upcoming games supporting raytracing, but there's no telling how it will perform and how much benefit it will provide in those titles. Or even if the current RTX hardware will be enough to handle raytraced effects in games launching a couple years down the line.

And of course, that feature is not even available on Nvidia's cards in the sub-$300 range, so Nvidia doesn't currently have that going for them there. If AMD offers performance not far behind the original RTX 2070 for $350, it seems like they could also potentially offer performance not that far behind an RTX 2060 for around $250, so I wouldn't dismiss their unannounced mid-range offerings yet.

It also seems like the efficiency of AMD's new cards should be relatively close to that of Nvidia's, at least until Nvidia moves to a new process node in another year or so.
 
Reactions: alextheblue

GetSmart

Prominent
Sure, you miss out on hardware raytracing support, but that's still something of an unknown variable at this point, even more than nine months after the 20-series launched. There are still only three proper games that utilize the hardware, and the performance hit is quite large relative to the improvement in visuals. There are promises of some other upcoming games supporting raytracing, but there's no telling how it will perform and how much benefit it will provide in those titles. Or even if the current RTX hardware will be enough to handle raytraced effects in games launching a couple years down the line.
Actually, there are more than just three games that support ray tracing, with more in the pipeline such as Watch Dogs: Legion, Cyberpunk 2077 and DOOM Eternal. Looking at the quick rate of ray tracing implementation in games currently, it is going to be the norm in future games. Those next-generation features such as ray tracing (and variable rate shading) can become a rather big differentiator, since potential buyers will have the option to test and play around with them.
 
Last edited:
Actually, there are more than just three games that support ray tracing, with more in the pipeline such as Watch Dogs: Legion, Cyberpunk 2077 and DOOM Eternal.
Yes, but those games are all many months away from release, with two of the three coming next year, and as I said, "there's no telling how it will perform and how much benefit it will provide in those titles." As far as major game releases go, there's only Battlefield V, Metro Exodus and Shadow of the Tomb Raider, and in those games, raytracing performance tends to be a bit questionable, especially on the cards around this price range.

As for the list on that site, they include Assetto Corsa Competizione as being "released", but as far as I know, RTX support isn't in the game yet. And the other "released" game is a small early access indie title described in user reviews as lacking content and being in a very early state of development. Oh, and apparently it only manages around 30fps at 1080p with raytracing enabled. I guess there's Quake II RTX as well, but considering it's a game from the 90s that looks quite outdated from a graphics standpoint even with raytracing enabled, the poor performance is not at all justifiable. It's more a tech demo than anything.

Perhaps there may be a decent number of good games that support raytracing next year. There may also be new graphics cards capable of performing raytracing without bottoming out performance next year as well. Raytraced lighting effects might look a bit nicer, but the large performance hit on this first-generation hardware arguably prevents it from being a must-have feature quite yet.
 
Reactions: alextheblue

GetSmart

Prominent
Yes, but those games are all many months away from release, with two of the three coming next year, and as I said, "there's no telling how it will perform and how much benefit it will provide in those titles." As far as major game releases go, there's only Battlefield V, Metro Exodus and Shadow of the Tomb Raider, and in those games, raytracing performance tends to be a bit questionable, especially on the cards around this price range.
There are already reviews with ray tracing on the older NVidia GeForce RTX 2070, such as Shadow of the Tomb Raider and Battlefield V, which often show very decent playable frame rates (above 60 frames per second).

As for the list on that site, they include Assetto Corsa Competizione as being "released", but as far as I know, RTX support isn't in the game yet. And the other "released" game is a small early access indie title described in user reviews as lacking content and being in a very early state of development. Oh, and apparently it only manages around 30fps at 1080p with raytracing enabled. I guess there's Quake II RTX as well, but considering it's a game from the 90s that looks quite outdated from a graphics standpoint even with raytracing enabled, the poor performance is not at all justifiable. It's more a tech demo than anything.
There are others on that list that have shown impressive performance (greater than 30 frames per second) with ray tracing enabled, like Metro Exodus and Control. Quake II RTX is more of a demonstration piece to show how easily ray tracing can be implemented into existing game engines, including older ones.

Perhaps there may be a decent number of good games that support raytracing next year. There may also be new graphics cards capable of performing raytracing without bottoming out performance next year as well. Raytraced lighting effects might look a bit nicer, but the large performance hit on this first-generation hardware arguably prevents it from being a must-have feature quite yet.
As mentioned above, you can still get decent playable frame rates with ray tracing enabled on current NVidia GeForce RTX 2000 series GPUs, thus it's a nice optional feature to have (for potential buyers to play with) and will probably be a must-have feature (for all new-generation GPUs). The only barrier now is pricing, since many of those NVidia GeForce RTX 2000 series GPUs are in the upper range ($399 and above).
 
Last edited:

alextheblue

Distinguished
The RX570, RX580, Vega56 and Vega64 beg to disagree. If you fail to capture mindshare, it doesn't matter how much faster your product is compared to the competition at a given price point, it'll stay on the shelves even after getting discounted to oblivion.
Vega was never really price-competitive, hamstrung by a large die AND HBM, right after the GPU mining craze crashed prices. The price floor for them was too high at the newly restored pricing. The RX570/80, on the other hand, ended up in a lot of systems; I think sales were pretty decent actually. But that's one GPU die that was somewhat competitive in a huge market. It wasn't enough, plus they didn't have the performance to compete in the New Mid Range; they are entry-to-mid at best, and they have stiff competition. If you look at desktop discrete graphics sales only, I think the 570/580 did well in the entry-mid segment they occupied. AMD was really getting murdered everywhere else. Navi brings them up a segment again, but this time they're priced competitively at 10-11% higher performance. The smaller die and conventional memory give them space if Nvidia drops prices. I'd still absolutely prefer them a little cheaper just for my own good, but I think they'll do OK.
AMD's Navi is going to be a hard sell since it lacks many of the next-generation features that NVidia's new Turing GPUs have (and, to some extent, Intel's new Gen11 integrated GPUs have as well): tiled rendering, variable rate shading (known as coarse pixel shading on Intel's new Gen11) and of course ray tracing. Besides ray tracing, others like variable rate shading are already being implemented in games like Wolfenstein II: The New Colossus.
VRS should be interesting, especially for VR. But uh, "next generation future technological features"? Raytracing is ancient tech, it's just now that we're able to really start harnessing it - and if you use it heavily it STILL cripples performance. We're stuck with hybrid rendering for the foreseeable future. Tiled rendering? Hello PowerVR? I still own a tile-based renderer in my Dreamcast, and that's not even the earliest example.
If anything, most games don't scale UP all that well. Those buying high-end graphics card get a sharper image and/or moderately smoother frame rates than those buying a $200 graphics card, and that's about it.
I was referring more to older / low-end hardware and APU-tier graphics.
There are already reviews with ray tracing on the older NVidia GeForce RTX 2070, such as Shadow of the Tomb Raider and Battlefield V, which often show very decent playable frame rates (above 60 frames per second).
At what resolution and settings? 60hz displays are also not exactly gaming grade. If you get 100+ FPS at the same settings and res with RT disabled, that's a BIG hit. There's also the fact that the visual results of the RT are different in every game. Some usage is subtle, others more dramatic but costly. There's no single answer.
 
Last edited:

GetSmart

Prominent
VRS should be interesting, especially for VR. But uh, "next generation future technological features"? Raytracing is ancient tech, it's just now that we're able to really start harnessing it - and if you use it heavily it STILL cripples performance. We're stuck with hybrid rendering for the foreseeable future.
Ray tracing has been around for a very long time, usually done on CPUs, but that is not fully real-time rendering since it is very CPU-intensive, hence render farms were often used. However, implementing ray tracing in hardware for real-time rendering is a very recent technological step in GPU development. Dedicated ray tracing hardware will always be much faster than hybrid rendering (a combination of software and hardware shaders). This example, Pascal Ray Tracing Tested! GTX 1080 Ti vs RTX 2080/RTX 2060 + More, nicely illustrates the differences. Dedicated ray tracing hardware can maintain better-than-playable frame rates.

Tiled rendering? Hello PowerVR? I still own a tile-based renderer in my Dreamcast, and that's not even the earliest example.
There are several types of tiled rendering, but the ones implemented in both NVidia's new-generation GPUs and Intel's new Gen11 integrated GPUs are much more advanced than the old ones (originally developed by NEC), mainly to increase performance and power efficiency. That is also why this method is often used in ultra-low-power mobile GPUs such as ARM's Mali, Qualcomm's Adreno and of course Imagination's PowerVR (which is still alive under a different company).

At what resolution and settings? 60hz displays are also not exactly gaming grade. If you get 100+ FPS at the same settings and res with RT disabled, that's a BIG hit. There's also the fact that the visual results of the RT are different in every game. Some usage is subtle, others more dramatic but costly. There's no single answer.
The resolution (up to 4K) and settings (the ultra preset) are in those links. Of course the frame rates will take a hit, but it does not turn into a slideshow, and in fact (as mentioned above) you can still get decent playable frame rates with ray tracing enabled.
 
There are already reviews with ray tracing on the older NVidia GeForce RTX 2070, such as Shadow of the Tomb Raider and Battlefield V, which often show very decent playable frame rates (above 60 frames per second).
What I'm saying is, it's questionable whether you would want to nearly halve your frame rates to enable these effects. Even going by the review you linked to, in Shadow of the Tomb Raider the 2070 Super dropped from an average of 89fps with standard "high quality" shadows, down to 52fps with raytraced shadows enabled at 1440p. And that's just on average. They don't have minimums listed, but they should be lower still, undoubtedly making performance noticeably choppy at times. For such a relatively subtle effect, it doesn't seem worth it. And in Battlefield V, average performance dropped from 96fps down to 50fps with raytraced reflections enabled at 1440p. In a multiplayer shooter, that's arguably unacceptable for some moderately nicer reflection effects. And that's with a card priced $100 higher than a 5700 XT. At this resolution, you are looking at average frame rates in the 40s for the 2060 Super, and in the 30s for the 2060. 1080p is somewhat more usable, but I suspect that most of those buying a graphics card in this price range are either targeting a higher resolution, or high refresh rates.

It's certainly not a bad extra feature for those willing to cut resolution or put up with low frame rates, but the question becomes whether it's worth paying an extra $50+ for, or alternatively, whether it's worth getting around 10% less performance on average than competing hardware available at the same price when the feature isn't being used. If these cards had around double the RT cores and could push 60fps at 1440p with raytracing enabled in these games, I would say sure. If I had to guess, next year's cards probably will. As for the current RTX cards, while they may feature hardware raytracing support, it's not powerful enough to keep you from questioning whether it's even worth enabling in the handful of games that currently make use of it, and it's possible that the performance hit could be even larger in games targeting faster RT hardware a couple years from now.
 

InvalidError

Titan
Moderator
As for the current RTX cards, while they may feature hardware raytracing support, it's not powerful enough to keep you from questioning whether it's even worth enabling in the handful of games that currently make use of it, and it's possible that the performance hit could be even larger in games targeting faster RT hardware a couple years from now.
The more likely outcome is that games will implement less compute-intensive RT-enhanced effects instead of brute-force RT. Things like smoke clouds could really use more realistic lighting, and RT for cloud lighting, highlights and shadows does not need to run at anywhere near photorealistic resolution to still make them look more natural.
 

GetSmart

Prominent
What I'm saying is, it's questionable whether you would want to nearly halve your frame rates to enable these effects. Even going by the review you linked to, in Shadow of the Tomb Raider the 2070 Super dropped from an average of 89fps with standard "high quality" shadows, down to 52fps with raytraced shadows enabled at 1440p. And that's just on average. They don't have minimums listed, but they should be lower still, undoubtedly making performance noticeably choppy at times. For such a relatively subtle effect, it doesn't seem worth it. And in Battlefield V, average performance dropped from 96fps down to 50fps with raytraced reflections enabled at 1440p. In a multiplayer shooter, that's arguably unacceptable for some moderately nicer reflection effects. And that's with a card priced $100 higher than a 5700 XT. At this resolution, you are looking at average frame rates in the 40s for the 2060 Super, and in the 30s for the 2060. 1080p is somewhat more usable, but I suspect that most of those buying a graphics card in this price range are either targeting a higher resolution, or high refresh rates.
The ray tracing feature is an option that allows users to choose between high frame rates and more realistic visual quality. As for the minimum frame rates, you can check here: NVIDIA GeForce RTX Ray Tracing In Battlefield V Explored Pre And Post Patch, which again shows better-than-playable frame rates. Heck, even certain AAA games on current-generation consoles have much lower frame rates than those, such as a fixed "cinematic" 24 to 30 frames per second (and without ray tracing). And the ray tracing effects are often not subtle (and are sometimes useful). A quote from that article:
For really skilled players, however, enabling RTX may even give an unfair advantage because they may be able to see reflections of other approaching players while hiding in cover. This point will be up for argument of course, but we felt it was worth mentioning nonetheless. At the very least, there is no denying that RTX clearly enhances the visual impact of the game and is a welcome step forward in in-game realism.
With ray tracing, that will be a whole new level of gaming altogether.

It's certainly not a bad extra feature for those willing to cut resolution or put up with low frame rates, but the question becomes whether it's worth paying an extra $50+ for, or alternatively, whether it's worth getting around 10% less performance on average than competing hardware available at the same price when the feature isn't being used. If these cards had around double the RT cores and could push 60fps at 1440p with raytracing enabled in these games, I would say sure. If I had to guess, next year's cards probably will. As for the current RTX cards, while they may feature hardware raytracing support, it's not powerful enough to keep you from questioning whether it's even worth enabling in the handful of games that currently make use of it, and it's possible that the performance hit could be even larger in games targeting faster RT hardware a couple years from now.
In the coming months and next year, there will be many new games using ray tracing. This includes next-generation consoles, so it will be a useful feature to have when console games are ported over to the PC platform. Perhaps spending a bit extra on a much faster GPU with next-generation features (especially that NVidia GeForce RTX 2070 Super) would be viable. Heck, the old NVidia GeForce GTX 1080 Ti is still an excellent, future-proof GPU despite being 2 years old. The big difference now is that for $499 you can already get the performance of such an expensive GPU (which cost $699 at launch). As mentioned above, the main barrier is price, and $499 is rather expensive for most ordinary gamers, but for those who can afford the cost, it's pretty much worth it.
 
