There is just
so much wrong with this post. I'll have to break it down bit by bit.
I'm not optimistic. AMD hasn't been competitive for a few years, ever since Nvidia dropped Pascal on us. Turing was 'meh', but Nvidia got away with it because AMD wasn't there to compete with them tier for tier, and dollar for dollar.
100% TRUE - Because of AMD's near-bankruptcy, ATi had ZERO money for creating a new architecture and was forced to use GCN for far longer than it should have. Fiji should have been the last iteration of GCN because it was the last time that GCN was really competitive.
The rumors I have been seeing online all seem to point to Big Navi being equal to, or faster than Ampere, but at a cost of greater power consumption.
100% FALSE: It has been shown that Big Navi's power draw is about the same as Ampere's. I don't know what you've been reading, but:
"So we see that the actual power consumption of the Navi21 XT should be about the same level as that of the GeForce RTX 3080! "
- Igor's Lab
Link to Igor's Lab article
Igor's Lab has been the go-to for hardware leaks lately. Where have you been getting your information?
EDIT: As we all saw, AMD actually claims lower power numbers than nVidia. They've been right about everything else so far, so I see no reason to disbelieve them now.
If power consumption isn't an issue for you, then it's awesome.
If power consumption IS an issue for you, then you won't be interested in Ampere either, especially since (as shown) they both have the same power draw.
AMD is also late to the raytracing party
No, nVidia was just early. Ray tracing is not a major feature in games but a frill at best. Most games do not implement ray tracing. Having ray tracing performance that matches the RTX 2080 Ti would be more than enough.
, so who knows if their implementation of it is compatible with Nvidia's?
Who cares about ray tracing, a technology that is so in its infancy that it's not even truly relevant yet? I can tell you right now that it won't be compatible with nVidia's tech, because nVidia makes all of their tech proprietary to soak people more while ATi doesn't. The tech that ATi WILL be using is DX12 (which AMD helped to create with Mantle), a technology that nVidia had better get on board with, because DX12 is the standard; RTX is not and never was.
Considering that ALL console games will be using AMD and ATi hardware, whatever tech nVidia is using will be completely irrelevant. Do you really think that Microsoft or Sony will give a rat's posterior how a game runs on a GeForce card when their consoles are using Radeon GPUs? Here's a hint: they don't.
I think that you severely overestimate nVidia's power over a market that is actually owned by Microsoft (because Windows and DirectX).
Remember that nVidia adapts to support DirectX, DirectX does NOT adapt to support nVidia.
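To be concrete about what "the standard" means here: a DX12 game asks Direct3D itself whether ray tracing (DXR) is available; it never asks "is this an RTX card?". Here's a rough sketch of that check. The API calls (D3D12CreateDevice, CheckFeatureSupport) are real DirectX 12, but the little program around them is just my own illustration and assumes a Windows 10 SDK with DXR support.

```cpp
// Minimal sketch: asking Direct3D 12 whether the installed GPU supports
// DirectX Raytracing (DXR). Nothing here is vendor-specific; a Radeon and
// a GeForce answer the same question through the same API.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter; any DX12-capable GPU will do.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No DirectX 12 capable GPU found.");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier reported by the driver.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &options5, sizeof(options5));

    if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("DXR ray tracing is supported.");
    else
        std::puts("DXR ray tracing is not supported.");

    return 0;
}
```

That's the whole point: the consoles and every DX12 title talk to that API, not to a brand.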
Nvidia also has technologies unique to them that AMD won't be able to use like GDDR6X memory
I missed your post where you were crying that the RTX 3070 also doesn't use GDDR6X. Oh, that's right, you didn't make one. Why is that?
and DLSS. DLSS is particularly impressive because it allows a less expensive card to run at higher resolutions than it would normally be capable of in native mode.
I'm quite certain that everyone here is quite aware of what DLSS 2.x is capable of. We are, after all, tech enthusiasts. However, these are uber high-end cards being unveiled today. Something like DLSS would be more useful in the mid-to-lower end for cards that can't do higher resolutions like 1440p and 2160p natively to begin with. If a card performs well at 4K natively, why bother with upsampling? Upsampling is irrelevant at this level.
You can play in 4K using DLSS rendering at 1080p and the quality is impressive, judging by the stills and videos that are being shared. There are losses, yes, but they are not so noticeable that they would completely ruin the 4K experience. So you can play in 4K using a cheaper card than what you would require on the AMD side, making Nvidia an even better value proposition.
Paul at RedGamingTech has reported that ATi will implement a feature that is more or less the same as DLSS (under a different name). From what Paul reports, the picture quality will be somewhat inferior (he doesn't know to what degree), but the resulting frame rate will be higher (again, he doesn't know to what degree). He also said that, unlike DLSS, which only works in games that support it, ATi's technology should work on ALL games. I'm guessing that, again, ATi's tech will probably use a DirectX implementation for its upsampling, and ALL games support DirectX.
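And for anyone wondering why upsampling matters so much more on weaker cards in the first place, the arithmetic is simple: rendering internally at 1080p means shading a quarter of the pixels of native 2160p. A trivial back-of-the-envelope snippet (mine, just to show the numbers):

```cpp
// Back-of-the-envelope: pixels shaded per frame at native 4K versus at the
// 1080p internal resolution that DLSS-style upscaling typically renders from.
#include <cstdio>

int main()
{
    const long long native4k = 3840LL * 2160LL;  // 8,294,400 pixels
    const long long internal = 1920LL * 1080LL;  // 2,073,600 pixels

    std::printf("Native 2160p:   %lld pixels per frame\n", native4k);
    std::printf("1080p internal: %lld pixels per frame\n", internal);
    std::printf("Upscaling shades %.1fx fewer pixels\n",
                static_cast<double>(native4k) / static_cast<double>(internal));
    return 0;
}
```

That 4x saving is exactly why it's a bigger deal for a mid-range card than for the monsters being unveiled today.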
You know, you're really projecting a bit of a frantic state of mind over this. You're also throwing out misinformation, wrong impressions, and a seeming belief that nVidia is king and they rule all. You're also talking about ray tracing like Jensen Huang would, parroting what nVidia's marketing says while completely ignoring the reality of the situation.
You're one of three things: an nVidia employee, an nVidia fanboy, or a noob who really doesn't know better. Regardless, all of your "concerns" have been rendered moot now. You can be optimistic again.