News AMD's RDNA 3 GPUs Gain Vulkan 1.3 Compliance

If the top card AMD releases is even within 10-20% of the 4090, it's going to have an MSRP of $1,200 and maybe more. The lower-tier card is probably going to be $1,000. For the last six years, cost seems to have been increasing at roughly twice the rate of the percentage performance increase.

If TSMC says the next process is going to have 45% more performance for the same power, Nvidia sees that and says, "So for 20% more power we get 52% more performance," and proceeds to make a product like the 4090, 3090, or 2080 Ti.
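To make the arithmetic in that claim explicit, here's a quick Python sketch; the 45%, 20%, and 52% figures are just the ones quoted above, not official TSMC or Nvidia numbers.

```python
# Rough arithmetic behind the "45% at iso-power -> 52% at +20% power" claim above.
# The three input figures are the ones quoted in the post, not measured data.
iso_power_gain = 1.45      # +45% performance at the same power (process claim)
claimed_total_gain = 1.52  # +52% performance after also raising power
extra_power = 1.20         # +20% more power

# Extra performance the power bump has to contribute on its own:
extra_perf_from_power = claimed_total_gain / iso_power_gain
print(f"Gain from the +20% power alone: {extra_perf_from_power - 1:.1%}")   # ~4.8%

# Perf-per-watt relative to staying at iso-power (shows the diminishing returns):
relative_perf_per_watt = claimed_total_gain / (iso_power_gain * extra_power)
print(f"Relative perf/W after the bump: {relative_perf_per_watt:.2f}")      # ~0.87
```

In other words, under those quoted numbers the last ~5% of performance is bought with 20% more power, which is exactly the kind of trade-off that leads to 450 W flagship cards.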
 
AMD is probably asking rocket scientists and brain surgeons how to price this new batch of GPUs, because if they've been paying attention to nVidia's blunders (to call them something), their pricing will determine 50% of the reactions online, if not more.

Whether AMD wants to admit it or not, they do not have the mindshare nVidia has, so they have to be cautious. Even more so now, since they're definitely in the spotlight while a lot of people on the fence are waiting for AMD to woo them into the fold. This type of opportunity does not come often, and they had better execute it in a humble yet aggressive way.

Again, this needs to be an HD4870 moment.

Regards.
 
Hopefully this means AMD will comfortably beat Nvidia at lower wattage in a generation or two. Unless Nvidia manages to sell a 1 kW GPU by then...
 
It can perform as close to the 4090 as it wants; no DLSS equivalent = no point in buying it.

AMD can make some good and cheap cards and chips, but they have no idea when it comes to A.I.

If I'm not mistaken, they don't even have tensor cores; this is going to be their first generation with them. Nvidia has been feet-deep in A.I. for uncountably many years.

FSR ≠ DLSS!
 
Even so, that is a feature and not a requirement. We will not know the viability of AMD's product until they release it and we see the numbers. My point on pricing still stands.
 

Yes, uncountably.

Just like the number of digits I have on one hand. I tried to use the other hand to count them but it turns out that hand was no help as it is uncountable as well.


As far as "no point buying it without a DLSS equivalent" goes: you don't have to like FSR or the way it does its business, but it is in fact functional, even if it's not as good as DLSS. G-Sync is technically superior to FreeSync as well, but that's been far from a deal breaker. It's all going to come back to the performance and the cost.
 
First of all, I have no idea what you're trying to say with the countability analogy.

Second of all, again, you can't compare DLSS with FSR because they are completely different.
One is trained on supercomputers at 16K and runs on tensor cores; the other doesn't use any tensor cores and merely relies on clever algorithms.

Lastly, G-Sync by itself has been dead for years; literally everyone has been using the (unlocked) drivers for ages, with which you can use G-Sync on your FreeSync monitor.

There's no point in buying a non-DLSS card, i.e. an AMD GPU, unless you're getting a REALLY amazing deal on a card that performs like a higher-tier card (for example, getting a card that performs like a 3090 Ti / 3090 / 3080 Ti for a massive discount).

Otherwise, for a small price or performance difference, no. You're spending hundreds of dollars; you may as well get as much for your cash while you're at it.

It's all about price and performance. DLSS with a 3080 would be sweet, but not at the current price range.
Example: the 6800 XT performs similarly to the 3080 for 700€ (sometimes the XT outperforms the 3080, other times the 3080 outperforms the XT; on average they are identical fps-wise).

The 3080 costs 800€ at least.

The 6800 XT has only recently dropped a bit in price; before that (at least on the price charts for my EU country), it basically cost almost 800€, which is where my point comes into play: is the price difference really worth not having DLSS at that point? No, because DLSS can truly do wonders fps-wise for many years onwards (rough numbers in the sketch at the end of this post).
Not just that, you also get those tensor and RT cores.

People forget it's not just DLSS and cores. It's also the tech behind them: SDKs, pieces of code, etc.
Blender renders with OptiX (RT, denoising) MUCH faster than with CUDA. That's something this fancy "ray tracing" and DLSS movement has also solidified.
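For anyone curious how that OptiX switch looks in practice, here's a minimal sketch using Blender's Python API (bpy); the property names are for recent Blender 3.x builds and may differ in older versions, so treat it as an illustrative assumption rather than a guaranteed recipe.

```python
# Sketch: point Cycles at OptiX instead of CUDA from Blender's Python console.
# Assumes a recent Blender 3.x build and an RTX-class GPU.
import bpy

cycles_prefs = bpy.context.preferences.addons["cycles"].preferences
cycles_prefs.compute_device_type = "OPTIX"   # "CUDA" is the slower alternative discussed above
cycles_prefs.get_devices()                   # refresh the detected device list

for device in cycles_prefs.devices:
    device.use = True                        # enable every detected device

bpy.context.scene.cycles.device = "GPU"      # make the current scene render on the GPU
print([d.name for d in cycles_prefs.devices if d.use])
```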

Again, it depends on you, your situation, and your location.
I can totally see people not caring about DLSS or AAA titles and just getting the 6800 for 700€, i.e. much less than the 3080. But then again: spend wisely, wait for sales; it's your money after all.
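To put a number on the 6800 XT vs. 3080 comparison above, here's a tiny Python sketch; the average-fps value is a hypothetical placeholder (the post only says the two cards are roughly identical fps-wise), and the prices are the ones quoted above.

```python
# Euros-per-frame sketch for the comparison above.
# avg_fps is a hypothetical placeholder; prices are the figures quoted in the post.
avg_fps = 100
prices_eur = {"RX 6800 XT": 700, "RTX 3080": 800}

for card, price in prices_eur.items():
    print(f"{card}: {price / avg_fps:.2f} EUR per average fps")

premium = prices_eur["RTX 3080"] / prices_eur["RX 6800 XT"] - 1
print(f"3080 premium for identical raster performance: {premium:.0%}")  # ~14%
```

So at those prices the whole decision collapses to whether DLSS (plus RT/OptiX and the rest of the software stack) is worth a roughly 14% premium to you.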
 
At the end of the day, for around 90% of gamers, what's important before anything else is whether they have enough cash to buy a card at all.

If they do have the cash, then it's time to look at the prices for both Nvidia and AMD, see what you can afford in that range, and only then check the add-in technologies like DLSS, FSR, ray tracing, DP 2.1, Vulkan 1.3, FreeSync, G-Sync, etc.

Some of them, a tiny group who cares to learn about a product before buying it, may be able to stretch a little more to get their favorite brand, or the technology they want/need. But for the bulk of gamers who have no idea whether Nvidia is this or that, or whether AMD has X or Y, it's all about money, what they can afford, and what's available in their part of the world.
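As a toy illustration of that "budget first, extras last" order, here's a short Python sketch; every card, price, and performance score in it is a made-up placeholder, not benchmark data.

```python
# Toy sketch of the buying order described above: budget first, raw performance
# second, brand-specific extras (DLSS, FSR, G-Sync, ...) last.
# All entries below are hypothetical placeholders.
cards = [
    {"name": "Card A", "price": 650, "perf": 100, "extras": ["FSR", "FreeSync"]},
    {"name": "Card B", "price": 820, "perf": 110, "extras": ["DLSS", "G-Sync"]},
    {"name": "Card C", "price": 500, "perf": 80,  "extras": ["FSR", "FreeSync"]},
]

budget = 700  # step 1: what can I actually afford?

affordable = [c for c in cards if c["price"] <= budget]   # step 2: filter by price
affordable.sort(key=lambda c: c["perf"], reverse=True)    # step 3: rank by performance
for c in affordable:                                      # step 4: only now weigh the extras
    extras = ", ".join(c["extras"])
    print(f'{c["name"]}: {c["perf"]} perf for {c["price"]} EUR ({extras})')
```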
 