News: Nvidia Announces More RTX and DLSS Games at Gamescom 2021

"However, Nvidia is riding the momentum of being the first adopter, leveraging its relations with game developers and game engines to bring DLSS support to as many titles as it can before its competitors flood the market with their own solutions".

Ah, you mean strong-arming them and dictating their "editorial direction" when choosing which technologies to use, or else no big fat marketing paycheck? Of course. Very nVidia-like.

Looking forward to an ARM future with them in charge... Not.

Regards.
 

Blacksad999

Reputable
Jun 28, 2020
83% of the market owns an Nvidia GPU, so it would make sense to cover something that's relevant to... 83% of people. It's not some grand conspiracy. XD
 

VforV

Respectable
BANNED
Oct 9, 2019
83% of the market owns an Nvidia GPU, so it would make sense to cover something that's relevant to... 83% of people. It's not some grand conspiracy. XD
Rofl, sure, 83%, but out of that 83%, do you know how many can actually use RTX and DLSS, how many actually have Turing or Ampere cards to run them?

About 20%! That's how many; they are still the minority.

The rest of the nvidia users, which are the majority, are on older generations and are left in the dust...

So, thanks AMD for FSR, and soon Intel for XeSS, which work on older GPUs. nvidia can shove their RTX and DLSS where the sun don't shine.
 
Rofl, sure, 83%, but out of that 83%, do you know how many can actually use RTX and DLSS, how many actually have Turing or Ampere cards to run them?

Correct.

So, thanks AMD for FSR, and soon Intel for XeSS, which work on older GPUs. nvidia can shove their RTX and DLSS where the sun don't shine.

DLSS 2.0 is already here and has demonstrated very good results. I can't say the same about FSR yet. No Intel Arc hardware has been given to reviewers, so XeSS is still floating in the clouds. Still, it is good to know that competition for DLSS 2.0 is coming. And I hope that with Intel's arrival on the serious GPU stage, GPU prices will at last slide down to a bearable level.

I'm actually curious which of these three elephants will be first to release working asset loading from NVMe drives straight into GPU memory. I mean an RTX IO + DirectStorage solution, which Nvidia has been cooking for two years already.
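For the curious, here is roughly what that request flow looks like with Microsoft's DirectStorage API (the layer RTX IO is supposed to plug into). This is a minimal sketch pieced together from the early developer-preview material, not production code: the API may still change before release, error handling is skipped, and the D3D12 device and destination buffer are assumed to already exist.

```cpp
#include <dstorage.h>   // DirectStorage developer-preview SDK
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: enqueue a read from an NVMe-backed file straight into a GPU buffer,
// with no CPU-side staging copy in between.
void LoadAssetViaDirectStorage(ID3D12Device* device, ID3D12Resource* gpuBuffer,
                               uint32_t sizeBytes)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // A queue that sources requests from files and targets this D3D12 device.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/level0.bin", IID_PPV_ARGS(&file));

    // One request: read sizeBytes from the file into the GPU buffer.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Size        = sizeBytes;
    request.UncompressedSize        = sizeBytes;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Size     = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->Submit();  // completion is normally tracked with an ID3D12Fence, omitted here
}
```

The part Nvidia keeps teasing on top of this is GPU-side decompression, so even the decompress step skips the CPU.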
 

VforV

Respectable
BANNED
Oct 9, 2019
Correct.



DLSS 2.0 is already here and has demonstrated very good results. I can't say the same about FSR yet. No Intel Arc hardware has been given to reviewers, so XeSS is still floating in the clouds. Still, it is good to know that competition for DLSS 2.0 is coming. And I hope that with Intel's arrival on the serious GPU stage, GPU prices will at last slide down to a bearable level.

I'm actually curious which of these three elephants will be first to release working asset loading from NVMe drives straight into GPU memory. I mean an RTX IO + DirectStorage solution, which Nvidia has been cooking for two years already.
You mean an elephant, a rhino and a zebra, right? :p

Jokes aside, FSR has already demonstrated good results too, if you count among the good things:

  1. Better than DLSS 1.0 was at launch, much better.
  2. Implemented in more games than DLSS 1.0, many more (DLSS 1.0 launched with zero games and for two years had fewer than FSR already has).
  3. Slightly lower image quality than DLSS 2.0, but still good enough at 1440p and 4K that the difference hardly matters.
  4. A sharper image than DLSS, better behavior in motion, and no extra artifacts introduced on its side.
  5. Lower overhead than DLSS, which means less added input lag over native rendering than DLSS. Great for latency-sensitive games.
  6. Available on all GPUs from the past 7 (10?) years, not only on two expensive RTX generations like DLSS.
  7. Open source and much, much easier, faster, and cheaper to implement (see the sketch below).
Hey, would you look at all those FSR pluses.
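On point 7, the sketch I promised: the reason FSR is so cheap to integrate is that it's just two fullscreen passes over the finished frame, with no motion vectors, no history buffers, and no trained network. Below is a conceptual outline of the integration. The function names (`RenderScene`, `DispatchFullscreenPass`, `DrawUI`) are hypothetical engine hooks, not AMD's API; the actual shader code ships in AMD's open-source ffx_fsr1.h header (the EASU and RCAS passes).

```cpp
struct Texture {};  // placeholder for an engine render-target handle

// Hypothetical engine hooks (assumptions for illustration, not a real API):
Texture RenderScene(int width, int height);
Texture DispatchFullscreenPass(const char* shader, Texture input,
                               int outWidth, int outHeight);
void    DrawUI(Texture& target);

// Conceptual FSR 1.0 frame flow.
Texture RenderFrameWithFSR(int displayW, int displayH,
                           float scale /* e.g. ~0.77 for "Ultra Quality" */)
{
    // 1. Render the 3D scene at reduced resolution -- this is where the
    //    performance win comes from (fewer pixels get shaded).
    int renderW = static_cast<int>(displayW * scale);
    int renderH = static_cast<int>(displayH * scale);
    Texture lowRes = RenderScene(renderW, renderH);

    // 2. EASU: edge-adaptive spatial upscale to display resolution. It reads
    //    only the current frame, so no motion vectors or history buffers are
    //    needed (unlike DLSS 2.0) -- hence the low overhead in point 5.
    Texture upscaled = DispatchFullscreenPass("FsrEASU", lowRes, displayW, displayH);

    // 3. RCAS: contrast-adaptive sharpening to bring back detail.
    Texture sharpened = DispatchFullscreenPass("FsrRCAS", upscaled, displayW, displayH);

    // 4. UI/HUD drawn after upscaling, at native resolution, so text stays crisp.
    DrawUI(sharpened);
    return sharpened;
}
```

For an engine that already has a post-processing chain, that is very little work, which is presumably why adoption has been so fast.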

Between FSR and XeSS (backed by Intel money), I have no doubt that DLSS will become less and less relevant in the next 1 to 2 years, unless nvidia goes open source too or has some new magic ace up its sleeve...
 
You mean an elephant, a rhino and a zebra, right? :p

Of course :D

... I have no doubt that DLSS will become less and less relevant in the next 1 to 2 years ...

AI upscaling like Nvidia's DLSS will only become more and more popular, for performance reasons, in complex scenes at high resolutions. Especially once AAA titles embrace it for real, and later, when 8K screens in the consumer console market for the rich and dumb go mainstream.

Personally, I have no fixation on GPU brand. I want DLSS 2.0-style upscaling, ray tracing, and possibly an RTX IO-compatible asset-loading boost, together with quiet cooling and working drivers, regardless of which of these three releases that first. I'll keep waiting until the price + VAT (I live in Europe) for a card like the RTX 3080, or an equivalent AMD/Intel alternative, slides below 900€. No reason to overspend IMHO.

One thing that does somewhat deter me from AMD is hardware video encoding quality. I occasionally need that feature, and the 5700 XT had bad artifacts in hardware-encoded videos.
 
AI upscaling like Nvidia's DLSS will only become more and more popular, for performance reasons, in complex scenes at high resolutions. Especially once AAA titles embrace it for real, and later, when 8K screens in the consumer console market for the rich and dumb go mainstream.

Personally, I have no fixation on GPU brand. I want DLSS 2.0-style upscaling, ray tracing, and possibly an RTX IO-compatible asset-loading boost, together with quiet cooling and working drivers, regardless of which of these three releases that first. I'll keep waiting until the price + VAT (I live in Europe) for a card like the RTX 3080, or an equivalent AMD/Intel alternative, slides below 900€. No reason to overspend IMHO.

One thing that does somewhat deter me from AMD is hardware video encoding quality. I occasionally need that feature, and the 5700 XT had bad artifacts in hardware-encoded videos.
There is always a point of "stupidity" where you're dedicating more hardware to faking upscaling than to actually producing better images. This is the "stupidity" I see in nVidia using Tensor cores in consumer parts instead of using general-purpose computation for their solution. In particular, and this is something I'm sure some people have asked but no one knows for sure: what if that Tensor core silicon were just used as regular stream-processor space? Raster would certainly run even faster than now, and RT maybe wouldn't be as fast, but it would still run plenty fast, no?
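To make that question concrete: a Tensor core is basically a fixed-function unit that computes a small fused matrix multiply-accumulate, commonly described for Volta/Turing as D = A*B + C on 4x4 FP16 tiles, in a single hardware operation. Here's a plain scalar reference of that same math, purely for illustration; the hardware isn't programmed this way (CUDA exposes Tensor cores through larger 16x16 WMMA fragments), but it shows the work one Tensor core op replaces on general-purpose ALUs:

```cpp
#include <cstddef>

// Scalar reference for the operation a Tensor core performs in one go:
// D = A * B + C on a 4x4 tile. On regular shader ALUs this is 64 separate
// multiply-adds issued one at a time -- that throughput gap is the whole
// argument for spending die area on matrix units.
constexpr std::size_t N = 4;

void mma4x4(const float A[N][N], const float B[N][N],
            const float C[N][N], float D[N][N])
{
    for (std::size_t i = 0; i < N; ++i)
        for (std::size_t j = 0; j < N; ++j) {
            float acc = C[i][j];              // accumulator input
            for (std::size_t k = 0; k < N; ++k)
                acc += A[i][k] * B[k][j];     // one FMA per k
            D[i][j] = acc;
        }
}
```

The flip side is that such a unit can only do matrix math; repurposed as generic stream-processor space it would help every shader, which is exactly the trade-off I'm questioning.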

History tells me that Intel will try to push the same division as when MMX was introduced. AMD and nVidia should both be worried, but nVidia more than AMD. If they double down on Tensor cores for consumer graphics, there will be an even bigger gap they'll have to account for. It's definitely an inflection point for them, but I'm sure they'll try to shoehorn Tensor cores in until they just can't. Like PhysX.

Regards.
 

spongiemaster

Admirable
Dec 12, 2019
Rofl, sure, 83%, but out of that 83%, do you know how many can actually use RTX and DLSS, how many actually have Turing or Ampere cards to run them?

About 20%! That's how many; they are still the minority.
The main groups of people who will benefit from upsampling are gamers using ray tracing and 4K gamers. Nvidia likely controls more than 90% of that market.
 
There is always a point of "stupidity" where you're dedicating more hardware to faking upscaling than to actually producing better images. This is the "stupidity" I see in nVidia using Tensor cores in consumer parts instead of using general-purpose computation for their solution. In particular, and this is something I'm sure some people have asked but no one knows for sure: what if that Tensor core silicon were just used as regular stream-processor space? Raster would certainly run even faster than now, and RT maybe wouldn't be as fast, but it would still run plenty fast, no?

History tells me that Intel will try to push the same division as when MMX was introduced. AMD and nVidia should both be worried, but nVidia more than AMD. If they double down on Tensor cores for consumer graphics, there will be an even bigger gap they'll have to account for. It's definitely an inflection point for them, but I'm sure they'll try to shoehorn Tensor cores in until they just can't. Like PhysX.

Intel also uses dedicated matrix-multiplication cores in its upcoming Arc GPU architecture. Only Intel's upscaler is called XeSS, and the cores themselves XMX :)

Intel is also using dedicated Xe-cores in its upcoming GPUs to power its XeSS technology, with dedicated Xe Matrix eXtensions (XMX) matrix engines inside to offer hardware-accelerated AI processing.
Source: https://www.theverge.com/2021/8/19/...ss-super-sampling-ai-architecture-day-preview

It seems the battle in gaming now revolves around 4K performance and, next, screen streaming support. With my 1440p monitor, which I expect to use for at least the next 5 years, I'm not complaining.
 
Intel also uses dedicated matrix-multiplication cores in its upcoming Arc GPU architecture. Only Intel's upscaler is called XeSS, and the cores themselves XMX :)
Yes, that was my point. Intel will also introduce specialized AI hardware on graphics cards to accelerate their upsampling stuff. The difference lies in AMD having cross-licensing with Intel, while nVidia burned that bridge some time ago. That is why I suspect nVidia is on the losing side here.

AMD is in a really strange position for this "upscaling" battle of sorts, but not in a bad place IMO. I'm sure they're warming up to Intel's side, as they've already had a nod from Intel with FSR.

Regards.
 

VforV

Respectable
BANNED
Oct 9, 2019
The main groups of people who will benefit from upsampling are gamers using ray tracing and 4K gamers. Nvidia likely controls more than 90% of that market.
Sure, but like I said, all the people who are the target gamers for 4K and RT are still the minority...

Also, if Intel keeps the promises it made about XeSS in an extended interview: they said they are seriously working on making XeSS work below 1080p (such as 720p and 576/480p), so that the upscale to 1080p looks as good as 1080p looks upscaled to 4K. And they can do that because of AI, which FSR lacks.

They said this is very important for them because the majority of gamers still play on laptops and low-to-mid-tier GPUs, and those gamers need XeSS to work well at those lower resolutions.

So if this goes well, everyone will benefit, not just 4K elitists, who again are the minority.
 
If Intel and AMD can benefit from developing FSR on one side and XeSS on the other together, and that brings us better and cheaper hardware, why not.
More than that: AMD knew FSR was just something they had to ship due to market pressure, and while they did a great job with it, I don't think it's something they wanted to dedicate exclusive hardware to. That's why they made it fully open and built it on permissively licensed algorithms: so it's dirt cheap to implement and almost a no-brainer. They'll probably keep developing it, but I don't think it will ever be a 1:1 competitor to either DLSS or XeSS. If they throw away its strongest points, which are the easy implementation and near-zero overhead, it would become a very expensive uphill battle. Also, dedicating too much hardware to it would be stupid, as I already mentioned.

Regards.
 

spongiemaster

Admirable
Dec 12, 2019
They said this is very important for them because the majority of gamers still play on laptops and low-to-mid-tier GPUs, and those gamers need XeSS to work well at those lower resolutions.

So if this goes well, everyone will benefit, not just 4K elitists, who again are the minority.
That's about five-year-old data. You can usually spot laptop gamers on Steam by the 1366x768 screen resolution. For a long time that was the most popular resolution, but that was a long time ago: 1366x768 has tumbled to single-digit market share on Steam, and 1440p has actually surpassed it. Looking at GPU market share, there is no iGPU with a share higher than 0.90%. The RTX 3080 is at 0.85%, above all but one Intel iGPU, while the 3070 is at 1.53%. It is a fallacy to claim the majority of gamers are on low-end hardware. Pretty much every GPU in the top 20, except maybe the 1050, is perfectly capable of gaming at 1080p and doesn't need upsampling. If you're trying to game on an iGPU: 1) you're not a serious gamer, and 2) you're still going to struggle at 1080p even with these upsampling techniques.

Again, there is no card without hardware-accelerated ray tracing that will achieve playable ray-traced framerates, even with upsampling. Not one. It makes no difference that FSR can be used on an RX 480; you're still not going to be able to use ray tracing. DLSS wasn't designed from the beginning as a universal performance booster; it was designed to be used in tandem with ray tracing to reach playable framerates. The reality is that, despite what the marketing has you believing, only a small part of the market truly benefits from these techniques.
 
Again, there is no card without hardware-accelerated ray tracing that will achieve playable ray-traced framerates, even with upsampling. Not one. It makes no difference that FSR can be used on an RX 480; you're still not going to be able to use ray tracing. DLSS wasn't designed from the beginning as a universal performance booster; it was designed to be used in tandem with ray tracing to reach playable framerates. The reality is that, despite what the marketing has you believing, only a small part of the market truly benefits from these techniques.

I can only agree with this. DLSS has now changed from a crutch for ray tracing into a whole separate thing for performance gains at higher resolutions on weaker hardware. I can already see how Intel may benefit from that, when half of budget and business laptops can also be turned into casual gaming machines capable of running things like Far Cry 5 and Borderlands 3 at acceptable FPS for a not-so-big additional price. Like in the old software-rendering days of the nineties, when, if the CPU and RAM were enough, game performance was adequate everywhere. People will like that.
 

VforV

Respectable
BANNED
Oct 9, 2019
That's about five-year-old data. You can usually spot laptop gamers on Steam by the 1366x768 screen resolution. For a long time that was the most popular resolution, but that was a long time ago: 1366x768 has tumbled to single-digit market share on Steam, and 1440p has actually surpassed it. Looking at GPU market share, there is no iGPU with a share higher than 0.90%. The RTX 3080 is at 0.85%, above all but one Intel iGPU, while the 3070 is at 1.53%. It is a fallacy to claim the majority of gamers are on low-end hardware. Pretty much every GPU in the top 20, except maybe the 1050, is perfectly capable of gaming at 1080p and doesn't need upsampling. If you're trying to game on an iGPU: 1) you're not a serious gamer, and 2) you're still going to struggle at 1080p even with these upsampling techniques.

Again, there is no card without hardware-accelerated ray tracing that will achieve playable ray-traced framerates, even with upsampling. Not one. It makes no difference that FSR can be used on an RX 480; you're still not going to be able to use ray tracing. DLSS wasn't designed from the beginning as a universal performance booster; it was designed to be used in tandem with ray tracing to reach playable framerates. The reality is that, despite what the marketing has you believing, only a small part of the market truly benefits from these techniques.
Ok, maybe the 720p numbers are lower (apparently less than 10%, based on the Steam survey), but how about 1080p?

If you disregard 1080p and the usefulness of FSR and XeSS, which will work (better than at 720p) on many GPUs and which, depending on the game and the user, offer an acceptable loss in quality for improved performance, then you are just ignorant.

There are still over 50% of all gamers playing at 1080p, and people somehow think that 1440p, or, even crazier, 4K, is the majority... NO, they are not. Maybe in a few years, not now.
Here, educate yourself > https://store.steampowered.com/hwsurvey?platform=pc
And here is the analysis >
  • Steam says that 67.2% of users are running 1920x1080 displays (July 2021)
  • Total of RTX GPUs = 16.45% = minority
Also, this is complete BS:
"It makes no difference that FSR can be used on an RX 480"
I have a GTX 1060, and FSR is useful in games where I cannot get 60 fps, and there are A LOT of those.
I also have a GTX 1080, and by your reasoning I would not need FSR with that one either:
there is no card without hardware-accelerated ray tracing that will achieve playable ray-traced framerates, even with upsampling.
What a crock of **** you just said. Seriously, how the heck does your reasoning work?

I absolutely need FSR on the GTX 1080 to play at High or Ultra settings in lots of games, because even that card can't handle many demanding games at 1080p, and no, I don't want to use Medium settings when I can use FSR and play at High or Ultra (without RT, of course)...

Don't try to BS me about the usefulness of FSR: between my 1060, 1080, and 6700 XT, ALL OF THEM benefit from FSR, in different ways maybe, but they all do.

DLSS is a closed box, and unless this stubborn dinosaur changes massively, becomes open and pro-consumer, and stops being a premium nvidia tax restricted to the few, it will lose in the coming years vs FSR and XeSS. At least one of those two will beat it, if not both.

Please stop spouting nonsense and drop this elitist mindset, because the elites are the minority. Without the majority, which is regular low-to-mid-tier gamers, there would be no worldwide gaming, no financial success, no billions-of-dollars gaming industry, and no 3090s.

You cannot have that with 10k people, but you can with hundreds of millions. Those millions are regular gamers, not elitist ignoramuses who think they are on top of the world but actually have no clue what world they live in.
 
I absolutely need FSR on the GTX 1080 to play at High or Ultra settings in lots of games, because even that card can't handle many demanding games at 1080p, and no, I don't want to use Medium settings when I can use FSR and play at High or Ultra (without RT, of course)...

Please stop spouting nonsense and drop this elitist mindset, because the elites are the minority. Without the majority, which is regular low-to-mid-tier gamers, there would be no worldwide gaming, no financial success, no billions-of-dollars gaming industry, and no 3090s.


Yes, those playing at 1080p are no elitists. But at the same time, the majority of them also don't really care about lowering game settings down to Medium (or even lower) if that means getting the desired frame rate. Not wanting to use Medium settings is exactly how an elitist thinks and feels. I dare to bet that those at 1080p or lower resolutions, or with weaker hardware, are not going to care much whether they lower the game settings or use FSR. The most important thing is that they are able to play the game they want to play.

DLSS is a closed box, and unless this stubborn dinosaur changes massively, becomes open and pro-consumer, and stops being a premium nvidia tax restricted to the few, it will lose in the coming years vs FSR and XeSS. At least one of those two will beat it, if not both.

Looking at how things have gone over the past decades, it does not always work that way, especially if we are talking about software. FSR will probably be the first one to go if AMD does not upgrade that tech beyond what they have today. UE5 already has TSR; those using UE5 will definitely use TSR instead of FSR. Crytek has a tendency to develop its own solutions for CryEngine: an in-house physics engine, its own hardware-agnostic RT solution, SVOGI to rival UE's GI implementation, and so on. And since they consider UE their long-time rival, they will most likely develop their own version of TSR as well, tailored to CryEngine's strengths.

XeSS vs DLSS will be more complicated. Many thought OpenCL was going to kill CUDA within two years, and yet here we are, more than a decade later: CUDA is still going strong, and many of those that supported OpenCL in the past have decided to go with their own APIs, Apple with Metal and Intel with oneAPI. And ultimately there are the game developers. Believe it or not, most often they do not want to get tangled up in IHV rivalry. That's why, no matter how open or "free" an IHV's tech is, we do not see it get used unless there is some sort of official partnership or devrel going on.
 