[SOLVED] GPU graphic quality

Apr 6, 2020
Does the quality of the image vary from GPU to GPU? For example, does a Titan RTX/2080 Ti offer better texture quality than a 2080 Super, a 2070 Super or a 2060 Super at 1080p, 1440p and 4K?
Or is it just the performance that is affected, like the FPS?

I was told by a friend that texture quality depends more on the game itself and the resolution of the monitor, but I thought I'd ask here.
 
Solution
Texture quality is generally determined by the game engine the developers use, what they do with that engine, and the artistic design.*
Now that I've got a bit more time (and a proper keyboard), I'll explain my asterisk.

AMD and Nvidia each have some proprietary... features that can be included by game developers (or, more accurately, AMD/Nvidia shoulder the workload and cost of incorporating those features into a game for the developer as a marketing tactic) that can make some things look different on AMD or Nvidia cards respectively. Most of the time, because these features are proprietary, they only get included in a few games (as many as AMD/Nvidia want to pay for). However, some features "stick" and become open standards that get widely adopted.

The most common one these days is (light) ray tracing, which Nvidia has built special hardware into its RTX 20xx GPUs to handle (the new AMD GPUs coming late this year will also have this). Ray tracing generally makes lighting, shadows, and reflections look more realistic. Because AMD is committed to supporting ray tracing as well, the feature is on track to become widely adopted (eventually). The problem with it so far is the massive performance (FPS) hit that comes with enabling ray tracing in games. Tackling that problem will come with time. Nvidia's next-gen GPUs coming this fall will purportedly have twice the ray tracing capability of the current-gen RTX 20xx GPUs, which would seem to roughly match what AMD is about to release.
I would argue that, currently, ray tracing is being exaggerated to make it stand out visually. You'll notice that demos globally brighten scenes when they show off ray tracing and make all surfaces unrealistically shiny to enhance the effect.
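To make the idea concrete, here is a tiny sketch in plain Python of the core trick behind ray-traced shadows: for every shaded point you fire an extra ray toward the light and test it against the scene geometry, which is exactly why image quality goes up and FPS goes down. This is not any engine's or GPU's actual code; the Sphere class, point_is_shadowed and the toy scene are all made up for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    cx: float
    cy: float
    cz: float
    r: float

def ray_hits_sphere(ox, oy, oz, dx, dy, dz, s):
    """True if a ray from (ox, oy, oz) with direction (dx, dy, dz) hits sphere s."""
    # Standard quadratic for |origin + t*dir - center|^2 = r^2; keep hits with t > 0.
    lx, ly, lz = ox - s.cx, oy - s.cy, oz - s.cz
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (lx * dx + ly * dy + lz * dz)
    c = lx * lx + ly * ly + lz * lz - s.r * s.r
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t > 1e-4  # ignore hits behind (or exactly at) the ray origin

def point_is_shadowed(px, py, pz, light, scene):
    """Cast one 'shadow ray' from a surface point toward the light."""
    dx, dy, dz = light[0] - px, light[1] - py, light[2] - pz
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / length, dy / length, dz / length
    # If any object blocks the ray, the point is in shadow. A real renderer also
    # checks that the hit is closer than the light and repeats this for millions
    # of pixels every frame -- hence the FPS cost.
    return any(ray_hits_sphere(px, py, pz, dx, dy, dz, s) for s in scene)

if __name__ == "__main__":
    light = (0.0, 5.0, 0.0)               # point light overhead
    scene = [Sphere(0.0, 2.0, 0.0, 1.0)]  # one sphere between the light and the floor
    for x in range(-3, 4):                # sample a strip of the floor at y = 0
        state = "shadow" if point_is_shadowed(float(x), 0.0, 0.0, light, scene) else "lit"
        print(f"floor point x={x:+d}: {state}")
```

The dedicated RT hardware on RTX cards exists to accelerate exactly that ray-vs-geometry test, done against far more complex scenes than a single sphere.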

Less commonly found adaptations of features:
  • AMD had/has a feature called TressFX in the recent Tomb Raider games that makes Lara Croft's hair look very realistic.
  • Nvidia has used its PhysX feature to create particle clouds for explosions in games (a rough sketch of that kind of particle simulation follows after this list).
  • Etc., etc.
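For flavor, here is what a bare-bones particle cloud looks like in code. This is a generic sketch in plain Python and does not touch the actual PhysX API; spawn_explosion, step and the constants are invented for the example. The value of a feature like PhysX is doing this sort of thing for many thousands of particles, with collisions, accelerated on the GPU.

```python
import math
import random

GRAVITY = -9.81    # m/s^2, pulls particles down along y
DRAG = 0.98        # crude per-frame velocity damping ("air resistance")
DT = 1.0 / 60.0    # one frame at 60 FPS

def spawn_explosion(origin, count=1000, speed=12.0):
    """Create `count` particles flying outward from `origin` in random directions."""
    particles = []
    for _ in range(count):
        # Pick a random direction by rejection-sampling the unit ball, then scale it.
        while True:
            vx, vy, vz = (random.uniform(-1.0, 1.0) for _ in range(3))
            norm = math.sqrt(vx * vx + vy * vy + vz * vz)
            if 1e-6 < norm <= 1.0:
                break
        s = speed * random.uniform(0.5, 1.0) / norm
        particles.append({"pos": list(origin), "vel": [vx * s, vy * s, vz * s]})
    return particles

def step(particles, dt=DT):
    """Advance every particle by one frame (explicit Euler integration)."""
    for p in particles:
        p["vel"][1] += GRAVITY * dt                 # gravity only acts on y
        p["vel"] = [v * DRAG for v in p["vel"]]     # drag slows everything a little
        for i in range(3):
            p["pos"][i] += p["vel"][i] * dt

if __name__ == "__main__":
    cloud = spawn_explosion((0.0, 1.0, 0.0), count=5)
    for frame in range(3):
        step(cloud)
        print(f"frame {frame}:", [round(c, 2) for c in cloud[0]["pos"]])
```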
 
Apr 6, 2020
Oh my.
Thanks a lot for this incredibly insightful information. 👍