Texture quality is generally determined by the game engine the developers use, how they use it, and the game's artistic design.
Now that I've got a bit more time (and a proper keyboard), I'll explain my asterisk.
AMD and Nvidia each have some proprietary... features that game developers can include (or, more often, AMD/Nvidia shoulder the work and cost of incorporating those features into the game for the developer as a marketing tactic), and these can make some things look different on AMD or Nvidia cards respectively. Because these features are proprietary, they usually only get included in a handful of games (as many as AMD/Nvidia are willing to pay for). However, some features "stick" and become open standards that get widely adopted.
The most common one these days is ray tracing, which Nvidia added dedicated hardware for in their RTX 20xx GPUs (the new AMD GPUs late this year will have it too). Ray tracing generally makes lighting, shadows, and reflections look more realistic. Because AMD is committed to supporting ray tracing as well, the feature is on track to become widely adopted (eventually). The problem so far is the massive performance (FPS) hit that comes with enabling ray tracing in games; tackling that will take time. Nvidia's next-gen GPUs coming this fall will purportedly have twice the ray tracing capability of the current-gen RTX 20xx GPUs, which would roughly match what AMD is about to release.
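To give a sense of what the hardware is actually accelerating: at its core, ray tracing fires rays from the camera into the scene and tests them against geometry to find what they hit, then uses the hit points to compute lighting, shadows, and reflections. Here's a toy sketch of the most basic building block, a ray-sphere intersection test. This is just the textbook math (solving a quadratic for the hit distance), not anything specific to Nvidia's or AMD's implementation:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (a quadratic a*t^2 + b*t + c = 0 in the ray parameter t).
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest intersection
    return t if t > 0 else None  # None if the hit is behind the ray

# Ray from the origin looking down -z at a unit sphere centered at (0, 0, -5):
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # 4.0 -- the ray hits the near side of the sphere
```

A real renderer runs billions of tests like this (against triangles, organized in acceleration structures) every second, which is exactly why doing it without dedicated hardware tanks your FPS.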
I would argue that currently, ray tracing is being exaggerated to make it stand out visually. You'll notice that demos globally brighten scenes when they show ray tracing on, and make all surfaces unrealistically shiny to enhance the effect.
Less commonly found adaptations of features:
- AMD had/has a feature called TressFX in the recent Tomb Raider games that makes Lara Croft's hair look very realistic.
- Nvidia has used their PhysX feature to create particle clouds for explosions in games.
- Etc etc.