GPU Performance Hierarchy 2019: Video Cards Ranked


JarredWaltonGPU

Senior GPU Editor
Editor
Again, thanks for the informative reply!

I have only used Nvidia GPUs in desktops; not that I ever had a reason not to get AMD ... it's just that in almost 20 years I have only had two GPUs of my own, I guess.

We had Windows 95/98/XP machines as a family ... then in 2005 I got my own newer XP machine as a present ... that one had a GeForce 8400 GS, which I later upgraded to a GeForce GTX 650 Ti Boost ... and that card is now in the system I tried building a few years ago, when the GPU shortage hit right as I was looking for a new GPU.

I agree: if there is a useful feature to use, then of course use it. I suppose I would have to try it and see the performance gain or loss for myself before agreeing or disagreeing one way or another.

This probably isn't the right place to ask, but is there some fast wireless display software or adapter that preserves performance from a PC to a TV or monitor? Or do certain graphics cards handle that better than others? I assume an HDMI connection will always perform best, just as Ethernet always beats Wi-Fi. I have been using TVs as my PC display for years now.
WiDi (Wireless Display) was a thing I remember testing over a decade ago on some laptops. That was eventually discontinued and superseded by Miracast. There's now support for up to 4K streaming, but I don't know how it does with latency. Certainly cabled connections will be better on the latency aspect, though for things like video (not gaming) the latency isn't a major issue.
 
Is it possible to share the exact settings used for Red Dead Redemption 2 at 1080p medium and ultra? It's annoying that the game doesn't have universal quality presets.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Is it possible to share the exact settings used for Red Dead Redemption 2 at 1080p medium and ultra? It's annoying that the game doesn't have universal quality presets.
Sure. I think I wrote this somewhere, but you're right: the lack of presets sucks! (It bases your "preset" preference on how much VRAM you have, not on potential GPU speed.)

  1. Set the preset to the minimum value, then set/confirm everything to minimum; unlock the advanced settings, and also set all of those to the minimum values.
  2. For 1080p medium, now go into the basic settings and put everything up one notch (mostly "medium" but also 2xAF). Do not enable DLSS, MSAA, or FSR.
  3. For the "ultra" settings, go back into the basic settings and turn everything to maximum (high/ultra/16xAF). Do not enable DLSS, MSAA, or FSR.
Those are my current settings, but I'm going to change things for the upcoming reviews. Basically, instead of the above, we'll start by setting everything to "medium/on" including the advanced options for medium, and for ultra everything will be at maximum. I'll still leave DLSS/FSR/MSAA settings off, for apples-to-apples, though I'll show FSR2 or DLSS on initial GPU reviews if the game supports either of those (but only on the reviewed card).
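If it helps to see it all in one place, here's a quick sketch of those two profiles as a small Python snippet. The names are just my shorthand for the in-game options, not the keys RDR2 actually uses in its settings file:

# Shorthand summary of the two RDR2 benchmark profiles described above.
# Labels are illustrative only; they are not the game's real setting names.
PROFILES = {
    "1080p medium": {
        "starting point": "everything (including advanced) at minimum",
        "basic settings": "one notch up from minimum (mostly Medium)",
        "anisotropic filtering": "2x",
        "DLSS / FSR / MSAA": "off",
    },
    "ultra": {
        "starting point": "everything (including advanced) at minimum",
        "basic settings": "maximum (High/Ultra)",
        "anisotropic filtering": "16x",
        "DLSS / FSR / MSAA": "off",
    },
}

for name, options in PROFILES.items():
    print(name)
    for setting, value in options.items():
        print(f"  {setting}: {value}")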
 
What most reviews are missing is how many watts the GPU draws at, let's say, 1080p 60 fps. Most reviews just benchmark a card's maximum potential, but not everybody runs their GPU flat out all the time.

From my own experience: I switched from a 1070 Ti, which was drawing around 200 watts in a lot of games at 1080p 60 fps ultra. Now I'm on an RX 6800 and the draw has dropped to around 60 watts; even at 1440p ultra (60 fps) I'm still under 100 watts.

Why did I go for the RX 6800? Well, I wanted a better GPU. RTX cards are a bit overpriced here in the EU, and AMD isn't far behind while being cheaper. Before buying I was set on a 6750 XT, but looking through several YouTube comparisons of the 6750 XT vs 6800 vs 6800 XT, the 6750 XT was drawing more power than the 6800 while producing fewer fps, and the 6800 XT was a few fps faster but its power draw was also higher (+10% fps for +30% power). So I went with the winner, lol, and indeed the GPU runs at very low wattage; undervolted it's even better. :)
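Just as a rough back-of-the-envelope illustration (the wattages are my own readings, and the four hours of gaming per day is only an assumption), here's what that difference adds up to over a year, plus the 6800 vs 6800 XT trade-off I mentioned:

# Back-of-the-envelope power comparison; all numbers are approximate.
old_watts = 200        # GTX 1070 Ti at 1080p60 ultra (my readings)
new_watts = 60         # RX 6800 at the same target (my readings)
hours_per_day = 4      # assumed gaming time, adjust to taste

kwh_saved_per_year = (old_watts - new_watts) * hours_per_day * 365 / 1000
print(f"Energy saved per year: ~{kwh_saved_per_year:.0f} kWh")     # ~204 kWh

# 6800 XT vs 6800 from the comparisons I watched: roughly +10% fps for
# +30% power, which works out to worse performance per watt.
print(f"6800 XT perf/W relative to 6800: ~{1.10 / 1.30:.2f}x")      # ~0.85x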
 
Sure. I think I wrote this somewhere, but you're right: the lack of presets sucks! (It bases your "preset" preference on how much VRAM you have, not on potential GPU speed.)

  1. Set the preset to the minimum value, then set/confirm everything to minimum; unlock the advanced settings, and also set all of those to the minimum values.
  2. For 1080p medium, now go into the basic settings and put everything up one notch (mostly "medium" but also 2xAF). Do not enable DLSS, MSAA, or FSR.
  3. For the "ultra" settings, go back into the basic settings and turn everything to maximum (high/ultra/16xAF). Do not enable DLSS, MSAA, or FSR.
Those are my current settings, but I'm going to change things for the upcoming reviews. Basically, instead of the above, we'll start by setting everything to "medium/on" including the advanced options for medium, and for ultra everything will be at maximum. I'll still leave DLSS/FSR/MSAA settings off, for apples-to-apples, though I'll show FSR2 or DLSS on initial GPU reviews if the game supports either of those (but only on the reviewed card).
Thanks a lot!
Btw does TAA stay off too?
 

King_V

Illustrious
Ambassador
Hey, @JarredWaltonGPU . . . somehow this one popped back into my head (I must have stumbled across a reference to it somewhere recently), but, by any chance, were you ever able to get hold of an RX 5300?

Yes, of course I mean on purpose, why do you ask? :LOL:

I realize it's got that whole 3GB VRAM issue and all, but, eh, what can I say? I can't shake my obsession with the budget boards.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Hey, @JarredWaltonGPU . . . somehow this one popped back into my head (I must have stumbled across a reference to it somewhere recently), but, by any chance, were you ever able to get hold of an RX 5300?

Yes, of course I mean on purpose, why do you ask? :LOL:

I realize it's got that whole 3GB VRAM issue and all, but, eh, what can I say? I can't shake my obsession with the budget boards.
Sadly (not really!), I do not have an RX 5300. I have a bunch of other old cards, but nothing like the 5300. There's a used one on eBay for $70... but I don't actually want the card. LOL
 

King_V

Illustrious
Ambassador
Sadly (not really!), I do not have an RX 5300. I have a bunch of other old cards, but nothing like the 5300. There's a used one on eBay for $70... but I don't actually want the card. LOL
Yeah, I saw that one. I did a bit of digging, and I've read hints about its performance ranging from "it's about where a 1650 is" to "edging out a 1650 Ti" (??)

The curiosity is actually making me think I might grab one of those for $70. Of course, the "used" but also "more than 10 available" would have me asking where they came from first. Not that I can imagine the RX 5300 being used for mining...

EDIT: a couple of grammar/clarification fixes
 

King_V

Illustrious
Ambassador
@JarredWaltonGPU - I don't know whether to be proud or ashamed, but I actually, after a bit of hemming and hawing, ordered one. It arrived today.

Yep, I'm now the owner of an RX 5300. I suppose I'll play with it soon, but other things do keep getting in the way. After all, I'm also the guy who did this, so whatever I try, hopefully I won't wait nearly that long!
 

JarredWaltonGPU

Senior GPU Editor
Editor
@JarredWaltonGPU - I don't know whether to be proud or ashamed, but I actually, after a bit of hemming and hawing, ordered one. It arrived today.

Yep, I'm now the owner of an RX 5300. I suppose I'll play with it soon, but other things do keep getting in the way. After all, I'm also the guy who did this, so whatever I try, hopefully I won't wait nearly that long!
By "play with it soon," do you mean in less than eight months? Or more than eight months? Enquiring minds want to know! :ROFLMAO:
 
