OK, so, informally, I'm calling my other thread here (the RX 5600 non-XT OEM one) Chapter 1 of this series, even though I bought the RX 5300 (used) some months earlier, in late February 2023. Will there be a Chapter 3? No idea. Probably, if I find something weird enough that I buy it.
So, the RX 5300. Somewhat odd, definitely hampered by the 3GB of VRAM. But, judging by the GPU-Z specs, it's got a 96-bit bus with 168 GB/s of bandwidth, so, like its bigger brothers, it's sporting 14 Gbps VRAM. Kinda neat, wouldn't have expected that.
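If anyone wants to see where that 168 GB/s figure comes from, here's the back-of-the-envelope math from the GPU-Z numbers (just my own arithmetic, treat it as a sanity check rather than anything official):

```python
# Sanity check on the GPU-Z readout:
# bandwidth (GB/s) = bus width in bytes * effective data rate per pin (Gbps)
bus_width_bits = 96
data_rate_gbps = 14            # effective GDDR6 rate per pin
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")   # -> 168 GB/s, matching GPU-Z
```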
I'm sticking for now with FurMark and Borderlands 3, since the latter is the only game I have that's part of Jarred's raster-only test suite, which makes it easier for me to cross-reference performance numbers here. That BL3 is an AMD-optimized title probably works to this little card's advantage.
Side note: so, uh, @JarredWaltonGPU, remember that question you asked me back here? Well, I was sure it would be less than eight months, but, turns out, it became MORE than eight months. Nine months, it seems. I hang my head in shame!
So, my quick and sloppy benchmarking: for FurMark, I rely on the info FurMark itself displays, and for BL3, I rely on its overall FPS calculation, plus the Adrenalin overlay to note max temperatures and power draw.
FurMark 1080p preset: 3855 points / 64 fps
- Peak power draw: 83W
- Peak temperature: (I forgot to note this, guess I'm gonna check again)
Borderlands 3, 1080p:
- Peak power draw: 69W
- Peak temperature: 72°C
- Medium settings, DX12
- First run: 57.5 fps
- Second run: 61.1 fps
- Third run: 64.4 fps
- Fourth run: 64.1 fps
- Badass settings, DX12
- First run: 36.6 fps
- Second run: 35.7 fps
- Third run: 36.0 fps
- Fourth run: 33.6 fps
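For anyone who'd rather see averages than four separate runs, here's a quick way to crunch the numbers above (just a throwaway sketch, not part of my actual process):

```python
# Quick crunch of the BL3 1080p runs listed above.
from statistics import mean

medium_fps = [57.5, 61.1, 64.4, 64.1]   # Medium settings, DX12
badass_fps = [36.6, 35.7, 36.0, 33.6]   # Badass settings, DX12

for label, runs in (("Medium", medium_fps), ("Badass", badass_fps)):
    print(f"{label}: avg {mean(runs):.1f} fps, "
          f"spread {max(runs) - min(runs):.1f} fps across runs")
```

That works out to roughly 61.8 fps average on Medium and 35.5 fps on Badass. The Medium numbers creep up run over run, so the first pass is probably dragging that average down a bit.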
Definitely better than I expected, especially considering that, as I'm typing this, I realize the BL3 tests in the 1650 GDDR6 review used Ultra rather than Badass settings. I'll probably swing back with Ultra numbers for more of an apples-to-apples comparison.
Because if there's anything I know... it's that there are at least three people in the world besides myself who care about these low-end cards 🤣