KnightsCross

Distinguished
Feb 7, 2015
Specs
Motherboard: B450 Tomahawk
CPU: Ryzen 7 2700X
PSU: EVGA 850 GQ
RAM: 32GB 3200MHz Corsair Vengeance RGB Pro
Cooler: Corsair 240mm RGB
Drives: WD 512 GB SSD, 4TB Seagate Storage
GPU: MSI Mech 2X Radeon 6600 XT

So I previously posted about my system not identifying new hardware, the new 6600 XT GPU. I finally got the system to identify it, and now I'm facing a new problem. Every time I install a new game or try to tweak some settings, the game auto-detects the 6600 XT and defaults to low settings. I mean even 12-year-old games do this. It didn't happen with the GTX 970 I owned, so why the heck would it happen with a 6600 XT? I can't really tell if it's running worse than it should be, but the Radeon Software app thinks it is. For example, when I run Company of Heroes 2 or Fallout: New Vegas, older games I often test things on, the Radeon app says the card is only providing "marginal" performance. Any ideas? I have pics to upload to show the settings within the app.

Edit: I tried to use Imgur to link the pics to my post. I tried embedding and everything else, but I just kept getting an error message telling me to contact the administrator. Apparently I am stupid. Anyone care to share how to share pics in a forum post on Tom's?
 
I've seen some older games set the quality to the lowest settings even though the card (an RTX 2070 Super) can run the game and ask for fifths. The game probably doesn't recognize the video card, or whatever detection test it performs doesn't really work on newer cards, so it defaults to the lowest setting.

I'd ignore whatever the games are trying to do and manually set the settings.
 

KnightsCross

Distinguished
Feb 7, 2015
I've seen some older games set the quality to the lowest settings even though the card (an RTX 2070 Super) can run the game and ask for fifths. The game probably doesn't recognize the video card, or whatever detection test it performs doesn't really work on newer cards, so it defaults to the lowest setting.

I'd ignore whatever the games are trying to do and manually set the settings.
I intended to do that, but the Radeon Software program telling me it's only performing marginally has me spooked. Did I really spend $600 on this for, well, THIS?
 

KnightsCross

Distinguished
Feb 7, 2015
I honestly think it's underperforming. Tracking the FPS on my 144Hz monitor with the app set to push as many frames as possible, it never gets above 100-110. Image quality doesn't seem as good as the 970's.
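
For what it's worth, here's the rough math I'm doing in my head, as a minimal Python sketch. The 100-110 figures are just what I'm seeing in games, not a proper benchmark; it just converts refresh rate and observed FPS into per-frame times.

def frame_time_ms(fps: float) -> float:
    # Milliseconds the GPU gets to render one frame at a given frame rate.
    return 1000.0 / fps

for label, fps in [("144 Hz monitor", 144), ("observed low", 100), ("observed high", 110)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms per frame")

So the card would need to hit roughly 6.9 ms per frame to saturate the monitor, and it's sitting around 9-10 ms instead.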
 

KyaraM

Admirable
Try running a benchmark like 3DMark and see what the result is there. Anything around average should be perfect; higher is usually only achieved through tuning and OC. 3DMark especially is really good for comparing performance. I can't tell you what FPS you should have in those games, though. That's why I suggest an actual benchmark. They aren't perfect, but what is?
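
If it helps, here's a rough sketch of the comparison I mean, in Python. The numbers are placeholders, not real results; plug in your own graphics score and the average score 3DMark shows for your GPU/CPU combination.

def percent_of_average(your_score: float, average_score: float) -> float:
    # Your graphics score expressed as a percentage of the reference average.
    return 100.0 * your_score / average_score

# Hypothetical placeholder values -- replace with your own 3DMark result
# and the average reported for the same GPU/CPU pairing.
your_score = 8000.0
average_score = 8200.0

print(f"You are at {percent_of_average(your_score, average_score):.1f}% of the average.")

If you land within a few percent of the average, the card is doing what it should.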
 
Solution