MOBO: MSI B450 Tomahawk
CPU: Ryzen 7 2700X
RAM: 2x16GB Corsair 3200 MHz
Storage: 512GB SSD, 4TB Seagate
PSU: EVGA 850 GQ
GPU: MSI RX 6600 XT
My question: now that I've upgraded to a Radeon RX 6600 XT, none of my games are detecting this GPU, and they all auto-select the lowest settings. Why is this happening? Even when I manually set everything to the highest settings, the picture quality looks worse than on the old GTX 970 I was using, which makes no sense. This GPU should be a powerhouse, but instead it's performing like some DDR3. In the Radeon Software app that manages the card's settings, it reports "marginal" performance even in games that are 12+ years old. I don't know what to do.