Hi, I recently bought myself an RTX 3070. Last year I built a new PC but carried over my previous GPU (a GTX 980) since I couldn't afford a new one at the time, so I decided to upgrade this year instead.
It performs pretty well in most games at ultra settings, even with RT turned up to max. However, in Cyberpunk 2077, the Witcher 3 next-gen upgrade and Red Dead Redemption 2, the card almost seems to be underperforming.
In Cyberpunk, having RT enabled drops my framerate into the 30s and 40s. Same with the Witcher 3 next gen. When I turn RT off, the framerate still isn't as good as I was expecting, typically hovering around the 50s with occasional dips.
In RDR2, I was under the impression (judging by videos of the game at 1080p on this hardware) that it would run flawlessly on ultra settings, especially since I only have a 900p monitor. However, I had to turn some settings down and enable DLSS just to keep things around 60 FPS.
This is confusing and somewhat disappointing to me. These are totally playable framerates for me (I used to play games at 20 FPS on a laptop in the days before I had a rig), but I'm here to ask whether this is normal or not.
DLSS helped in RDR2, but it didn't make any difference in Cyberpunk or the Witcher 3 remaster. Some info I found suggested CPU bottlenecking, but I didn't think that would be a problem with my CPU. That said, those two games never seemed to bring my GPU usage anywhere near 100%... but my CPU usage wasn't near 100% either, which is why I'm rather confused.
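In case per-core numbers would be more useful than the overall readout, here's a rough Python sketch of how I could log usage while playing (assuming psutil is installed and nvidia-smi is on the PATH). The idea is just to catch a single pegged core that the averaged CPU figure might be hiding.

```python
# Rough sketch (assumes Python 3 with psutil installed and nvidia-smi on the PATH).
# Prints per-core CPU usage plus GPU utilization once per second, so a single
# maxed-out core can't hide behind a low overall CPU percentage.
import subprocess

import psutil


def gpu_utilization_percent() -> int:
    """Ask nvidia-smi for the current GPU utilization as a whole-number percent."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])


if __name__ == "__main__":
    while True:
        # cpu_percent(interval=1.0) blocks for one second and measures over that window
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(
            f"GPU {gpu_utilization_percent():3d}% | "
            f"busiest core {max(per_core):5.1f}% | all cores {per_core}"
        )
```

Happy to run something like this during a Cyberpunk session and post the numbers if that would help.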
Is there something I can do to fix this, or do I just have to accept the disappointing performance? Considering I spent over $500 for a new GPU, I was hoping for better.
Thanks in advance for any help and advice. If I need to provide more info, please let me know.
SPECS:
- OS: Windows 10 Pro 64-bit (22H2)
- CPU: AMD Ryzen 5 5600G
- RAM: 16GB (2x8GB) DDR4
- Mobo: ASUS Prime B450M-A II
- GPU: Gigabyte GeForce RTX 3070 Gaming OC 8G (REV2.0)
- Storage: 2x Kingston SATA SSDs (480GB), 1x Seagate 7200RPM HDD (2TB)