Question: RTX 3080, 5800X, extremely poor performance

Dec 25, 2022
On my 1440p monitor:
Apex Legends: ranges from 55-80 fps on the ground depending on whether I'm in a fight or looking at the sky, and barely 30 fps in the ship flying over the map; changing graphics settings, low or ultra, doesn't change the fps (compared to benchmarks averaging 180-200 fps).
Cyberpunk: 40 fps on ultra settings, barely 30 with RTX on.

Since I moved home briefly for the holidays and couldn’t take my 1440p monitor with me, I’m stuck with a 1080p one for now.

On my 1080p monitor:
Valorant: 110 fps average at max settings
Elden Ring: rarely a shaky 60 fps, drops to a constant 25 fps in windy areas with leaves; changing graphics settings doesn't change the fps
Cyberpunk: same thing as the 1440p monitor, if not a little worse
Apex Legends: no difference from 1440p
Staxel, which is literally a blocky, Minecraft-looking game: barely 35 fps

For all of these games except for Cyberpunk, changing settings from ultra to low doesn’t affect the fps.

Specs:
Ryzen 7 5800x
RTX 3080 Founders Edition
ASUS ROG Strix B550-A
Corsair RMx 850W Gold
Samsung 970 Evo 1TB SSD
Corsair Vengeance 4x8GB DDR4-3200

For games like Elden Ring and Cyberpunk, GPU utilization is always above 90% while CPU utilization hovers around 40%. I haven't checked utilization for easier-to-run games like Valorant. CPU temps don't go above 70C and GPU temps don't go above 62C. UserBenchmark tells me my GPU is performing at the 7th percentile, sometimes the 8th, while everything else is close to average.

I've tried enabling ReBAR, turning on DOCP, changing the power management settings in both Windows and the NVIDIA Control Panel, using DDU to clean out old drivers, and making sure I have the latest graphics drivers. I'm not daisy-chaining the power cables to the GPU either. Today I swapped in a new PSU to see if that was the problem, since my peripherals, especially the keyboard, would sometimes randomly disconnect despite the cable testing fine. (This was a problem in my old PC that a new PSU fixed.) None of these changes made a difference. The whole thing blue-screened a couple of days ago.

This is really disheartening because it's my first build, and yet it's performing worse than my old Dell prebuilt with a 1660 Ti shoved into it, which could run Elden Ring and Cyberpunk at high and medium-high settings respectively at a solid 60 fps. Is there anything I can do? Did I just happen to get a faulty GPU? Any help would be appreciated.
 
Dec 25, 2022
I'm stuck with a 1080p monitor for now since I moved back to my parents' place for the holidays, but when I get back I'll be able to test all this at 1440p. I just figured it was really weird that my PC wasn't getting close to the same fps as benchmarks with the same specs at 1080p.
 
Dec 25, 2022
Now, this is old, but I've finally discovered what I was doing wrong. There were two PCIe slots, and I had put the GPU in the slot furthest from the CPU. Poking around in my BIOS, they both read PCIEX16 and I thought they were the same, but apparently not: there was a paragraph in the BIOS telling me that for single-GPU setups you should use the slot closest to the CPU. With nothing left to try, I swapped the GPU's position and the fps is WORLDS better.
 

Karadjgne

PCIEX16 is the slot type, as in the slot physically and electrically accepts up to x16-capable cards. Don't confuse that with communication speed/bandwidth, which can be x1, x2, x4, x8, or x16 depending on the motherboard and the other slots. Most consumer boards with two x16-length slots come in two varieties: they run either x8/x8 or x16/x4 when the second slot is populated, and if other x1 or x4 slots are also populated that can drop to x2 or even x1, with x8/x8 boards generally disabling the shared slots instead.

It's the communication bandwidth that changes, not the actual slot type. So with the card in the second slot you were most likely getting x4, possibly x2, depending on the rest of your storage/PCIe configuration, instead of the full x16 bandwidth the card is capable of.

The top slot uses PCIe lanes wired directly to the CPU, which is why it can run at x16; the secondary slot uses lanes from the motherboard chipset, which have to be shared with storage and other devices.
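For anyone who wants to verify this without opening the case: on an NVIDIA card you can ask the driver what link width the card actually negotiated. Here's a minimal sketch (assumes `nvidia-smi` is on your PATH; the `pcie.link.width.current`/`pcie.link.width.max` query fields are from NVIDIA's documented `--query-gpu` options). If current comes back as 4 while max is 16, the card is likely sitting in a chipset slot like the OP's was.

```python
import subprocess

def parse_link_width(csv_line):
    """Parse one line of nvidia-smi CSV output, e.g. '4, 16' -> (4, 16)."""
    current, maximum = (int(v.strip()) for v in csv_line.strip().split(","))
    return current, maximum

def query_link_width():
    """Return (current, max) PCIe link width for the first GPU.

    (16, 16) means the card negotiated the full x16 link; something
    like (4, 16) suggests it is in a bandwidth-limited chipset slot.
    Note: some cards drop link width at idle to save power, so check
    under load before concluding the slot is the problem.
    """
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=pcie.link.width.current,pcie.link.width.max",
         "--format=csv,noheader"],
        text=True,
    )
    return parse_link_width(out.splitlines()[0])
```

GPU-Z's "Bus Interface" field (with its render-test button to wake the card out of its idle power state) shows the same information on Windows if you'd rather not use a command line.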