
Question: The Ghost In The Machine

Anomaly_76

Jan 14, 2024
Hello, all... Obviously I am new here. This is a bit long, but I tried to keep it short where I could.

I don't consider myself an expert, rather, I know a little about this and that, and in some cases, just enough to get myself in trouble. That said, I wondered if anyone had any thoughts on this.

To try to shorten a long story: after suffering with ill-informed upgrades to my first gaming PC (a pre-built), I built an entirely new system in March of '22 with the following.

Ryzen R9 5900X
Asus Tuf B550-Plus ATX board
Crucial Ballistix CL16 BL2K16G32C16U4B (2x16, DDR4-3200)
Scythe Mugen 5 Black Edition
Asus KO RTX3060ti 8GB-OC
Corsair RMx850 ATX PSU
WD Blue SN570 1TB PCIe3 M.2
WD Black Performance 8TB HDD
Corsair 4000X RGB case
Corsair Commander Core XT with six Corsair LL120 fans


The board proved 5000-series ready as advertised and fired up the first time, but this system has driven me nuts from day one. It started with random no-video POSTs, leading to a refusal to wake from sleep mode and a few random crashes. No amount of testing utilities revealed anything.

Building a test bed with an ASRock B450M-HDV and an eBay 3600X revealed the same or worse issues. Swapping certain components with my original prebuilt didn't help. Ultimately, I duplicated the no-video POST with one of the same DIMMs. I then discovered that the Ballistix BL2K16G32C16U4B kit wasn't on the QVL for any of the three boards involved.

After installing a Ballistix BL16G36C16U4B.M16FE1 kit from the ASRock's QVL, the 3600X test bed ran perfectly. A Patriot Viper Blackout PVB432G320C6K kit cured the 5900X's no-video POST and random crashes. Apparently it was memory-training related, though I have no idea specifically why.

My old prebuilt also ran its Ballistix BL2K16G32C16U4B at different speeds. I installed a Kingston HyperX Fury HX421C14FBK4/32 kit (from the board's QVL) in that prebuilt, which cured most, if not all, of its hiccups. Hence, it would seem that despite being previously told Ballistix were most recommended for Ryzen, these CPUs and supporting boards are finicky about RAM.

In addition to the PV432G320C6K swap, I've also since recased the 5900X build with a Fractal Pop XL Air, adding an ElGato 4K60 Plus, a second WD Blue 1TB SN570 and a Pioneer BDR212-DBK Blu-Ray Writer. I originally used a 55" Sceptre U550CV-UMR for display. I got a 75" HiSense A6 after the 55" died.

However, my troubles were not 100% over. But now that you know the backstory and the hardware involved...

I am an avid player of American Truck Simulator and BeamNG.drive. I have noticed a number of less problematic issues remaining...

The GPU runs at 99% constantly, and ATS has an annoying tendency to pause in animation, appearing as a slight hitch every 20-30 scale feet of highway, plus segmented refresh, especially in certain camera views.

I've somewhat curbed this by using lower resolutions (dropped from 4K to 1440p, then to 3482x1836) and DLSS, limiting FPS to 60, optimizing my display resolution for the viewing distance (approximately 6.5-7', with 3482x1836 appearing best), and matching the frame cap to my display's refresh rate (60 Hz).
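For a sense of how much the resolution drop alone helps, here is a quick back-of-the-envelope pixel count (the 3482x1836 figure is the custom resolution from the post; the first-order assumption that GPU shading work scales with pixel count ignores CPU-bound and fixed per-frame costs):

```python
# Rough pixel-count comparison for the resolutions mentioned above.
# Fewer pixels to shade per frame means roughly proportionally less
# GPU work (a first-order approximation only).

resolutions = {
    "4K (3840x2160)": 3840 * 2160,
    "custom (3482x1836)": 3482 * 1836,
    "1440p (2560x1440)": 2560 * 1440,
}

baseline = resolutions["4K (3840x2160)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / baseline:.0%} of 4K)")
```

The 3482x1836 custom resolution works out to roughly three quarters of the 4K pixel load, which lines up with it being a usable middle ground between 4K and 1440p at that viewing distance.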

However, while the game looks and plays much better, I still notice occasional slight pauses. I have also noticed momentary dimming of colors to greyscale or close to it, which I have also noticed in things like tail lights in BeamNG as well (about the only problem I have in BeamNG).

This is what Afterburner reports at this time.

5900X: Boosts to 4950-5025 MHz, averaging about a 65-85% load on the primary core. Thermals are optimized, peak temp of 58-62C.
RTX 3060 Ti: Boosts to 1835-1905 MHz, averaging about 45-65% load. Thermals are optimized, peak temp of 58-59C. Using about half of its 8GB VRAM.
About 11GB of 32GB DRAM used.

It certainly doesn't appear to be a bottleneck. I'm curious, is this possibly just limitations of the gaming engine used that have nothing to do with my configuration?

A couple weird notes that might help...

Asus listed the Patriot kit's timings at 16-18-18-36; however, the standard DOCP came up with 20-40-40-40. I entered custom timings, but I've experimented with the 20-40-40-40 it wants to default to, and I'm not sure whether that's a factor or not.

Also, I use a wireless Logitech MK520 keyboard / mouse combo, and I've noticed an occasional input lag whether gaming or not. I've also noticed a severe lag in Windows Explorer when accessing my 8TB HDD. Admittedly, the 8TB is over half full with video recordings, doubling as a Plex media server, but I don't think that should affect a game that's run from a completely different drive (the second SN570), especially with this hardware combo.

For those who've taken the time to read this, I appreciate it, it's a lot to digest and as I said, has been driving me nuts for nearly two years. I like my 5900X, but I think with better information, I would have spent a bit more money on upgrading the prebuilt (CyberPower, B450M Bazooka / 1700 / Tuf GTX1650S-4GB-OC) in the first place with a 3090ti and the HyperX Fury HX421C14FBK4/32 kit.
 
Last edited:
Just as an immediate thought - the PSU is a likely culprit. It may be nearing or at its designed-in EOL (end of life), especially with a history of heavy gaming use and continual peak power demands.

Take a look in Reliability Monitor / History and Event Viewer. What error codes, warnings, or even informational events are being captured?

Increasing numbers of errors and varying errors make the PSU more suspect.

Reliability History is much more end-user friendly, and the timeline format can be revealing.

Event Viewer requires more time and effort. To help:

How To - How to use Windows 10 Event Viewer | Tom's Hardware Forum (tomshardware.com)

Objective being to discover specific details and information to help with the troubleshooting.
 

I appreciate the reply. I'd had my suspicions of PSU issues at some point, because at various points, anytime the system was shut down or lost power, the GPU and mobo RGB would go to rainbow puke mode until the PSU's rocker switch was turned off.

However, there was only one really hard crash initially, which appears to have been an FP connector seating issue. The no-video POST disappeared with the RAM swap. That's what makes this so confusing.

I happen to have an APC UPS. PowerChute reports about 450W peak power usage when gaming. I'm aware the 5900X is known to reach or exceed 200W, and the KO RTX 3060 Ti is listed in a few places as exceeding 200W as well.

As for your thoughts on peak power demand, I think what you're getting at is a manufacturing flaw in this specific unit, which is entirely possible; I just don't understand why it would have run for two years without failing. I should think an RMx850 could withstand at least 650W continuous peak, all things equal. Unless, as I presume you're thinking, the PSU had a flaw to start with.
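A rough power budget using the numbers already in this thread supports that headroom argument. The ~100 W allowance for "everything else" (board, RAM, drives, fans, USB devices) is an assumption, not a measured figure:

```python
# Back-of-the-envelope power budget; the CPU and GPU figures come from
# the thread, the ~100 W "everything else" number is a guess.
cpu_peak_w = 200   # 5900X reported to reach or exceed 200 W under load
gpu_peak_w = 200   # KO RTX 3060 Ti listed at ~200 W in a few places
rest_w = 100       # board, RAM, drives, fans, peripherals (assumption)

estimated_peak_w = cpu_peak_w + gpu_peak_w + rest_w
psu_rating_w = 850
headroom_w = psu_rating_w - estimated_peak_w

print(f"Estimated peak draw: {estimated_peak_w} W")
print(f"Headroom on an 850 W unit: {headroom_w} W "
      f"({headroom_w / psu_rating_w:.0%} of rating)")
```

The estimate lands near the ~450 W peak PowerChute reports, so a healthy RMx850 should be loafing; that is why a unit-specific flaw (rather than undersizing) would be the suspicion.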

Armoury Crate, being the PITA that it is, put up quite a fight before letting me set the 'Shutdown Effect', which cured this to a point; but after a long, hard battle with the GPU's RGB, I finally disconnected the GPU RGB entirely.

Directly clicking the posted link does nothing. I tried to open it directly from email, and it says the link is invalid. Trying to copy the link does nothing.
 
Last edited:
Update. So I was digging around and I found a few interesting threads about FPS, FPS caps, and Performance / Quality Switches on GPUs.

In my crusade to solve my problem, I have read dozens of threads about FPS. Everyone's saying, "More FPS is better!" etc... Which apparently is not entirely true in all scenarios.

Having already seen one such FPS cap thread that specifically stated 60 FPS cap for American Truck Simulator, I did exactly that, which if it helped at all, only helped slightly.

Today, I ran across one thread in which the poster stated that it was better to cap the max to ensure a more consistent FPS, and another explaining that game capture cards can actually induce an average of 200 ms lag, which works out to about 5 FPS in most cases. Wheels started turning.

And then I ran across another thread that went so far as to state that a video capture card was 100% unnecessary for simply recording gameplay on the same PC. Say what?

Apparently the ElGato 4K60 Plus I have is generally intended for patching in external sources such as gaming consoles for streaming, and I need only set up OBS Studio for recording straight from my GPU, using the .FLV format to eliminate hiccups. Ok, now I'm mad. I discovered the .FLV thing through a YT video not long after I got the ElGato because I was still experiencing hiccups.

And thinking more about it, my Tuf B550-Plus is likely to limit GPU performance to PCIe3 speeds due to some sort of bifurcation when the PCIE16x2 is populated. So I really should have set the PCIe speeds to 3 in BIOS.

I was just about to do this when I found another thread in which the poster stated that locking their FPS at just below their refresh rate 100% cured their hiccups. And after thinking about it, it made sense. Seriously, when you think about it, what's the point in generating more frames than the display can physically render in a second?
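The frame-time arithmetic behind that "cap just below refresh" advice is easy to sketch. If the GPU delivers frames slightly slower than the display scans out, each refresh has at most one new frame waiting, which avoids tearing mid-scan (this is the reasoning from the thread, not a guarantee for every engine):

```python
# Frame-time vs refresh-interval arithmetic for a 60 Hz display.
# Capping FPS just below the refresh rate makes each frame take
# slightly longer than one scanout, so frames never outrun the display.

refresh_hz = 60
refresh_interval_ms = 1000 / refresh_hz   # ~16.67 ms per scanout

for fps_cap in (60, 57, 53):              # caps tried later in the thread
    frame_time_ms = 1000 / fps_cap
    print(f"{fps_cap} FPS cap -> {frame_time_ms:.2f} ms/frame "
          f"(refresh interval: {refresh_interval_ms:.2f} ms)")
```

At a 57 FPS cap, each frame takes about 17.54 ms against a 16.67 ms scanout, so the display is never waiting on a half-finished frame; anything above 60 FPS on this panel is work the display physically cannot show.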

However, I was pretty sure my 75" Hisense was 75 hz. Or was it? Double checked and found it was actually... 60 hz.

So I gritted my teeth, shut down the system, carefully removed the ElGato, double-checked that the GPU switch was set to P, then started back up. I started to set PCIe to 3, but while I was in there, I did finally find an option to set Aura to stealth mode.

After booting, I set the game specific FPS cap at 57. It now seems to play and display perfectly.

I still wonder about the PSU though. It's certainly a thought.
 
"suffering with ill-informed upgrades"
This should be the title of the post

Should've gotten a Ryzen 5800X3D; it raises the low lows (meaning the low FPS dips you see as stutter).
The 3060 Ti does have a 256-bit bus, but its GDDR6 memory isn't as fast as the GDDR6X in the 3070 Ti and up (a wider bus and faster memory help eliminate stutter; more memory, 12GB+, helps too).
The Hisense U8/U8K has 120 Hz screens.

"Tuf B550-Plus is likely to limit GPU performance to PCIe3 speeds due to some sort of bifurcation when the PCIE16x2 is populated"

No. According to your manual, the first PCIe slot remains PCIe 4.0 x16 regardless.
The second slot (PCIe 3.0 x16) is affected only if any of the other three PCIe slots are used.

Edit: I think you are getting stutter from recording. I don't know the details. Are you recording in 4k @ 60 FPS? Do you have another SSD that you are recording to?
 
Last edited:

I went back and double-checked the online manual. There is indeed a single * remark in that section mentioning bifurcation, but strangely, there is no item marked with a single * in that specific section. Very confusing. I'll have to look around and find my original manual to confirm that also applies to my Jan 2022 board.

Interestingly, I know someone who has both a 5800X3D and a 5900X, and can't decide which one they like better, frequently swapping them.

The 5800X3D might well have been a better choice, for most. And perhaps, for me as well. However, I use my rig as a Plex Media server as well, recording DVDs in the background while I do other things, and it seemed like such use would benefit from more cores.

I also had intentions of recording gameplay for social media / YouTube, and not being aware at the time of a higher-end CPU's limited benefit in all gaming scenarios, figured the lower I could get my overall CPU load, the better. But perhaps I overthought it.

In retrospect, I rather think I would have been better off swapping my prebuilt CyberPower 1700 / B450M Bazooka rig to a better ventilated case with a Scythe Shuriken / Fuma (or perhaps a Mugen just for the hell of it), 1000W PSU, 3090ti, and the 8TB WD Black HDD I use for movie storage. Would have probably saved myself a lot of money, irritation, and time.

Since then, I have been given a mid-life diagnosis of autism, which more or less means I can't focus enough to effectively and consistently handle such everyday tasks as video editing. I get frustrated just recording DVDs to my PC with the little bits I have to snip from the beginning and end. Yes, I am aware of Handbrake. Yes, I have it. No, it does not work on everything.

As for the Hisense U8, that would have been nice. However, I game from bed about 7 feet from the display due to space constraints and respiratory issues from even minor exertion. And being on a fixed income from disability, I couldn't justify $1600 for a U8 when the 75" A6 I got was just over $500 on sale. Maybe if I had understood a few things more clearly sooner, I could have. But it is what it is.

After all, since reinventing myself on YouTube isn't likely since life has turned out to be what happened while I made other plans, it no longer has to be butter-smooth, razor-sharp and perfect. But I would like it to be playable. :)

Perhaps later, I can do some wheeling and dealing and pick up a good 3090ti and a 5800X3D if I can get a reasonable price for the 5900X and 3060ti.
 
Last edited:
It's possible, I guess. I've actually wondered if recording was a factor. But it's happened even when not recording, and now I still have tearing without the capture card and not recording.

As far as SSDs go, I usually try not to write / delete / rewrite to SSDs, as I know this can cut down on their service life. However, I recently added a second M.2 that I may consider using for recording in the future. That said, since I'm not likely to do much with recording at this point other than showing friends my WTF moments, we shall see.

I was originally playing / recording 4k, attempting 60 FPS, but was willing to settle for 30 if need be. The biggest problem I was having was the recording was always crappy. Like it automatically downscaled it to 480p or something. I've gotten better-quality results from downscaling to 1080p, but that annoying tearing was a persistent issue.

At this point, ATS at a 55 FPS cap with medium SMAA / SSAO, setting weather, mirror resolution / quality to low, AF midway (8x?), with everything else high / ultra at 200% scaling, plays and looks great at 3482 x 1836. I'm wondering now if I dare try 4k, since my peak GPU load is about 58-62%, but maybe I should just leave it alone.
 
Last edited:
Annnnd, now I REALLY feel like an idiot. Went back to double-check something. I was wondering why this was a johnny-come-lately problem out of nowhere.

Well, my 75" (a 60 Hz I thought was a 75 Hz) was supposed to be an upgrade from a 55" that died around July. Welp, turns out that 55" was 120 Hz. Seems like this issue started about that time. So that's why I didn't have this problem before. Mystery solved. Sounds like @WarWolverineWarrior is on to something with the U8 suggestion.

What sucks is that now I have to either learn to live with minor tearing at 58-60 FPS (not liking that idea) or figure out a way to get a 120 Hz model. Fortunately, I may have some options that won't sting too much.
 
Last edited:
So, I think I've found a happy medium that I can live with until the right deal pops up for a 120 Hz display. A 53 FPS cap seems to be pretty stable and smooth at 3482x1836 with DLSS on, 200% scaling, high texture quality, and pretty much everything else at low to medium settings. ATS looks and plays about as decent as I think I'm going to get it on a 60 Hz display.

Interesting phenomenon, though. Experimenting with settings, I get artifacting with higher detail and scaling at 1440p, but at 4K with lower detail, I don't.

I never saw artifacting with this card when gaming before, only in Paint 3D with Armoury Crate running. And why would it artifact at lower resolution with higher detail and not at higher resolution with lower detail?