So I was forced into a GPU upgrade when one of my two GTX 1080s took a long walk off a short pier. I was not thrilled: performance on my old X79 platform at 4K was good enough back when SLI was supported, and I had heard what a "huge" bottleneck older CPUs (even the newish AMD Ryzen 2000 series) supposedly are on the RTX 2080 Ti and non-Ti. While my experience as a 4K HDR gamer made me doubt those rumors were true and/or applied at such a high resolution, I was still hesitant to make the purchase all the same. I mean, let's face it, the pricing of RTX cards is a little insane. But after picking up one of MSI's newest Seahawk variants, the RTX 2080 Ti EK (non-X, i.e. a new SKU with a standard 300A stock-OC chip, 1350MHz base / 1635MHz boost), and chucking it into my rig, my fears quickly evaporated.
I was very unhappy to see MSI cripple the newest Seahawk variant by imposing a power limit of only 110% on a custom-water-loop-cooled card, but seeing that my old X79 platform with an overclocked Xeon E5-1680 V2 (see signature for full rig details) did indeed still have very solid gaming chops at 4K, a smile crept onto my face despite my displeasure with MSI. Comparing my benchmark runs against other websites, I was within a frame or two per second of even the fastest CPUs on the market at 4K, showing just how GPU-bottlenecked that resolution still is today. When I dialed things back to 1440p out of curiosity, my GPU's performance numbers tended to fall somewhere between an AMD Ryzen 1700X and 1800X, a commendable feat in itself, but the first sign of the older CPU architecture holding back my GPU, even if just a little.

There are some caveats to this whole scenario. The biggest one is that I have to disable the Spectre and Meltdown updates, using InSpectre, to achieve this performance. If I don't, my numbers drop by around 10-15%. I can also just disable Hyper-Threading, but my numbers were still a little low that way, in the range of about 3%. Not a huge number, but one worth noting all the same. InSpectre is easy to use, though it does require a restart to kick in. I generally restart before playing games anyway if I have been using my PC for other things, to ensure my rig is running at peak performance. Nothing worse than a zombie task eating up CPU cycles or a memory leak hogging your RAM!
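For the curious, as I understand it InSpectre flips the same documented registry values Microsoft describes for manually controlling these mitigations (FeatureSettingsOverride and FeatureSettingsOverrideMask). Here's a minimal Python sketch to check your current state; it only reads the values, so it's safe to run, and I'm assuming a standard Windows install:

```python
# Check whether the Spectre/Meltdown mitigations are overridden in the
# registry (the values InSpectre toggles, per Microsoft's documentation).
# Read-only: this does not change anything, and needs no admin rights.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

def read_value(name):
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, name)
            return value
    except FileNotFoundError:
        return None  # value absent -> Windows default (mitigations enabled)

override = read_value("FeatureSettingsOverride")
mask = read_value("FeatureSettingsOverrideMask")

if override == 3 and mask == 3:
    print("Spectre/Meltdown mitigations are DISABLED (override=3, mask=3).")
elif override is None and mask is None:
    print("No override present: Windows defaults (mitigations enabled).")
else:
    print(f"Custom state: FeatureSettingsOverride={override}, Mask={mask}")
```

A restart is still required for any change to these values to take effect, which matches what InSpectre tells you.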
So what does all this mean? Most importantly, it means I have a viable CPU platform for gaming for another couple of years, but it also tells a story about PC gaming in general. People frequently think higher resolutions require more CPU horsepower to run. This simply is not true. The CPU's work per frame is roughly the same at any resolution; as resolution climbs, each frame costs the GPU more, frame rates fall, and the CPU has an easier time keeping up. For this reason, when someone tells me they need to upgrade their CPU for better frame rates at very high resolutions, I tell them to check their CPU and GPU usage numbers first (see the quick script at the end of this post), something sadly many new or less knowledgeable PC gamers do not do before buying parts they "think" they need. If you see CPU usage under ~85% (per thread and as a whole) with GPU usage over ~95%, getting a new CPU will likely not change much in your gaming experience. If instead you see near 100% usage on a single thread, or worse, on your CPU as a whole, then it really is time to talk about a CPU upgrade.

Of course, much of this only applies to 60-80Hz gamers. Once you start clearing the 90FPS mark into high-refresh-rate gaming, your CPU and GPU get on more even footing, a place where both need to be strong to achieve the highest frame rates possible. At the end of the day, though, there are a lot of folks out there in the same boat as me, gaming on older CPU platforms but wanting to upgrade their GPUs and displays. Having gamed on PC for 20+ years, I was fairly certain how my GPU purchase would turn out. I do, however, remember a time when such gaming knowledge was new to me and gray hair was something only "old" people had. I hope my latest upgrade experience can help others make informed decisions about when to purchase new hardware, and when not to ; )
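And here's that script. A rough sketch of the usage check described above, assuming Python with psutil installed and an NVIDIA card with drivers (it shells out to nvidia-smi, which ships with them). Run it in the background while you game and watch the tags:

```python
# Bottleneck sanity check: sample per-thread CPU usage alongside GPU usage
# while a game runs, and flag which side is the likely limit using the
# rule of thumb from the post (<85% CPU per thread + >95% GPU = GPU-bound).
import subprocess
import time

import psutil

def gpu_utilization():
    """Ask nvidia-smi for current GPU utilization as a percentage."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

for _ in range(30):  # ~60 seconds of samples at 2s each
    per_core = psutil.cpu_percent(interval=2, percpu=True)
    gpu = gpu_utilization()
    busiest = max(per_core)
    overall = sum(per_core) / len(per_core)
    tag = ""
    if gpu > 95 and busiest < 85:
        tag = "  <- GPU-bound: a new CPU likely won't help much"
    elif busiest > 95 or overall > 95:
        tag = "  <- CPU-bound: time to talk upgrade"
    print(f"CPU avg {overall:5.1f}%  busiest thread {busiest:5.1f}%  GPU {gpu:3d}%{tag}")
```

Nothing fancy, and the 85%/95% cutoffs are just my rule of thumb, but it beats guessing before you spend money.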