Recommended CPU for RTX 2080 (NON ti)

blazs91
Hi everyone,

I'm buying the new Nvidia RTX 2080 card tomorrow. I have a five-year-old laptop, so an upgrade is really opportune right now. However, I'm a little concerned about the CPU. Initially, I was eyeing the i5-8600, but after some more thought, I'm now considering a more powerful one. The i7-8700 (NON-K) seems appealing, although for that money I could grab a Ryzen 2700X. I'm going to use my rig for 3D modeling (Maya, ZBrush), texturing (Substance Painter), rendering (Keyshot, Unreal Engine, Marmoset) and occasionally for gaming at 4K resolution.
So, I'm just interested in your opinion on whether that AMD or Intel i7 would be a good fit for the RTX 2080 (NON Ti) or not. I couldn't find any useful info on the internet (I guess the card is too new), so every bit of info is highly appreciated! :)

Thank you in advance! :)
 
Solution

Gaming at 4K, neither would be a bottleneck. Gaming at 1080p, the i7-8700 would be less of a bottleneck.

If you're gaming at 4K, you won't notice a difference between an i7-8700K and a Ryzen 2700X. The reason is that so much load is placed on the graphics card at 4K that the graphics card becomes the natural bottleneck. However, if you were playing on something crazy like a 240 Hz monitor at 1080p resolution, the Intel i7-8700K would give you higher FPS due to its single-core performance and would be advisable in that instance.
 
Hi SgtScream,

Thank you for your quick response. What you wrote is totally true, but my main question is whether either of those CPUs (i7-8700 or Ryzen 2700X) would be a bottleneck for the RTX 2080 or not. :)
 


Thank you! :)
 
https://www.techradar.com/amp/news/benchmarks-confirm-nvidia-geforce-rtx-2080-ti-and-rtx-2080-are-for-top-end-gaming-pcs-only

This article says: "In many cases, the GeForce RTX 2080 is so powerful that the CPU is the bottleneck, according to Nvidia. 'To prevent this from occurring, we highly recommend you conduct your testing on a 4K display or higher with HDR. We also suggest you use maximum graphics settings and high AA levels in most games. The Turing GPU architecture improves High Dynamic Range (HDR) gaming performance and input latency using hardware-based compositing, tone mapping, and chroma filtering with HDR surfaces.'"
 


Hi,
Thank you for your response. I've already read that article somewhere, but as far as I can remember, they didn't make any specific statement about which CPU. It seemed a bit of an empty sales pitch to me, meant to make people jump in immediately and upgrade their CPUs. However, I'm pretty sure a Ryzen 2700X would do its job decently. :)
 


Hi,
No, I haven't bought it yet. I was also eyeing an EVGA 1080 Ti, which was indeed $200 less. So, after seeing that, I'm still sitting on the fence between those two cards. The performance is almost the same, but the new tech of the RTX is the thing that really attracts me. I'm learning game design, and I'm doing modeling and rendering on a daily basis. If Unreal, for instance, were to adopt that technology and build it into their engine, it'd work wonders. That's still a question mark, though. However, 25 games have already been announced to ship with this new tech built in, so there's a chance.
 
All the reviews I've seen show the 1080 Ti right at equaling the 2080: it occasionally beats it and barely loses in others... pretty much a statistical match.

For $200 less, the 1080 Ti would seem quite logical, in fact... use the money saved on an M.2 drive or more RAM.