How Much PC Do You Need to Run Minecraft RTX?

"That's nearly double the price ($1,150) for far less than double the performance. "

$1150 is less than 60% more than the $730 of the first build. That's not really close to "nearly double the price" in comparative terms (i.e. 100% more).

Longevity-wise, I think the Ryzen 3600 will stay useful for more years than the previous-gen Ryzen 2600. Spend roughly $115 more to swap the 1080p build's CPU, motherboard, and memory for the ones in the 1440p build, and you have a decent budget PC for ~$845. (Of course, some may not consider $845 to really be a "budget" PC, so we'll just call it a decent 1080p build.)
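For reference, here's the arithmetic behind those figures as a quick sketch; the $730, $1,150, and ~$115 amounts are just the numbers quoted in this thread, not confirmed prices:

```python
# Quick sanity check of the math above, using the prices quoted in this thread.
build_1080p = 730          # article's 1080p build
build_1440p = 1150         # article's 1440p build
cpu_board_ram_delta = 115  # approximate cost of swapping in the 1440p CPU, board, and RAM

increase = (build_1440p - build_1080p) / build_1080p
print(f"1440p build is {increase:.0%} more expensive")  # ~58%, not ~100% ("double")
print(f"1080p build + 1440p CPU/board/RAM: ~${build_1080p + cpu_board_ram_delta}")  # ~$845
```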
 
"That's nearly double the price ($1,150) for far less than double the performance. "

$1150 is less than 60% more than the $730 of the first build. That's not really close to "nearly double the price" in comparative terms (i.e. 100% more).

Longevity-wise, I think the Ryzen 3600 will stay useful for more years than the previous-gen Ryzen 2600. Spend roughly $115 more to swap the 1080p build's CPU, motherboard, and memory for the ones in the 1440p build, and you have a decent budget PC for ~$845. (Of course, some may not consider $845 to really be a "budget" PC, so we'll just call it a decent 1080p build.)
I think I wrote the "nearly double" before I bumped the RAM up to 16GB, storage to 500GB, and used a slightly cheaper CPU. At one point it was $620 vs. $1150, but then I decided I couldn't recommend the $620 build in good conscience, because there were way too many caveats. I'll go tweak the double bit now...
 
Looking forward to a similar article when Crytek does an RTX version of Crysis.
It's supposedly already in the works, though I don't know that RTX ray tracing is confirmed. I saw something about it being "API agnostic" -- which would mean it doesn't use the hardware in RTX cards, or AMD's upcoming Big Navi, which sort of defeats the point IMO. But I guess we'll see.
 

spongiemaster

It's supposedly already in the works, though I don't know that RTX ray tracing is confirmed. I saw something about it being "API agnostic" -- which would mean it doesn't use the hardware in RTX cards, or AMD's upcoming Big Navi, which sort of defeats the point IMO. But I guess we'll see.

https://www.theverge.com/2020/4/16/21223384/crysis-remastered-nintendo-switch-leak-website

“Crysis Remastered will focus on the original game’s single-player campaigns and is slated to contain high-quality textures and improved art assets, an HD texture pack, temporal anti-aliasing, SSDO, SVOGI, state-of-the-art depth fields, new light settings, motion blur, and parallax occlusion mapping, particle effects will also be added where applicable,” reads the blog post. “Further additions such as volumetric fog and shafts of light, software-based ray tracing, and screen space reflections provide the game with a major visual upgrade.”



Based on how much effort is going into reworking the graphics, it does seem like an odd choice to not use DXR.
 
...and we've been recommending them since they first launched in 2018.
Or, you know, well before they launched. >_>

Of course, some may not consider $845 to really be a "budget" PC, so we'll just call it a decent 1080p build.
Actually, it could be considered a relatively decent "1440p build" in most other games, even without those additions, as performance will tend to be more GPU-limited at that resolution. It's probably worth adding in at least the RAM though, seeing as it only saves about $5 dropping down to DDR4-2666. $55 more for a Ryzen 3600 could be considered reasonable, but if you start adding another $55 for an X570 board, you could have instead moved up to a 2060 SUPER or 2070, if performance in Minecraft RTX is the primary concern.

As another option, one could instead drop the CPU down to a Ryzen 1600 AF for almost no performance loss, and that would bring the total cost of the build, even with the faster RAM, down to around $700.
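To make that comparison concrete, here's a back-of-the-envelope sketch of the three routes being discussed. The dollar deltas (the ~$5 RAM difference, ~$55 CPU and board premiums, a ~$110 GPU step up, and a ~$35 discount inferred from the ~$700 1600 AF figure) are just the rough numbers tossed around in this thread, not quoted prices:

```python
# Back-of-the-envelope totals for the options discussed above, using the rough
# prices mentioned in this thread (street prices will vary).
base_build = 730            # article's 1080p build
faster_ram = 5              # DDR4-3200 reportedly costs only ~$5 more than DDR4-2666
ryzen_3600_premium = 55     # over the Ryzen 2600
x570_premium = 55           # over a B450 board
gpu_step_up = 110           # roughly what a 2060 SUPER / 2070 adds over the 2060
ryzen_1600af_discount = 35  # inferred from the ~$700 total mentioned above

options = {
    "Ryzen 3600 + X570 + faster RAM": base_build + faster_ram + ryzen_3600_premium + x570_premium,
    "Keep CPU/board, step up the GPU": base_build + faster_ram + gpu_step_up,
    "Ryzen 1600 AF + faster RAM": base_build + faster_ram - ryzen_1600af_discount,
}
for name, total in options.items():
    print(f"{name}: ~${total}")
```

The first two options land at roughly the same ~$845 total; the question is whether the extra money goes to the platform or the graphics card.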
 
Or, you know, well before they launched. >_>
No, that's patently incorrect. The first RTX GPUs absolutely launched in 2018. To be specific, the RTX 2080 Ti launched on September 27, 2018, the RTX 2080 on September 20, 2018, and the RTX 2070 FE on October 17, 2018. The RTX 2060 FE followed on January 15, 2019, and the various RTX 20xx Super cards in July 2019 (July 9 for the 2060S and 2070S, July 23 for the 2080S). Public reviews, I think, were allowed about a week ahead of the retail launch dates -- the RTX 2070/2060 Super reviews certainly went up on July 2, 2019.

Actually, it could be considered a relatively decent "1440p build" in most other games, even without those additions, as performance will tend to be more GPU-limited at that resolution. It's probably worth adding in at least the RAM though, seeing as it only saves about $5 dropping down to DDR4-2666. $55 more for a Ryzen 3600 could be considered reasonable, but if you start adding another $55 for an X570 board, you could have instead moved up to a 2060 SUPER or 2070, if performance in Minecraft RTX is the primary concern.

As another option, one could instead drop the CPU down to a Ryzen 1600 AF for almost no performance loss, and that would bring the total cost of the build, even with the faster RAM, down to around $700.
Long-term, though, I'd rather have an X570 board than X470, and my feeling is you either go 2nd gen Ryzen with a cheaper B450 board, or go X570 with 3rd gen Ryzen. The 1600 AF is fine, though the 2600 does clock a bit higher. Either will work (along with lots of other CPUs), and with a modest GPU (2070 or lower / RX 5700 or lower) you're not likely to end up CPU limited in games. But the move from Zen+ to Zen 2 -- the 7nm node, higher clocks, and PCIe Gen4 -- is worth having. If you can justify a $500 GPU like a 2070 Super, I don't think there's any reason to stick with anything less than Ryzen 3000 and X570. They're not required, but then neither is a 2070 Super.
 

TJ Hooker

No, that's patently incorrect. The first RTX GPUs absolutely launched in 2018. To be specific, the RTX 2080 Ti launched on September 27, 2018, the RTX 2080 on September 20, 2018, and the RTX 2070 FE on October 17, 2018. The RTX 2060 FE followed on January 15, 2019, and the various RTX 20xx Super cards in July 2019 (July 9 for the 2060S and 2070S, July 23 for the 2080S). Public reviews, I think, were allowed about a week ahead of the retail launch dates -- the RTX 2070/2060 Super reviews certainly went up on July 2, 2019.
I think cryoburner may have been referring to the infamous "just buy it" article, which was published before any of the RTX cards had launched but recommended people buy them.
 
I think cryoburner may have been referring to the infamous "just buy it" article, which was published before any of the RTX cards had launched but recommended people buy them.
Ah, that was before my time at Tom's Hardware. (Thankfully. LOL) It's still surprising to me that 2080 Ti prices have stayed consistently above $1050. EVGA has periodically had the 2080 Ti Black selling at $999, but only in extremely limited quantities. It does make me wonder about what will happen with RTX 3080 Ti prices. I really hope they don't go up, and $999 as a maximum for non-Titan cards would be great to see. But realistically, I don't see Nvidia being that beneficent.
 
I think cryoburner may have been referring to the infamous "just buy it" article, which was published before any of the RTX cards had launched but recommended people buy them.
Yeah. I just had to add that since it was technically more accurate than the "since they first launched" part. The 2018 part was fine. : D

If you can justify a $500 GPU like a 2070 Super, I don't think there's any reason to stick with anything less than Ryzen 3000 and X570. They're not required, but then neither is a 2070 Super.
I was responding to 2Be_or_Not2Be's idea of adding the CPU and motherboard from the $1150 build to the $730 build, to get something that's better on the CPU side of things for around $845. My point was that at least for Minecraft RTX, it would be more beneficial to performance to put that extra $100+ toward a graphics card around the $400 range instead, like a 2060 SUPER or 2070 (non-SUPER).

I agree that one should probably be thinking about spending more than $100 or so on a CPU for a more well-rounded system once they start spending several hundred dollars on graphics hardware, but at least for this game, putting the money toward the graphics card would make more of a difference with the "budget" build. Overall, the builds in the article seem fine, apart from it probably not being worth $5 to cut back so much on the RAM.
 
Yeah. I just had to add that since it was technically more accurate than the "since they first launched" part. The 2018 part was fine. : D


I was responding to 2Be_or_Not2Be's idea of adding the CPU and motherboard from the $1150 build to the $730 build, to get something that's better on the CPU side of things for around $845. My point was that at least for Minecraft RTX, it would be more beneficial to performance to put that extra $100+ toward a graphics card around the $400 range instead, like a 2060 SUPER or 2070 (non-SUPER).

I agree that one should probably be thinking about spending more than $100 or so on a CPU for a more well-rounded system once they start spending several hundred dollars on graphics hardware, but at least for this game, putting the money toward the graphics card would make more of a difference with the "budget" build. Overall, the builds in the article seem fine, apart from it probably not being worth $5 to cut back so much on the RAM.
I assume you're referring to the DDR4-2666 vs. DDR4-3200 2x8GB configuration?

The reality is that RAM speed actually isn't that critical, even with Ryzen, unless you get low-latency stuff. Your BIOS (and the memory timings it applies) will be way more important for actual performance than the rated memory speed, and if you know enough to care about tuning memory timings, you're probably better off just doing that on your own and "overclocking" your RAM. There's a lot of hype about how much Ryzen likes fast RAM. In practice, it really likes tuned RAM, not just high bandwidth. DDR4-2666 with tight timings will often outperform DDR4-3200 with relaxed timings in my experience.

And of course, all else being equal (same timings), DDR4-3200 vs. DDR4-2666 usually means only about a 1-2% difference in overall CPU performance (not counting memory bandwidth tests). For gaming with a modest GPU, it will be even less. But yeah, you can upgrade the RAM to DDR4-3200 if desired. There are just so many other places where it's "only $5-$10 more," and the total was already higher than I originally hoped (mostly because of the $300 GPU).
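As a rough illustration of the "tuned vs. fast" point, here's a quick first-word latency sketch (CAS latency divided by the transfer rate); the specific kits and timings below are hypothetical examples, not the ones in the article's builds:

```python
# Rough first-word latency comparison to illustrate the "tuned vs. fast" point.
# First-word latency (ns) ~= CAS latency * 2000 / data rate (MT/s).
kits = {
    "DDR4-2666 CL14 (tight)": (2666, 14),
    "DDR4-2666 CL19 (loose)": (2666, 19),
    "DDR4-3200 CL16 (typical)": (3200, 16),
    "DDR4-3200 CL22 (loose)": (3200, 22),
}
for name, (rate, cl) in kits.items():
    latency_ns = cl * 2000 / rate
    print(f"{name}: {latency_ns:.1f} ns")
# A tight DDR4-2666 kit (~10.5 ns) gets to the first word sooner than a loose
# DDR4-3200 kit (~13.8 ns), which is why timings matter as much as raw speed.
```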
 
