Can Nvidia gtx 570 run Crysis 3 smoothly on High Graphics?

Status
Not open for further replies.

Iamlecter

Honorable
Mar 17, 2013
Hey! I am thinking about buying a GeForce GTX 570 for my desktop, as it is well within my budget. But before buying, I want to be sure the GTX 570 can play Crysis 3 smoothly.
I have an Intel Core i3 3.3GHz 2nd-gen processor with 4 GB of RAM. Previously I ran Crysis 2 on my PC: it played fine on GAMER settings, was kind of OK on ADVANCED, and terrible on HARDCORE. Crysis 3 looks awesome and I don't want to compromise on anything. If the GTX 570 can't play Crysis 3 on high graphics, can you suggest another graphics card around $500 for my PC? Please reply fast, I can't wait any longer to play it!
 
Solution
Yeah, and my Phenom II 965 BE was bottlenecking a GTX 480, so I had to upgrade, and that chip is on par with the i3. You don't get it: a 7970 would have a large chunk of its FPS chewed up running on that CPU. He could get a GTX 570 and get the same performance as a 7970 on that rig. The 7970 is wasted money any way you look at it.
Now that I think of it, I'm pretty sure I had the 140W 965 BE; have fun trying to hit 4GHz with one of those. The newer revisions did in fact make it easier to get slightly higher numbers.
 


Both these pro reviews and everyone in the threads were able to hit 3.9GHz or higher on a 965. I haven't seen a forum where the majority were struggling with that. I apologize if it seems like I'm arguing. I feel a bit bad for you, since it looks like you got a bad chip.

http://www.overclockersclub.com/reviews/phenomii_965/3.htm
http://www.extremeoverclocking.com/reviews/processors/AMD_Phenom_II_X4_965_7.html
http://www.techpowerup.com/forums/showthread.php?t=158262&page=2
http://www.tomshardware.com/forum/260881-29-overclocking-phenom-black-edition


Not sure where you're getting your info on the 480 vs. the 7850. At launch, the 480 was a bit faster at stock speeds, but the 7850 won when both were overclocked. They're in the same tier in Tom's card rankings, so the differences between them are tiny.

Some folks hit a stable 50% core overclock on the 7850 (1300MHz). According to the OP there, it's close to his 580 when OC'd to only 1050MHz. http://forums.overclockers.co.uk/showthread.php?t=18389760

After that, AMD's driver releases gave the 7000 series bigger performance gains than Nvidia's new drivers gave their respective cards. With Catalyst 12.1 and later, the 7850 is a bit faster than the 480 even at stock speeds.

I'm just saying there was something funky going on in your build. Dunno what it was, but your experience was outside the norm. Sorry you had to ditch your rig.
 


Based on this, he is clearly wrong; otherwise, how is the i7 so much higher than the i5?
Crysis3-CPU.png
 


Clearly you don't comprehend what you posted. The i7 in that test has 6 physical cores. Hyperthreading gives that chip TWELVE logical cores.

That chart reinforces my point; thank you for posting it and saving me the effort. Since 6 cores of the 8350 are used, it performs better than the 4-core i5, but the 6-physical-core i7 comes out on top.

I rest my case.
 
I'm not so sure. Looking at this:
CPU_03.png


Which does include 4-core i7s with HT. It may or may not be helping if you look only at the i7-3770K, but look at that i7 920: if HT doesn't work, how is the i7 920 doing so well at a stock clock of 2.66GHz? (This is good news for me, as I still use an i7 920, though overclocked.)
 


You got it wrong: the 8350 is not performing better than the 4-core i5.



 
Unfortunately there haven't been any satisfactory CPU benchmarks done by a reliable site yet... Seriously, get with the program - the gamers that will max this game will have an i5-3570K or an i7-3770K clocked as high as they can get it. For my part, I'd really like to see the difference in minimum and average framerate (frametimes would be best) between these two CPUs at 4.5GHz. I'm sure there are tons of gamers out there in my position, who chose the bang-for-buck i3, planning an upgrade to an unlocked i5 or i7 when it became necessary. Well with this game, and the definite possibility of future games becoming more and more multithreaded, it looks like the time is at hand. (The threat of cross-platform games developed for octo-core consoles with 8GB VRAM kind of scares the dust out of my rig.)

Both the CPU benches shown in this thread have a locked i5 compared to either a 6-core or higher-clocked i7. The difference shown in the TechSpot benches could really be explained by the slightly higher clockspeed and slightly more L3 cache. I have seen unofficial benchmarks showing the i7-3770K benefiting from using HT vs. turning it off (effectively turning it into an i5), but how reliable are they?
 


Lol, OK. If you want to nitpick, the Core i5 has a better minimum framerate, but the 8350 posted a slightly better average framerate. Tom's Hardware ordered the chart by average framerate instead of minimum; conclude what you will from that.

The fact that the 8350 is even matching a Core i5 in a game when both are at stock speeds reinforces my point as well.
 


It does show the game likes more threads/cores. That doesn't mean it doesn't like HT: the other chart I showed has the 4-core i7s higher, and the i7 920 at 2.66GHz surprisingly high.

I wish more sites reported the minimums; the i5 clearly would give a better experience based on those numbers. The average difference is less than 1 FPS, but the minimums are much further apart. Consistent is better than erratic.
 
Here's a better chart for examining hyperthreading. There's only a slight difference here between the 2500K (4 cores, no HT) and the 2600K (4 cores, HT), which could be accounted for by the 100MHz difference between them (you could also throw in the larger cache, but the point is that hyperthreading has an insignificant impact, if any, on the 2600K's performance). Once again, the i7s with 6 physical cores are way out in the lead, with the 8350 coming in second.

0nIkCAb.jpg
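Since the 2500K and 2600K differ in clock speed as well as HT, one quick sanity check for claims like this is to normalize average FPS by clock. Here's a minimal Python sketch; the FPS and clock figures are hypothetical placeholders for illustration, not taken from any chart in this thread:

```python
# Rough clock-for-clock metric: average FPS per GHz of clock speed.
def per_ghz(avg_fps, clock_ghz):
    """Divide average FPS by clock so chips at different clocks are comparable."""
    return avg_fps / clock_ghz

# Assumed example figures: 2500K (no HT) at 3.3 GHz, 2600K (HT) at 3.4 GHz.
fps_2500k, clock_2500k = 50.0, 3.3
fps_2600k, clock_2600k = 51.5, 3.4

print(round(per_ghz(fps_2500k, clock_2500k), 2))  # 15.15
print(round(per_ghz(fps_2600k, clock_2600k), 2))  # 15.15
```

If the per-GHz numbers come out essentially identical, the small gap between the two chips is explained by clock speed alone and hyperthreading gets no credit for it.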
 


Apologies, I misspoke. Crytek declined to optimize CryEngine 3 for hyperthreading, and the benefit of the logical cores is dramatically smaller in Crysis 3 than in other games.
 


The 3770K is clocked 300MHz higher than the 3470. Hyperthreading has an insignificant impact, if any, on its performance as well.
 


This is the kind of dude that should not and must not give out GPU/CPU advice.

Be careful when you use adjectives, as people more knowledgeable than you exist around here. Like the Bystander dude said, a 10 FPS difference in minimums matters more than a 0.3 FPS difference on average.

"Outperform" is a big word, you know. Not the kind you'd use to describe a 0.3 difference.

I'm not even nitpicking; it's just that you are blatantly misinforming.
 


In the chart I posted, the 8350 has better minimums than the 2500K as well as a better average. While I do agree that minimums are more important, the assertion that the 8350 "outperforms" the i5 is still valid. While the 3570K very likely has better minimums than the 2500K, I highly doubt it's the 10 FPS worth it would need to catch the 8350 in that particular benchmark. The 8350 has a better average in nearly all Crysis 3 benchmarks. So, yes, I still call your correction nitpicking.

The point I am making is that HT shows little benefit in Crysis 3; whether the i5 or the 8350 is better is moot.
 



Glad you're not misinforming anymore.

But wait.

Disregarding your second chart and looking only at the THG scores, an i5-3570 would pass the 8350 (it's a shame your second graph doesn't include Ivy Bridge chips there, and I don't know if Turbo is enabled either), since an i5-3550 is already >= the 8350.

i5-3550 >= 8350
i5-3570 > 8350

And wait until you get to clock-for-clock performance.
 
I don't see the point at which I was misinforming; that post reiterates what I already said, just with more evidence.

Also, I can't be the only one who noticed the elephant in the room with minimum-framerate benchmarking. I should clarify my earlier stance: minimum framerate can be more important to the gameplay experience, but only when the minimum framerates happen with some frequency. A large one-time dip to the minimum affects our gameplay experience quite differently from repeated drops to or near the minimum, but given the same minimum, they will show up identically in the charts.
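To illustrate, here's a minimal Python sketch with two made-up FPS traces that share the same minimum (30 FPS) but differ wildly in how often they drop there:

```python
# Two hypothetical one-minute FPS traces (one sample per second).
one_dip   = [60] * 59 + [30]   # a single dip to 30 FPS
many_dips = [60, 30] * 30      # drops to 30 FPS every other second

def summary(trace):
    """Return (minimum FPS, average FPS, % of samples below 45 FPS)."""
    avg = sum(trace) / len(trace)
    pct_low = round(100 * sum(1 for f in trace if f < 45) / len(trace))
    return min(trace), round(avg, 1), pct_low

print(summary(one_dip))    # (30, 59.5, 2)
print(summary(many_dips))  # (30, 45.0, 50)
```

Both traces bottom out at 30, so a minimum-only chart shows them as identical, while the fraction of time spent below 45 FPS tells the real story about consistency.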

I'm also sad the chart didn't include Ivy Bridge, but drawing a conclusion while disregarding evidence that doesn't agree with your predisposition is not a strong argument.
 
It does provide some improvement at least. Apparently the game uses 6 threads at most, whether that's 6 physical cores or 4 cores with hyperthreading. And under normal circumstances, hyperthreading is always going to yield a low percentage gain, as the second logical core is sharing execution resources with the core's primary thread.

Under normal conditions, hyperthreading only gives about a 10% performance boost in gaming anyway, so it's not really that bad, though not necessary either.
 