Crysis 3 Performance, Benchmarked On 16 Graphics Cards

Status
Not open for further replies.

Gulli

Distinguished
Sep 26, 2008
1,495
0
19,310
Just a couple of FYIs for anyone reading this:

1) The first level of the game (in the rain) runs at a lower FPS than all the other levels; this is apparently a bug related to the rendering of ropes. So if your card only puts out 25 FPS here, it may still get above 30 FPS in the other levels.

2) Don't use anything higher than 2x SMAA, and turn "game effects" and "water" down to "high"; you'll get noticeably higher FPS without a noticeable drop in image quality. Except for the first level, I get over 30 FPS all the time with an HD 7870 and my old, non-overclocked i7 920.

3) New graphics card drivers will make a difference as they did with the previous Crysis games.
 

danielmunhato

Distinguished
Oct 21, 2011
16
0
18,510
Why oh why are companies making new games so extremely CPU dependent? I know it's all console ports, but if this continues, a year from now we'll need a 16-core CPU to play crappy games. All the GPUs in this article are bottlenecked.
 

azraa

Honorable
Jul 3, 2012
323
0
10,790
[citation][nom]will1220[/nom]Why would you include the top of the line amd, middle of the line intel (ivy bridge i5) and not the top of the line ivy bridge i7 3770k?????????[/citation]
They were testing at stock clocks.
There is no difference between a 3570K and a 3550 at stock clocks; they are essentially the same. The same holds true for the 3770K.
 

gsxrme

Distinguished
Mar 31, 2009
253
0
18,780
What the hell Tomshardware!

I want to see a 2600K or 2700K @ 4.8-5 GHz in that CPU benchmark. I sure as hell know it's easy to get to 4.8 GHz, and I know it'll smash that hex-core junk at lower clock rates. Could be why you chose not to post the 1155 i7! Ehmm......

I've been clocked @ 5.1 GHz at 1.5 V for, what, over a year now? Thank you, Koolance and DangerDen. I've also built plenty of 4.8 GHz 2600K systems for my friends on H80 coolers, running 24/7 @ 1.42 V.

Hex-cores won't be what they should be until we get Ivy Bridge-E.

Other than that, good write-up.
 

mohit9206

Distinguished
Don't spend tons of money building a PC to play this game at max settings. All this game has going for it is great graphics; everything else is poor. The new Tomb Raider game has graphics that are just as good, plus a good story and great gameplay.
 

SuperVeloce

Distinguished
Aug 20, 2011
154
0
18,690
[citation][nom]azraa[/nom]They were testing at stock clocks.There is no difference between an 3570k and a 3550 at stock clocks, they are essentially the same. Same thing stands true with the 3770k.[/citation]
Nope, the Ivy Bridge i7 is different from the i5 at stock: the i5 has no HT. It could fall between the Ivy i5 and Sandy Bridge-E, or it could even be on par with Sandy Bridge-E...
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
[citation][nom]mouse24[/nom]Some people don't understand that peoples opinions/gameplay/genres are different.[/citation]

My post had nothing to do with people's preferences. It had to do with wasting money on games and hardware that aren't much better than those of a few years ago. We're simply not making the advances we were years ago, and the simple fact is that the majority of people playing PC games are doing it on five-year-old hardware.
 

badtaylorx

Distinguished
Apr 7, 2011
827
0
19,060
[citation][nom]JonnyDough[/nom]Just like Crysis 1 and 2, I still don't care. I play TF2, Skyrim, and any other game that isn't brand new because I refuse to pay $60 for a game that I can't return to the store and I don't have time to play all the titles out there I want to anyway. Anyone who has to jump on the latest and greatest bandwagon doesn't understand what "good gameplay" is.[/citation]

guess your name should have been "Jonny(aint got no)Dough"
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
[citation][nom]badtaylorx[/nom]guess your name should have been "Jonny(aint got no)Dough"[/citation]

Actually, I own about four working gaming PCs, and three are CrossFired. My latest is an Intel Core i5. I'm selling off the old ones; I just enjoy building PCs. In truth, I'm quite frugal. That is half of how you GET money. Talk about what you know, bro.
 

omnimodis78

Distinguished
Oct 7, 2008
886
0
19,010
I've only played the game for a short time, but I agree that it feels artificially taxing on my system. Yeah, it looks good, damn good, but it's not amazingly revolutionary, and I suspect some unpolished code may be more responsible for the performance hit.
 

wcg66

Honorable
Mar 5, 2013
2
0
10,510
It's worth playing around with the settings in Crysis 3, especially the AA settings. You can lower the AA and it still looks gorgeous, and you get way better frame rates. Note these benchmarks are not at maxed-out settings; even a GTX 680 will struggle at max settings. I have a 3930K @ 4.5 GHz and two GTX 670s in SLI, and I can't get 60 fps at absolute max settings. However, turning down the AA makes a huge difference: I can easily play at 60 fps, locked in via Precision X, without maxing out the GPUs.
 

wcg66

Honorable
Mar 5, 2013
2
0
10,510
[citation][nom]gsxrme[/nom] I sure as hell know its easy to get to 4.8Ghz and I know it'll smash that hex core junk at lower clock rates. Could be why you choose not to post the the 1155 i7!!!! ehmm......I'm only clocked @ 5.1Ghz 1.5v for what? over a year now.[/citation]

 

corvak

Honorable
Jan 31, 2013
58
0
10,630
The relevance of this test isn't really related to the gameplay.

Testing an array of cards against the most demanding game on the market gives an idea of what sort of performance can be expected in most games with those cards. When Battlefield 3 came out, they tested 30 cards on it. It is similar to a typical GPU benchmark, but testing on new games shows how drivers improve the performance of older cards. After new drivers come out, these numbers can also serve as a comparison to see what sort of improvements have occurred.

The results give an idea of what those cards are "worth" in terms of performance for the price. A good example is showing that going from a GTX 670 to a GTX 680 isn't quite worth the extra $100.
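That kind of cost-per-frame comparison is easy to sketch. The prices and frame rates below are hypothetical placeholders, not figures from the article:

```python
# Rough dollars-per-FPS comparison. Prices and average FPS here are
# illustrative assumptions, not benchmark results.
cards = {
    "GTX 670": {"price_usd": 400, "avg_fps": 55},
    "GTX 680": {"price_usd": 500, "avg_fps": 60},
}

for name, c in cards.items():
    dollars_per_fps = c["price_usd"] / c["avg_fps"]
    print(f"{name}: ${dollars_per_fps:.2f} per average FPS")
```

With numbers like these, the cheaper card delivers each frame for less money, which is the sense in which the upgrade "isn't quite worth it."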
 

DryCreamer

Distinguished
Jan 18, 2012
464
0
18,810
[citation][nom]pauldh[/nom]Ah, be very cautious drawing definitive lines in the sand! There is no question the Pentium bombed here, however that doesn't mean it bombs equally on another system after those other factors all start to change.[/citation]

I saw VERY similar numbers to yours with my i3-3220 on driver 314.07, especially for the minimum frame rates. In fact, my GTX 560 Ti and GTX 670 posted almost the same average and minimum FPS in my runs with the i3, because it couldn't keep up with the faster Kepler card.

Could it be that the driver is pulling way too much overhead? I don't know, but I was definitely surprised at how much Crysis 3 loves threads: the more, the better! It was kind of nice in Crysis 2 because, as long as you had at least a $100 CPU, the bottleneck didn't seem nearly as severe.

I do know the game has only been out two weeks, but it was an eye-opener for me.

Dry
 

DryCreamer

Distinguished
Jan 18, 2012
464
0
18,810
[citation][nom]atminside[/nom]I enjoyed reading this, but why didn't you include the Radeon HD6850 and HD5850?[/citation]
makes me wish I hadn't sold my 5850 :/

Dry
 

doogansquest

Distinguished
Jul 29, 2010
86
0
18,630
Crysis has never been a good game. It's a glorified benchmark. People read the articles, and/or ask about hardware in reference to Crysis when none of the quality titles out there require the same hardware. Crysis is, at best, tier 3 in the FPS genre.

You don't need a GTX 680 or HD7970 to play modern games on high settings. Am I the only one who finds 3 monitors annoying too? I hate the breaks in the screen. I just want one large monitor.

Anyway...
 
[citation][nom]jack1982[/nom]Interesting to see the HD 7950 and GTX 660 Ti being generally very close in frame time variance. I was looking at the "Radeon HD 7950 vs. GeForce GTX 660 Ti revisited" article from The Tech Report and they showed the 7950 having massive variance compared to the 660 Ti in every game they tested. Did AMD fix this somehow in the last 3 months?[/citation]
No, Tom's just doesn't test it the same way, and they get wildly varying results. In one part the 650 Ti does better than even the GTX 670. Either that, or they did the tests right and then pulled the data out of a hat, threw it around the room, and tried to match it up however they thought right...
 
After all the AMD driver/microstutter nonsense, it's Nvidia SLI and Titan dominating frame variance, lol.

7970s win on performance AND frame latency? And price? And free games? All the Nvidia fanboys must be going mad.
 

mlopinto2k1

Distinguished
Apr 25, 2006
1,433
0
19,280
Just threw in my original Crysis CD for an install. Played and beat Crysis 2; loved it except for the world limitation. Can't wait until I can afford Crysis 3... and then Far Cry 3. Yay.
 
[citation][nom]jack1982[/nom]Interesting to see the HD 7950 and GTX 660 Ti being generally very close in frame time variance. I was looking at the "Radeon HD 7950 vs. GeForce GTX 660 Ti revisited" article from The Tech Report and they showed the 7950 having massive variance compared to the 660 Ti in every game they tested. Did AMD fix this somehow in the last 3 months?[/citation]

Shortly after that Tech Report article came out and other sites started looking at frame variance/latency, AMD addressed it with new drivers. Tech Report now shows the 660 Ti and 7950 on par for frame variance with the new drivers.

AMD has an article on it, but basically the issue was brought to their attention, they immediately addressed it, and they said they would continue to do so for future games. Clearly they have, or Nvidia, including Titan, would not be the one dominating the horrible end of the frame-variance scale.
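For anyone wondering what "frame variance" actually measures: given a log of per-frame render times in milliseconds (from a tool like FRAPS), reviewers typically look at high-percentile frame times and frame-to-frame jumps rather than the average. A minimal sketch, using a made-up sample log:

```python
# Toy frame-time log in milliseconds; real logs come from capture tools.
frame_times = [16.7, 16.9, 17.1, 16.8, 33.5, 16.6, 17.0, 16.9, 16.7, 16.8]

def percentile(data, pct):
    """Nearest-rank percentile, no external dependencies."""
    ordered = sorted(data)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]

# Average FPS hides the stutter; the 99th-percentile frame time and the
# biggest frame-to-frame jump expose it.
avg_fps = 1000 * len(frame_times) / sum(frame_times)
p99 = percentile(frame_times, 99)
deltas = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]

print(f"avg FPS: {avg_fps:.1f}")
print(f"99th percentile frame time: {p99} ms")
print(f"worst frame-to-frame jump: {max(deltas):.1f} ms")
```

The single 33.5 ms frame barely moves the average, but it dominates the percentile and delta metrics, which is exactly why those numbers flagged the 7950's old drivers.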
 

The first Crysis was actually a pretty good game. The second, not so much. The third sounds like a polished version of the second game, at best. Meh.


Well, you can't depend entirely on the reported numbers for SLI and Crossfire here. Single-card numbers should be accurate or at least representative (so 17 ms may not be precise, but it's still going to be about half as bad as 34 ms).
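The halving logic above is just the reciprocal relationship between frame time and frame rate, sketched here for clarity:

```python
def fps_to_ms(fps):
    """Average frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

def ms_to_fps(ms):
    """Frame rate implied by an average frame time in milliseconds."""
    return 1000.0 / ms

# 60 FPS corresponds to roughly 16.7 ms per frame,
# and a 34 ms frame time to roughly 29.4 FPS.
print(fps_to_ms(60))
print(ms_to_fps(34))
```

Halving the frame time (34 ms down to 17 ms) exactly doubles the frame rate, which is why an imprecise 17 ms reading is still "about half as bad" as 34 ms.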
 

davemaster84

Distinguished
Jun 15, 2011
464
0
18,810
Crytek rocks, man. They really accomplished what they promised: we'll no longer be bored by every game getting kicked around by our high-end PCs. Now we'll have to figure out a way to upgrade so the game actually runs okay, hahaha.
 