Here Are The System Requirements for Crysis 3

Crysis 3's requirements are making me laugh, especially when a 2600K is being compared to a 4150. It's too hungry right now, when a decent GPU can't max it out and a high-end GPU is expensive.
 
[citation][nom]mubin[/nom]Crysis 3's requirements are making me laugh, especially when a 2600K is being compared to a 4150. It's too hungry right now, when a decent GPU can't max it out and a high-end GPU is expensive.[/citation]

Needing incredibly high-end GPUs to max a game out at high resolutions doesn't necessarily mean that the game is "too hungry", assuming by "too hungry" you mean poorly optimized. Simply having a ton of features and settings can mean that even an extremely well-optimized game needs higher-end systems than are available today to be maxed out, even at popular resolutions.
 
I just played Crysis and Warhead for the first time, and they're total rubbish. They're not graphically impressive anymore, the FSAA is very glitchy (as are the shaders in Warhead), and they barely pull 40-50 fps on a 7870. Is it weird that I find the Ubi-made sequels to Far Cry better than any of the games Crytek has made, from both the tech and gameplay standpoints?
 
I hoped more people would realize this is just a marketing scheme to get PC gamers interested in the game again. It obviously won't need a 680 and an i7-2600K to run smoothly on ultra.
 
[citation][nom]jason moyer[/nom]I just played Crysis and Warhead for the first time, and they're total rubbish. They're not graphically impressive anymore, the FSAA is very glitchy (as are the shaders in Warhead), and they barely pull 40-50 fps on a 7870. Is it weird that I find the Ubi-made sequels to Far Cry better than any of the games Crytek has made, from both the tech and gameplay standpoints?[/citation]
Despite agreeing that Crysis 1 needed ridiculous specs in its time, I'd have to disagree that it's not graphically impressive. That game still looks better than most games out today, and it was made back in '07.
 
I believe the reason for even suggesting the 4150 (or should I say 4170?) in the first place is that it's the best avenue for gaming with Bulldozer, since its clock speed is the highest in the range. However, the replacement 4300 series doesn't share this distinction, plus its L3 cache has been cut in half, which is even more reason why the 4170 might actually perform better.
 
[citation][nom]silverblue[/nom]I believe the reason for even suggesting the 4150 (or should I say 4170?) in the first place is that it's the best avenue for gaming with Bulldozer, since its clock speed is the highest in the range. However, the replacement 4300 series doesn't share this distinction, plus its L3 cache has been cut in half, which is even more reason why the 4170 might actually perform better.[/citation]

The FX-4300 actually performs a little better than the FX-4170, just so you know. 4 MiB of L3 is plenty when it already has 2x2 MiB of L2 that is several times faster, and Piledriver's enhancements over Bulldozer nullify the frequency disadvantage. The lower power consumption is another bonus. Regardless, the requirement was probably supposed to say FX-8150, not FX-4150.
 
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300.html

It's not universal; the higher clock speed (as well as the extra cache, possibly) of the 4170 can actually get it some wins. However, as you rightly pointed out, the 4300's power consumption is lower.

The 6200 vs 6300, on the other hand, is a hands-down win for the 6300 despite the lower clocks. The 6200 could be hitting thermal limits and throttling itself, whereas the 6300 doesn't appear to have that issue.

In most cases, the Piledriver equivalents are clocked higher and easily outperform their Bulldozer brethren; however, that merely takes the entire line from "specific use case only" to "bang for the buck, if you don't mind the power and heat". Shared resources are all well and good, but they do limit how much you can power down and when. I'm very interested in Steamroller and what it might do to correct matters.
 
[citation][nom]amigafan[/nom]My config can handle this, but the problem is... the fans will get REALLY noisy! And by REALLY noisy I mean my neighbors will know when I'm playing Crysis 3 (or will wonder why I'm "vacuuming" so often)[/citation]

Go get a GTX 670. It's virtually silent compared to my old superclocked GTX 570. The 570 used to warm the room by 5 degrees after a few hours of Crysis 2, and the 670 doesn't heat up the room a bit.
 
[citation][nom]anony2004[/nom]After letting Crysis 2 run on modest hardware, have they really lost it?[/citation]

I think they got fragged by the hardcore FPS masses precisely for not pushing the PC requirements hard enough. That said, had Crysis 2 on PC shipped with DirectX 11 and the high-res texture pack as the default, even the then-fastest GTX 580 wouldn't have managed 60 fps in quite a few levels. The DirectX 9 version was deceiving.
 
Guys, please answer this:
Can Crysis 3 actually utilize eight threads, like the 2600K has?
(I mean, this sounds ridiculous, like they're pumping and pimping the system requirements.)
 
Since you're probably looking for the final release's performance, I think you'd have to wait for the release in February. :) But the specs are indicative... We'll just have to wait and see...

I, for one, am looking forward to performance numbers that actually scale up with core count. First, because it would be one more step towards more threaded (and hopefully better-optimized) games, and second, because it might give AMD a fighting chance with their octa-cores (as in even beating out the i5s through sheer core count thanks to well-threaded performance). :)
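Worth noting: whether Crysis 3 can really "use eight threads" comes down to whether the engine breaks its per-frame work into enough independent jobs to keep that many hardware threads busy. As a rough, hypothetical illustration (a generic C++ sketch, not anything from CryEngine), a game loop that splits entity updates across however many hardware threads the CPU reports might look like this:

[code]
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

// Hypothetical per-frame task; in a real engine this would be AI, physics,
// particle updates, draw-call preparation, and so on.
void update_chunk(std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i) {
        // ... simulate one entity ...
    }
}

int main() {
    const std::size_t entity_count = 100000;

    // An i7-2600K reports 8 hardware threads (4 cores + Hyper-Threading),
    // an i5-2500K reports 4. A well-threaded engine scales with this number.
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;  // fallback if the value can't be queried

    // Split the entities evenly across one worker thread per hardware thread.
    std::vector<std::thread> pool;
    const std::size_t chunk = entity_count / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = (w + 1 == workers) ? entity_count : begin + chunk;
        pool.emplace_back(update_chunk, begin, end);
    }
    for (auto& t : pool) t.join();  // wait for this frame's work to finish

    std::cout << "Updated " << entity_count << " entities on "
              << workers << " threads\n";
}
[/code]

If the work can't be carved up like this (say, one big serial simulation step), the extra threads just sit idle, which is why most games to date haven't gained much beyond four cores.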
 
I truly doubt you need CrossFire/SLI 680s or 7970s to get high settings. If so, I honestly can't imagine that many people going out and buying a new GPU or two just for one game.
 
[citation][nom]s997863[/nom]Here are the requirements for Crysis 3: a console, and a tolerance for mediocre games. It helps greatly if you haven't been playing PC FPSs for 18+ years and can be wowed by rehashed concepts as if they were new, poor level design and controls, and an interactive-movie experience (i.e. more lame cutscenes/story than freedom/gameplay, with stupidly easy gameplay to appeal to joypad users who don't have time to try a hard game for too long before finishing it and buying the sequel...). Pretty high/fussy requirements for a series whose ratings have only gone down since Warhead. I never even tried Crysis 2, but I know I'm in the minority. Most friends I know were DESPERATE to try, hoping that maybe the critics and videos/screenshots were wrong, and the game probably sold well anyway and the suits are testing the waters again.[/citation]
Crysis 2 isn't as bad as you think; playing it will give you an idea. It also runs on low-end configs without a problem.

 
[citation][nom]merandos[/nom]Got the i7-2600K, but a 680? Seriously?[/citation]

You get it needing an i7-2600K to meet certain requirements, despite the i7 generally not outperforming a similarly clocked i5, yet you don't get it needing a graphics card that is known to considerably to significantly outperform a lot of other cards, except the Radeon 7970, well-clocked Radeon 7950s, and GTX 670s? I'd expect one to be surprised about the i7 being listed, not the GTX 680.
 
[citation][nom]blazorthon[/nom]You get it needing an i7-2600K to meet certain requirements, despite the i7 generally not outperforming a similarly clocked i5, yet you don't get it needing a graphics card that is known to considerably to significantly outperform a lot of other cards, except the Radeon 7970, well-clocked Radeon 7950s, and GTX 670s? I'd expect one to be surprised about the i7 being listed, not the GTX 680.[/citation]

It could be the first game to effectively use that many threads.
 
[citation][nom]alidan[/nom]It could be the first game to effectively use that many threads.[/citation]

Well, IIRC, BF3 multiplayer did show a decent difference between a stock i5 and a stock i7 (same core count and architecture, with similar frequency), granted that overclocking the i5 still got performance well in excess of what was necessary. Still, I see your point. It may be the first game that not only uses that many threads, but where they're actually important.
 