Crysis 2 Goes DirectX 11: The Ultra Upgrade, Benchmarked

Page 5
Status
Not open for further replies.

clonazepam

Distinguished
Jul 10, 2010
2,625
0
21,160
119
I just ran through the oaMan benchmarks, which btw, isn't that supplied by Nvidia as well? Anyway...



I'm apparently better than GTX 460s in SLI and closing in on CrossFired 6970s... how does that even make sense?
 

clonazepam

Distinguished
Jul 10, 2010
2,625
0
21,160
119
Did another run at 1680x1050, ultra, DX11: 44 avg, 34 min, 53 max. The same run in DX9: 56 avg, 44 min, 66 max. So I'm kinda scaling right along with your SLI 460s at 1680x1050, but at 1920x1080 ultra DX11 I'm right up neck and neck with CrossFired 6970s.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
32
[citation][nom]jjb8675309[/nom]sweet it looks like if you do not have 580 sli you can toss this game out, outrageous requirements to max out imo[/citation]

Well, this is much like the original, so nothing new here. And you don't need to max out a game to enjoy it; look at the glass as half full, not half empty. At least it doesn't have Metro 2033's infamous DOF filter.

This is a great review for a great game. Thanks, Tom's!
 

cleeve

Illustrious
Moderator
[citation][nom]clonazepam[/nom]Edit: These benchmarks seem completely off to me...
...I'm besting your oc'd i5-2500K with its 570, and equaling the same results you're getting with CrossFired 6970s. Something is not right on your end.[/citation]

You're simply not playing it as hard as we are. We run our benchmarks in a level where we're fighting a lot of soldiers and blowing up cars, not standing around measuring static FPS.

The frame rate under hard stress is what matters. That's where your hardware will choke.
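For anyone crunching their own runs: this is why the minimum number matters more than the average, since a handful of heavy combat frames barely move the average but define the worst case. A rough sketch (with made-up frame times, not the article's data) of how min/avg/max FPS fall out of a per-frame time log:

```python
# Sketch: deriving min/avg/max FPS from a per-frame render-time log (ms),
# the kind of data FRAPS-style tools dump. All frame times here are
# invented for illustration, not taken from the article's benchmarks.

def fps_stats(frame_times_ms):
    """Return (min_fps, avg_fps, max_fps) from per-frame render times."""
    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    # Average FPS is total frames over total time, NOT the mean of
    # per-frame FPS values (those two can differ noticeably).
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    return min(fps_per_frame), avg_fps, max(fps_per_frame)

# A mostly-smooth run with a few heavy combat spikes:
calm = [18.0] * 95      # ~55 FPS scenery frames
combat = [40.0] * 5     # ~25 FPS explosion frames
mn, avg, mx = fps_stats(calm + combat)
print(f"min {mn:.0f} / avg {avg:.0f} / max {mx:.0f} FPS")
```

Five slow frames out of a hundred pull the minimum down to 25 FPS while the average stays in the low 50s, which is exactly the gap between standing around and a firefight.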
 

clonazepam

Distinguished
Jul 10, 2010
2,625
0
21,160
119
[citation][nom]Cleeve[/nom]You're simply not playing it as hard as we are. We run our benchmarks in a level where we're fighting a lot of soldiers and blowing up cars, not standing around measuring static FPS. The frame rate under hard stress is what matters. That's where your hardware will choke.[/citation]
I agree completely. I first ran through the subway level, zoomed through the parking garage, and then back outside, blowing up cars, killing baddies, dodging grenades, etc.

To suggest I would open my mouth after having stood around static... I'm appalled. j/k ;)
 

clonazepam

Distinguished
Jul 10, 2010
2,625
0
21,160
119
I've been playing the game since release, and started a fresh game on Post-Human whatever on the DX11 / hi-res texture pack day. I think I would notice if I wasn't getting fluid results on a single 570 with DX11, ultra, hi-res textures, motion blur high, etc.
 

cleeve

Illustrious
Moderator
[citation][nom]clonazepam[/nom]I think I would notice if I wasn't getting fluid results on a single 570 with DX11, ultra, hi-res textures, motion blur high, etc.[/citation]

Perception is great, but it's subjective.

Only a benchmark can prove something.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
Don, can you confirm the 460s used in the article were reference-clocked cards?
Ah go on, include a coupla FTWs or SSCs... ;) Not an entirely unreasonable suggestion, given nobody in their right mind would buy a reference 460 these days. :D The most logical options would be either the EVGA SSC for its much higher 850 core clock (that's a heck of a jump over a reference card, enough to completely change the SLI positioning in the charts), or the Palit Sonic for its 2GB of RAM. IMO the SSC makes the most sense, though: 2GB only helps at super-high res/detail settings, and at that kind of resolution one would want a faster card anyway.

Just a thought...

Ian.

 

mchuf

Distinguished
Jul 16, 2010
204
0
18,680
0
[citation][nom]apache_lives[/nom]why would you be using a 32-bit os for anything in this day and age?[/citation]

Why, to play old 16-bit games, of course! GOG hasn't released everything, you know. Although I still have Win 98 and Win XP machines for things like that. I'm lucky my wife let me set up the basement as a gaming room.

Oh, and I guess compatibility issues with older programs.
 

kingius

Distinguished
Oct 15, 2003
39
0
18,530
0
Show the six-core AMD and don't be afraid to knock Intel off the top spot on your graphs, unless money is changing hands and Intel owns you! I notice my previous comment is nowhere to be found. Strange, that, isn't it?
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310
22

I think when it comes to gaming, there have been other tests showing that a nicely OCed i5 will take out a six-core Phenom. However, in things like Photoshop the Phenom might win.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
kingius, I posted a detailed reply but it barfed somehow... can't be bothered typing it all again. Suffice to say the data you need already exists, e.g.:

http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-12.html
http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/15

SB beats the 1100T in pretty much every single test, with just one exception by a small margin (Alien vs. Predator). SB is simply a better design, period. It also uses less power than the 1100T.

Naturally people are looking forward to BD, but then we have the 6-core SB-E, etc. coming along later as well. AMD just seems to be one step behind all the time; if that continues to be the case, hopefully they'll at least be able to offer competitive value even if they can't offer the best absolute performance. The market needs the competition, otherwise Intel can charge whatever it likes for its fastest products.

Kinda irritating that X58 CPUs are still so expensive. Why does a 950 still cost so much more than a 2500K?

Ian.

PS. Don't mention overclocking, as that'd just make the 1100T look even worse, given SB's much higher OC'ing headroom.

 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310
22


Guess I was wrong then, lol. And I do hope Bulldozer is good. I would hate to see Bulldozer not be much of an improvement and Intel gain more ground; like you said, we need the competition. I will continue to be an AMD customer for now, as I mostly use my PC for gaming and the internet. It seems while there is an obvious advantage to Sandy Bridge, it's not too big a difference IMO. By this I mean games still seem to be more GPU-dependent than CPU-dependent, so while my 965 is slower than a Sandy Bridge for sure, I only see a drop of 3-4 FPS when looking at a bench with my same GPU and a Sandy Bridge. And since I can max most games out and stay pegged at 60 FPS (vsync), and don't plan on having a 120 Hz screen anytime soon, getting more than 60 FPS will not matter much to me. In order for me to go team blue I would have to see a major difference with the same GPU; in other words, AMD would have to not have a CPU capable of getting me a solid 60 FPS by upgrading my GPU.
 
Guest

Guest
So I don't know what's up with your benchmarks at 1080, but my pair of 5770s, all settings ultra + DX11 + high-res textures, gets 40 high, 27 low, with a 3 GHz 940 and 8GB RAM. It's still playable even when there are like 10 aliens all blasting away with me firing missiles back. I find the game pretty well optimized.
 

clonazepam

Distinguished
Jul 10, 2010
2,625
0
21,160
119
It's a pretty game any way you run it. The most standout features I noticed were the detail in the cracks of the pavement, the tire tracks on the dirt roads, and, when moving from inside to outside, the same effect as your eyes adjusting: everything bright and washed out, slowly coming back into focus. It's great, and I'm so happy they decided to get these patches out. The fourth difficulty level really adds to the replay value of the single-player campaign, too.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
cburke82 writes:
> Guess I was wrong then lol. ...

To be fair to AMD, many apps aren't yet written well enough to exploit six cores. I suppose an interesting comparison would be a 2500K vs. a 970, but I notice now that some reviews don't bother including that many X58 results anymore, which is a pity. It's almost as if reviewers are embarrassed by X58 now because of what SB can do (ahh, all those long months of X58 rOoLz, P55 is for wimps!), which is a little odd given that the same articles often do include P55 data.


> ... And I do hope bulldozer is good. ...

Indeed. If not, there'll be no price competition. I think people are kinda expecting BD to offer good multi-core performance purely by having lots of cores (8+?), which is great for rendering and other highly threaded tasks, but not as good as SB (or whatever's current by then) for single-threaded work or tasks that still can't exploit more than 4 cores, which includes most games. Would be nice to be surprised, though.


> ... I will continue to be an AMD
> customer for now, as I mostly use my PC for gaming and the internet. ...

I use both. Sometimes the choice one makes is not based entirely on absolute performance, etc. Budget might be a factor (the 2500K is great, but what if one can't spend that much?), all sorts of things. When I bought my 6000+ a few years ago, it was the very week AMD slashed its prices by 50%, making the 6000+ waaay cheaper than the equivalent-performance Intel at the time. Plus, I couldn't find a 775 board that supported DDR2 RAM at the full 800 speed, after taking into account the other features I was looking for (I wanted PCI-X support for SCSI RAID).

Alas, I did get kinda burned in the end: ASUS never bothered to release a BIOS update for the board I bought (M2N32 WS Pro), so it can't use any Phenom II CPUs, which was a real letdown for a costly, supposedly 'pro'-series board (my gf's el cheapo ASRock cost 75% less but can use up to the Ph2 970). Thus I eventually bought a P55 for my next upgrade.

IMO vendors should be forced to release BIOS updates so their boards support all CPUs they can theoretically take, AM2 in particular regarding Phenom IIs. Too late now of course, but I shall continue to growl & harrumph. :}


> ... It seems while there is an obvious advantage to Sandy Bridge
> it's not too big a difference IMO. ...

Depends what you're doing. Some of the test results I referred to in those links show a HUGE difference (media encoding, rendering, etc.). It's less of an issue for gaming, but again, it depends on the game.


> ... By this I mean games still seem to be more GPU
> dependent than CPU ...

That's bound to be the case, but as games become ever more complex it's clear a good CPU is becoming ever more important. It varies by game, detail level, resolution, etc. For those with older systems, it's often hard to decide whether a particular CPU/GPU upgrade would be worthwhile, which is the area I'm exploring atm.


> anytime soon getting more than 60 FPS will not matter much to me. In

And in the end that's what really matters. If it satisfies your own
needs then that's the key.


> order for me to go team blue I would have to see a major difference with
> the same GPU; in other words, AMD would have to not have a CPU capable
> of getting me a solid 60 FPS by upgrading my GPU.

Eventually one's current hardware will inevitably become obsolete, but it's surprising how well older systems can perform when properly specced. This is what I'm testing atm, with a range of boards/CPUs/GPUs. Some people take upgrading older hardware to ludicrous extremes, though, e.g. paying more than 160 UKP for a used Q9550...

Ian.
 

cleeve

Illustrious
Moderator
[citation][nom]mapesdhs[/nom]Don, can you confirm the 460s used in the article were reference-clocked cards? Ah go on, include a coupla FTWs or SSCs... Not an entirely unreasonable suggestion, given nobody in their right mind would buy a reference 460 these days.[/citation]

We always use reference clocks.

As far as 'nobody buys reference' goes, well, in many cases meaningful overclocks raise the price into GTX 560 territory... and nobody in their right mind should be choosing an overclocked card over the better model. :)
 

eddieroolz

Splendid
Moderator
Crytek deserves a big pat on the back for the DX11 patch. Sure, it's a shame it didn't ship with the game at launch, but it's always better to take the time and deliver a fine product than to rush it.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
Cleeve,

I didn't say nobody buys reference, period. :D I said nobody in their right mind would buy a reference GTX 460 1GB if that's the model they were contemplating atm. It does make it a bit confusing for people, though: look at a typical seller and often the only models they offer are oc'd editions, or, as I posted way back, the cheapest 460 is often an oc'd model (typically a 725 core or similar).

I agree about prices encroaching upon the next tier of cards, but in this case? I'm not so sure. The SSC's 850 core clock is higher than the cheapest GTX 560's, while the cheapest 560 with an 850 core clock is the ASUS DirectCU II, which here is 16 UKP (14%) more than the SSC. I.e., _real_ products muddle the picture compared to reference cards. I'd just be intrigued to know how an SSC compares to a 560 with an 850 core (on paper they look like virtually the same product to me), or to a 'standard' 560 with an 810 core. The 560s tend to have slightly quicker RAM, but the core clock is more important.

Yes, beyond the 150 UKP mark a 560 is more logical, so now there's a whole load of 460 options which make no sense anymore, including the original FTW (it's the SSC that still looks good). The KFA2 560 with a 905 core clock looks quite good at 159 UKP.

Ah, don't ya just love obsolescence. :D

Anyway, my point is, comparing an SSC to typical 560s with an 810 core (similar price point), I don't think the latter would be the better model given the specs of each.

Or to put it another way: back when the 250 came out, and the 9800GT, there was a lot of huff & puff about the original 8800GT just being re-released multiple times in different ways. To me it looks like this is happening again, but nobody seems to be bothered this time round.

An 'older' 460 at 139 UKP (which includes Mafia II free, btw) should not be quicker than a 'newer' 560 at 145 UKP, but I suspect it would be. It seems like newer models (sic) are being released without any significant extra performance, possibly even slower.

Ian.

 

AbdullahG

Distinguished
Jun 17, 2011
2,798
0
20,960
61
My GTX 560 Ti couldn't max out everything as it did before the patch. Well, I returned it after it suddenly stopped working and bought a 6970 and an X4 945. I'll see how things work out :)
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310
22

If you're only getting 30 FPS with vsync on, then turning it off is not going to magically get your PC to produce 10-15 more FPS...
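One mechanical wrinkle worth knowing when reading vsynced numbers: with strict double-buffered vsync, the frame rate snaps to whole divisors of the refresh rate (60, 30, 20, 15 on a 60 Hz panel), so a vsynced reading quantizes whatever the card can actually do. A toy sketch of that snapping, with invented numbers (triple buffering and adaptive vsync avoid it):

```python
# Sketch of how classic double-buffered vsync quantizes frame rate:
# each frame must wait for the next vertical refresh, so the effective
# rate snaps down to refresh/1, refresh/2, refresh/3, ...
# All numbers are illustrative, not taken from the article.
import math

def vsynced_fps(raw_fps, refresh_hz=60):
    """Effective FPS under strict double-buffered vsync."""
    if raw_fps >= refresh_hz:
        return refresh_hz  # capped at the refresh rate
    # A frame slower than one refresh occupies a whole number of
    # refresh intervals, so the rate snaps to the next divisor down.
    intervals = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals

for raw in (75, 59, 45, 31, 30, 20):
    print(raw, "->", vsynced_fps(raw))
```

So under this model, anything the card renders between 31 and 59 FPS shows up as a flat 30 with vsync on; whether turning vsync off reveals extra headroom depends on whether the GPU is genuinely pegged near 30 or merely snapped down to it.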
 