Crysis 2 Goes Direct X 11: The Ultra Upgrade, Benchmarked


K-zon

Distinguished
Apr 17, 2010
358
0
18,790
The thing I still don't get, though it seems just as reasonable to argue otherwise, is how one core of a multi-core processor can still count as a playable option. Each additional core basically adds the same amount of processing power, yet a single-core processor could simply run the same software at higher clock speeds. The graphics cards are the obvious difference, given that most run from roughly the 400 MHz range and up for core and memory speeds. And is the difference between DirectX versions really enough to say one runs better than the other at the same settings? Crysis is demanding to run even on low settings, yet one core of a multi-core processor can still drive the higher-end cards on its own, with one DirectX version or more in mind.
 

amk-aka-Phantom

Distinguished
Mar 10, 2011
3,004
0
20,860


The funny thing is... that's exactly what happens. ~25-30 FPS with V-Sync, turn it off and it's stable 30-40.
 

mapesdhs

Distinguished


Er, yes it can. That's exactly the kind of effect frame sync can cause. If the renderer is sustaining (say) 59 fps single-buffered to a 60 Hz display, then switching to double-buffered will drop it to 30. Likewise, if the renderer is doing 29 fps single-buffered, then turning on frame sync will drop it to 20, i.e. the next lower integer divisor of the screen refresh. This is why 72 Hz or higher can be more useful for double-buffered modes, and why SGI had the old saying, "60Hz, 30 hurts."
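As a rough illustration of that arithmetic (a simplified sketch assuming a constant frame time and a classic double-buffered swap that always waits for the next refresh; the function name is just for illustration):

[code]
import math

def vsynced_fps(raw_fps, refresh_hz=60.0):
    """Idealized frame rate once double-buffered frame sync makes every
    frame wait for a vertical refresh boundary."""
    if raw_fps >= refresh_hz:
        return refresh_hz
    # The frame time rounds up to a whole number of refresh intervals,
    # so the rate falls to the next lower integer divisor of the refresh.
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

print(vsynced_fps(59))  # -> 30.0
print(vsynced_fps(29))  # -> 20.0
[/code]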

The price you pay for not having frame sync is tearing artifacts during fast motion.

Ian.

 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
Would have been nice to see stereoscopic 3D benchmarked too... with the built-in side-by-side mode, and also with Nvidia's stereo software.
 

brotoles

Distinguished
Jul 18, 2011
26
0
18,530
[citation][nom]f-gomes[/nom]Hey, Brotoles, you're wrong. Rendered frames per second have NOTHING to do with refresh rate. One thing is how fast your system can render images. Another thing is how many times per second your display is refreshing. V-sync is good for saving power and avoiding tearing (when you have part of the present image and part of the previous image being displayed at the same time), but it will NEVER have any implication on refresh rate.[/citation]

f-gomes, V-Sync and refresh rate are related in that your frame rate will never go above your refresh rate when V-Sync is turned on.

My graphics card is more than capable of doing 60 fps on the menu screen (when I turn V-Sync off, my frame rate reaches hundreds of frames per second there), but when I turn DirectX 11 on and restart the game, FRAPS measures 50 frames per second on the menu screen, which means that the refresh rate is indeed 50 Hz. When I go back to DirectX 9, the frame rate also goes back to 60 fps.

Now, as for what this has to do with the 3D glasses: when you use 3D, V-Sync MUST be on to synchronize the refreshing of the screen with the shutter glasses. My guess is that the 3D monitor usually runs at a 120 Hz refresh rate (60 Hz for each eye), but when the game drops to 50 Hz the monitor would need a 100 Hz refresh, and maybe that is an unusual refresh rate, or maybe the graphics driver is not written to support it. It's just a hunch, but I think the reasoning makes sense at least. :)
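To put a quick number on that hunch (a minimal sketch, assuming only that the panel needs one refresh per eye for each stereo frame; the function name is hypothetical):

[code]
def required_monitor_refresh(per_eye_hz):
    # Shutter glasses alternate eyes, so the panel needs two refreshes
    # per stereo frame in this simple model.
    return 2 * per_eye_hz

print(required_monitor_refresh(60))  # 120 Hz - the usual 3D monitor mode
print(required_monitor_refresh(50))  # 100 Hz - possibly not offered by the driver
[/code]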

Thanks for the attention,

Brotoles

My system specs:
Intel Core-i5 750 processor
AMD Radeon HD 5850 with Catalyst v11.6
6 GB Corsair RAM
Gigabyte P55A-UD4P motherboard
XFX 750W Black Edition Power Supply (Seasonic circuitry)
Windows 7 Ultimate 64-bit with all the latest updates
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310

All I know is this: if I run the benchmark in Metro 2033 I get a 42 FPS score on the settings I use, and it stays the same whether V-Sync is on or off. In Mafia II my rig goes over 60 FPS, so if I benchmark with V-Sync on I get a score of 59.9, and with it off I get around 82. So in that sense I can see it "increasing frame rates." Are you talking about the card limiting the frames to a rate lower than the screen's maximum? Sometimes when I play Crysis it caps me at 25 FPS for some reason, even though the TV I use is 60 Hz, so it should be capped at 60 FPS and not 25. This sucks, and I've been trying to figure out how to stop it, because I get bad tearing artifacts in that game when V-Sync is off. Is that an example of what you're talking about?
 

amk-aka-Phantom

Distinguished
Mar 10, 2011
3,004
0
20,860


It's a well-known fact that turning V-Sync off can lead to tearing. However, I get no tearing whatsoever in Crysis 2 despite having V-Sync off... as I said, instead I get much smoother fps =) I tried the same trick in Crysis and Crysis Warhead when I played them... it only helped in Warhead, with about a 2-5 fps gain, so I lowered AA instead and left V-Sync on. I don't remember whether I had tearing then, but somehow Crysis is exactly the type of game where I'd expect a lot of tearing with V-Sync off.

You may want to try to force triple buffering via your GPU's control panel - I've heard it minimizes tearing with V-sync off... or does it get rid of V-Sync-caused fps drop while leaving V-Sync on? Don't remember which.
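For what it's worth, here is a rough model of the difference (an idealized sketch assuming a constant frame time; in this model triple buffering mostly matters with V-Sync on, where it avoids the snap to integer divisors of the refresh rate):

[code]
import math

def effective_fps(raw_fps, refresh_hz=60.0, triple_buffered=False):
    """Idealized frame rate with V-Sync on: double buffering stalls the
    renderer until the next refresh, triple buffering lets it keep working."""
    if raw_fps >= refresh_hz:
        return refresh_hz          # capped at the refresh rate either way
    if triple_buffered:
        return raw_fps             # no stall, just the refresh-rate cap
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

print(effective_fps(50))                         # 30.0 with double buffering
print(effective_fps(50, triple_buffered=True))   # 50 with triple buffering
[/code]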
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310

Yeah, I've tried a bunch of stuff, and it will tear without V-Sync and will randomly cap me at 25 FPS with it on, lol. Good thing that's the only game it happens with, and I don't play it much anymore, lol.
 

mapesdhs

Distinguished
JackNaylorPE, or indeed two 5850s or 5870s in CF, or two *sensible* (non-reference) 460s in SLI, etc. There are numerous other options which would give smooth gameplay.


Don, the results image for Highest Detail DX11 still has a typo: it says 1920x1800 when presumably it should be 1920x1080.

Must confess, it's a bit odd not to have AA results - I thought the use of AA was pretty much a given these days for good image quality.

Ian.

 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310

+1 lol
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310


Thank you, I have been saying this for some time now. It seems they never use any AA when benchmarking Crysis or Crysis 2. Great graphics with a bunch of jagged lines sounds weak to me.
 

amk-aka-Phantom

Distinguished
Mar 10, 2011
3,004
0
20,860


I'm not surprised at all. nVidia has a huge ad for the Crysis 2 DX11 and high-res texture package on their website (it was on the front page a few days back).

Good thing I use nVidia.
 

mapesdhs

Distinguished


I'd certainly be interested in the rationale for not including AA results. Perhaps it's because, with AA active, none of the solutions would give playable frame rates; but if that's true then it's another data point which would be useful to know. For those with lower-resolution displays, using AA matters more, so I'm sure some would find it helpful to know whether the upgrade plus AA means only a top-end GPU setup can cope.

With my older setup (8800GT 512MB), playing Oblivion/Stalker, I got round this issue by not using AA and running at a high resolution on a high-quality CRT (22" HP P1130, 2048x1536). Performance was much better, the visual differences were minimal due to the nature of the games, and using a CRT meant the pixels were slightly smoothed out anyway, so it worked well.

Now that I'm using a 24" 1920x1200 H-IPS display (HP LP2475W), though, AA is more of an issue.

Ian.

 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310
 
