COD4 Graphics FPS


kjoshua
Oct 7, 2013
Would I be able to play Call of Duty 4: Modern Warfare with the Nvidia GTX 660 Ti on max settings (1920x1080) with a solid 250fps, no dips etc.?
 
Solution
Presumably s/he's playing COD4 because they like it, haven't finished it yet, whatever. I haven't even *started* COD4 yet;
it's still sitting in its box unopened, along with COD: World at War, Borderlands and Red Faction: Guerrilla. Atm I'm still playing
Oblivion, Stalker, FC2, Crysis Warhead and Crysis 2. Some of us don't have the time to blast through games pronto
and thus be up to date, and some just prefer the older games anyway. 😀

As for performance, kjoshua: it will certainly be pretty high, but as helloguy says, maybe not always 250+. Is there a reason
you need it to be so high? If you have a standard monitor then it wouldn't look any better than simply running it at 60Hz
with vsync turned on. Or do you have a high-refresh monitor?
The 3D engines of some games have internal caps, but that's actually quite rare.

Just use FRAPS to monitor the frame rate, though there are other tools too. Turn off double-buffering
to see what the GPU is fully capable of doing without sync. It's all in the CCC/NCP panel (Catalyst
Control Center for AMD, NVIDIA Control Panel for NVIDIA).

You can also use GPU Shark and GPU Caps Viewer to monitor other aspects of what a GPU is up to. And
of course Afterburner (I notice the latest V3.0.1 includes extra monitoring options now, including the main
CPU cores).

Ian.

 


That's extremely unusual, not something I've seen in any of the games I have, or any of
the benchmarks I use.

Normally one just changes the frame sync control, single or double buffered (or triple
in certain situations).

Ian.

 
In Call of Duty: Modern Warfare, you CAN change the maximum frames per second using the console, which is the place where you can type commands to alter settings. You type /com_maxfps 250 to change the max fps to 250. I've been playing the game for 3 years; I can promise you that you can do this :)
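If you want it to stick between sessions, you can also put it in your config rather than typing it each time. Something like this should work (the seta prefix is the standard way Quake-derived engines save cvars; the exact profile path is from memory, so double-check it on your own install):

// typed in the console (opened with the tilde key):
/com_maxfps 250

// or saved in players/profiles/<yourname>/config_mp.cfg:
seta com_maxfps "250"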
 


1. Such a setting has nothing to do with whether a GPU is capable of rendering a frame rate that high.

2. You will not notice the difference for any frame rate that is higher than your monitor's maximum refresh.
I've already asked what your monitor's refresh rate is, but you still haven't said. All that happens when the
GPU exceeds the monitor refresh is that you'll see tearing effects; and if the GPU's fps never dips below the
monitor refresh rate, then the maximum possible quality one can have is simply with double-buffered mode active.

3. Your initial query referred to a minimum rate of 250, but as I've explained, it's a waste of time trying to achieve this.


A setting like that in a game just doesn't mean anything. I'll say this once more: if your GPU can render a
minimum rate that is always higher than your monitor refresh, then you will never achieve a visual onscreen
quality better than simply setting the gfx mode to double-buffered. Naturally, if your monitor is set
to 60Hz just now but is capable of 72 or something higher, then set it to the higher rate, assuming your
GPU can match it. But trust me: trying to achieve 250fps when your monitor can't do the same thing, and your
GPU can't either, is a complete waste of time. Not entirely sure why you're ignoring advice from people
who know what they're talking about... ;D

So what model is your monitor then? As I say, if you have a standard 60Hz flat panel, you will never notice
any difference above that rate in double-buffered mode. If you spend more money on a better GPU just to
try to get beyond that, then you're wasting your money. The only reason it often makes sense to use
single-buffered mode is that all too often a GPU isn't able to consistently deliver a minimum which is
above the monitor refresh, in which case double-buffered mode can collapse the fps from 60 to 30 whenever
the backend engine drops below 60 (for a 60Hz monitor, I mean). Thus, people accept the on-screen tearing
of single-buffered mode in exchange for the apparently higher frame rate.
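
To put some numbers on that, here's a quick sketch in Python (my own toy model of classic double-buffered
vsync with made-up render times, nothing measured from a real game): a finished frame has to wait for the
next refresh tick, so the on-screen rate snaps to whole divisions of the refresh.

import math

REFRESH_HZ = 60.0
INTERVAL_MS = 1000.0 / REFRESH_HZ   # one refresh tick is ~16.7 ms

def vsynced_fps(render_ms):
    # a frame that spans N refresh intervals displays at refresh/N
    intervals = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

for render_ms in (5.0, 16.0, 17.0, 34.0):
    print(f"{render_ms:5.1f} ms per frame -> {vsynced_fps(render_ms):.0f} fps on screen")

Render at 5ms and the GPU could do 200fps, but the screen shows 60; slip to 17ms, just missing a tick,
and it halves straight to 30. That's the collapse I mean.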


But let's assume you do have a newer 120Hz or 144Hz monitor; the same thing still applies: aiming for
an fps rate beyond the monitor refresh in double-buffered mode is pointless. I don't know why you're so
fixated on this 250 thing, because honestly dude, it doesn't mean anything. 😀

I was a VR/displays admin for 3 years (in charge of a RealityCentre and the first CAVE in the UK), I know
about refresh rates; I can promise you what I've said is true. :)

Ian.

 
Btw, checking my stack of games, it appears I do have COD Modern Warfare, still unopened (not finished other
games yet). Thus, I'd be happy to try out the game on a couple of different newer GPUs which should give you
some idea of how a 760 would perform. One can infer expected performance only to a certain degree from
review sites, because newer GPUs tend to focus on boosting performance for newer features, not older games,
which is why - for example - a couple of GTX 280s is still such a strong solution for any strictly DX9 game.

Anyway, let me know, I have the box in front of me. I can easily test with a GTX 580 and a 1GHz 7970, from
which you can cross-check no problem against Tom's performance charts to gauge 760 performance. All I need
to know is what settings you like to use (resolution, detail, AA, AF, etc.). I can test up to 2560x1440, and
I can test up to 3-way 580 SLI or 2-way 7970 CF.

Ian.

 


I hate to say it but that's a basic misunderstanding of how game engines and 3D settings work. If the
display is set to single-buffered then of course the frame rate can massively exceed the display refresh.
All that's happening is that the rate is so high, it's keeping the visual tearing to a minimum (it would look
much worse if the rate was only 45).

My point is that if you have a 60Hz monitor, and the frame rate is never dropping below that, then you
physically will not be able to tell any quality difference over just having the display set to double-buffered
mode. In other words, right now you're using single-buffered with an uncapped frame rate; as long as you're
sure the rate will never drop below 60, then it will look better set to double-buffered mode. That's a basic
fact of how displays and gfx engines work.

Hence, if you really want to see a better visual quality, then get a 120Hz monitor, up the GPU power to
ensure it cannot drop below 120fps, and set it to double-buffered.


Btw, I do know where you're both coming from with this. I play older games with what used to be the
previous gen high-end GPU tech (atm, Oblivion, Stalker, Crysis, FC2, that sort of thing). I like to play at
high-res with absolute max settings. I used to have a 2048x1536 CRT, which had a dot pitch so fine
that to some extent AA wasn't really needed. Eventually though I switched to a 1920x1200 60Hz IPS
panel. Anyway, I wanted very high minimum frame rates, but my original GPU setup (8800GT) wasn't
fast enough for that, so for a while single-buffered it had to be. Quite often the frame rate would be
well above 60 of course, e.g. indoor areas, that sort of thing, but often it'd be down in the 30s.

However, I upgraded to 8800GT SLI, then to GTX 460 SLI, then to GTX 560 Ti SLI, and now I'm using
GTX 580 3GB SLI (faster than a GTX 780).

So, don't worry, I'm not misunderstanding what you're saying - the Stalker tests I've done on 7970 CF
are achieving over 1000fps in some configurations. 😀 Similar thing happens with PT Boats and X3TC,
though the latter is CPU bound.

Ian.

 
Sounding like high-tech sales hype. I've got like 5 monitors and an HDTV, and at the same settings nothing really changes between them all: LCD, flat screen, CRT (which I feel does the better job, and it's like 13 years old, the one I'm on now). Having them all and using them all, all I see in this is mumbo jumbo.
 


As I mentioned before, I use FRAPS, Afterburner, etc. There are numerous utilities which can display the current
frame rate, log this during a session, etc.

Ian.

 


Not my problem if you don't understand the technical aspects of display technologies. 😀 I do, as it was
my job for a while (and my dissertation research area before that in 1993/4); I've followed it ever since
as I've been involved in the movie biz on the SGI side since then, and now with PC tech. I helped out
with a number of productions, including Lost World, SW2, etc., though heck dammit they didn't stick me
in the credits. :}

Btw, CRTs can still look rather good because decent models were capable of high refresh rates (mine
could handle 96Hz), the phosphor persistence is low (minimising blurring, unlike many flat panels) and
the dot pitch is often very fine. Trust me, I know, I still have two thumping great Sony FW9100 24" SGI
superwide monitors. 😀 (They were about $2500 each when new.)

My CRT's colour clarity went fubar though, so I switched to a flat panel eventually, but I chose IPS in order
to have better colour quality and much wider viewing angles. That was some years ago; IPS tech is much
cheaper now.


Anyway, I have to get back to work now. I've posted the facts. Up to y'all if you choose to ignore them,
but as I say kjoshua, I'm happy to run a couple of tests if you like (PM me as I'm halting notifications on
this thread).

Ian.

 
I'm not being funny, Ian, but I'm asking which graphics card can run COD4 at a higher frame rate than 250. I don't need to give a reason as to why I need this, and as it stands it's because I play competitively, and a change of fps allows different glitches to be done: 125fps allows for a longer jump, 250fps allows for a higher jump, and 333fps allows for a jump in between. Don't bother trying to jump down my throat about how I'm 'ignoring information people are giving me.' How about you answer the question instead of turning this into a discussion, yes? I've been playing competitively for 2 years now, and I think a change in max fps will give me a good advantage in playing promod and/or mods such as DEATHRUN, which involve strafe jumping. So don't go on about how I need a better model of monitor, otherwise 'there is no point', because there is. Thank you very much, and if you disagree, remove yourself.
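
For anyone wondering why those exact numbers matter: COD4 runs on a Quake III-derived engine, and as far
as I understand it the physics advances in whole milliseconds, so certain fps caps give 'clean' step sizes.
A rough Python sketch of the idea (the rounding rule is my assumption about the engine's behaviour, not
pulled from its actual code):

for fps_cap in (100, 125, 250, 333):
    msec = round(1000 / fps_cap)    # engine's integer frame time in ms
    physics_hz = 1000 / msec        # the rate the movement code really runs at
    print(f"{fps_cap:3d} fps cap -> {msec} ms steps -> physics at {physics_hz:.1f} Hz")

125 gives 8ms steps, 250 gives 4ms, 333 gives 3ms, and because the jump arc is integrated per step, each
step size lands you at a slightly different height or distance. That's where the 125/250/333 tricks come from.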
 
Well, just get a mac-daddy card and frag on. I would assume if you go to their servers, like Steam, they will cap it in the name of fair gameplay; now, if you run your own server, then speed kills. Just like in the good ol' days: if your rig sucked, you did not get too far. Now with Steam servers it's set so all can play on equal ground, so your $1000 rig is playing right alongside my $500 rig [opinion].
Maybe they sort you out to match you up on a server, I don't know, seeing as I don't support Steam in any way; I run my own server.