Prototype Performance Analyzed

Status
Not open for further replies.
This game sounds too easy. I prefer a good challenge like STALKER or even L4D, but what's the fun in playing a game where you're invincible? The graphics aren't even great; it looks like Half-Life 2 on all-low settings.
 
[citation][nom]goalguy02[/nom]Should throw in a phenom II 940 or 955 as well as test SLI and Crossfire cards. I want to see how well the game scales using SLI[/citation]
I concur. Even if you guys didn't want to go through the hassle of all that, is it really that hard to get a hold of single cards with multiple GPUs like the X2 series or the higher-end NVIDIA?
 
Yeah, I would have been interested to know whether this game's graphics engine scales well with multi-card solutions... not like I'll ever play it, though 😛

But it would have made the tests more interesting.
 
What is interesting is seeing a single 4850 able to pull the same result as a GTX 260.

Dual-GPU is more and more a mainstream option that should be analyzed far more often... though I bet we already know this game largely favors ATI cards.
 
Can we get some AMD CPU scaling in this? Not all of us have the $/need/want to switch to Intel.

Maybe an all AMD/ATI platform can pull some extra frames.
 
The graphics in Prototype are as good as my grandma's face. The story sucks, the gameplay is clumsy, the dialog is a mess, and besides the killing there's nothing else to do. Nothing in the game even makes sense: suddenly he just knows how to glide, cut, slash, and absorb, and for some weird reason the virus made an imitation of Alex Mercer. People in the city, even after being attacked, killed, and cut off from the rest of the world, still go on with their lives like normal. I don't get why you'd even benchmark this game when there are so many other games that actually push the hardware. My PC can handle this game on the highest settings even though it can barely run Fallout 3 on high settings without AA.
 
ITT:

Disillusioned "hardcore gamers" that watch too much Zero Punctuation and don't believe any game except their personal favorite is worth ten cents.

It's a perfect example of a game that is better than the sum of its parts.

I do second the motion for more AMD chips. Or did the weird nvidia/intel frame loss finally go away?
 
I just want to remind people that with Turbo on, the i7 is actually a 3 GHz processor. So a true showing of any i7 benefit over a 2.4 GHz C2D would be to downclock the i7 to 2.4 GHz and turn Turbo off. That needs mentioning from now on.
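The point about Turbo can be sketched as simple arithmetic: divide each chip's framerate by its *effective* clock, not its rated clock. This is a minimal illustration with made-up fps numbers (they are not results from the article), assuming the i7 boosts to roughly 3.0 GHz under load:

```python
# Hypothetical sketch: normalize benchmark results by effective clock
# speed to compare architectures "clock for clock".
# The fps values below are invented placeholders.

def per_ghz(fps, effective_ghz):
    """Frames per second delivered per GHz of effective clock."""
    return fps / effective_ghz

# A 2.66 GHz i7 with Turbo active may actually run near 3.0 GHz, so
# comparing it against a 2.4 GHz Core 2 at rated clocks is misleading.
i7_per_ghz = per_ghz(60.0, 3.0)    # i7 treated as ~3.0 GHz effective
c2d_per_ghz = per_ghz(40.0, 2.4)   # Core 2 Duo at its rated 2.4 GHz

print(f"i7:  {i7_per_ghz:.2f} fps/GHz")
print(f"C2D: {c2d_per_ghz:.2f} fps/GHz")
```

With these placeholder numbers the i7 still comes out ahead per GHz, which is the clock-for-clock comparison the poster is asking for.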
 
Bah, I'm sorry, but a game needs good graphics to go with its good gameplay. I'm glad to see games becoming more CPU-intensive, since that almost always means more/better gameplay. Sadly, the reason for the poor graphics is the game's console roots: a GeForce GTS 250, for instance, has roughly 10x the fill rate of the PS3 and 360. It was good to see both consoles ship with interesting CPU solutions that actually surpassed what was available for PCs, but for gaming, ESPECIALLY with the push toward 1080p, the graphics solutions in both consoles are laughably weak.
 
[citation][nom]jaydeejohn[/nom]I just want to remind people that with Turbo on, the i7 is actually a 3 GHz processor. So a true showing of any i7 benefit over a 2.4 GHz C2D would be to downclock the i7 to 2.4 GHz and turn Turbo off. That needs mentioning from now on[/citation]

This is true. But trust me, for math performance at least, the i7 is still much better clock for clock and core for core.
 
[citation][nom]avericia[/nom]doomtomb The most likely reason you have issues with Prototype is that even the most recent NVIDIA drivers do NOT support SLI for this game. So you just have to download the most recent 186 driver, and then if you have EVGA cards you can download the EVGA SLI enhancement patch that includes support for Prototype, and hopefully it will fix that dip in fps. Or you can just wait for NVIDIA to add SLI support for Prototype in its next driver version and get that if you don't want to deal with the enhancement patch. Great article, Tom's.[/citation]
OK, thanks for the suggestion.
 
[citation][nom]trkorecky[/nom]I concur. Even if you guys didn't want to go through the hassle of all that, is it really that hard to get a hold of single cards with multiple GPUs like the X2 series or the higher-end NVIDIA?[/citation]

It's not about difficulty, it's about relevancy.

How many of you guys have 2560x1600 monitors? Because a single 4870 or GTX 260 will play at 1920x1200, max settings, and 4xAA with no trouble at all. They'll both even play at 2560x1600 with no trouble as long as AA is off.

Heck, the 1GB 4870 will play 4xAA at 2560x1600 very smoothly.

What would be the point of SLI/Crossfire tests when a single $150 1GB 4870 plays the game fine at the highest settings, 2560x1600 at 4xAA? I don't know what it would prove.

If I test a game that's taxing the graphics system, then you guys'll see the SLI/Crossfire benches come out of the woodwork.
 
[citation][nom]rambo117[/nom]i would hardly call 27fps as "very playable"[/citation]

Then you haven't experienced a *minimum* of 27 fps.

Note minimum, not average.
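The minimum-vs-average distinction can be made concrete: a run that averages near 60 fps can still have one slow frame that drags the minimum down to ~27 fps, and it's that worst frame you feel. A minimal sketch with invented frame times (not measurements from the article):

```python
# Minimal sketch of minimum vs. average fps computed from per-frame
# render times. The frame times (in milliseconds) are invented.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) from per-frame render times."""
    avg_frame = sum(frame_times_ms) / len(frame_times_ms)
    worst_frame = max(frame_times_ms)   # the single slowest frame
    return 1000.0 / avg_frame, 1000.0 / worst_frame

# 59 smooth ~60 fps frames plus one 37 ms hitch:
times = [16.7] * 59 + [37.0]
avg_fps, min_fps = fps_stats(times)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```

Here the average stays well above 55 fps while the minimum sits around 27 fps, which is why a reported *minimum* of 27 fps says far more about smoothness than an average would.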
 
[citation][nom]playa-wit-game[/nom]Wow. Another Don Woligroski article, another unbelievable, unexplainable, implausible win for the Core i7, by a margin too big to be believed.[/citation]

Wow. Another goof who can't wrap their heads around an i7 processor being faster than a Core 2 or Phenom II. *yawn*
 
OK. Really, guys: when on earth are you going to start including the HD 4890 in your benchmarking? And what about the NVIDIA GTX 280 and 285? Where are they? Surely you have something available in your test labs, right? Judging by these benchmarks, the high-end cards would clearly perform well enough for this game, but by how much?

It's becoming more difficult to make an informed decision based on Tom's reviews; please provide more comprehensive reports for gamers with all types of budgets.
 
[citation][nom]masterwhitman[/nom]OK. Really guys; when on earth are you going to start including the HD4890s in your benchmarking?[/citation]

When a 4870 isn't able to muster high framerates at 2560x1600 with 4xAA, I reckon.

I don't have two weeks to spend benchmarking a single title but I benchmarked a heck of a lot of cards, and I tried to benchmark the ones that mattered.

If you guys want me to drop every second card in order to include the higher echelons then I'll be happy to accommodate, but I was trying to keep the results relevant instead of benching a whole bunch of uber-cards that offer no benefit in this CPU-limited title.
 
*sigh*

Sorry for the confusion, the above post is mine. Didn't realize my son was logged in on my machine when I replied.

Just one of the many joys of having a 15-year old around. Nothing is your own anymore.
 
Just wanted to add that I am making my way through this game currently, on the following rig:

C2Q 6600 OC'd to 3.24 GHz
4 GB (2 x 2) OCZ ReaperX HPC PC8000 DDR2
2 x Seagate Barracuda 7200.11 500 GB w/ 32MB Cache
2 x eVGA GTX260 SLI OC'd to 636 / 1371 / 1107
2 x Acer X213w 21.6" widescreen monitors 2500:1 contrast ratio @ 1680 x 1050 res max

The game ran great when I first started playing, but CPU bottlenecking was obvious - if I left Thunderbird running, for example, and Growl popped up a notification, poof, instant (but temporary) slowdown.

Upon installing the 186.18 drivers along with the eVGA SLI patch (which seems no different from the nVidia SLI patch), the game started running a bit more smoothly, but what I like best is that the drivers allow multiple monitors even with SLI enabled.

The game may not have the best storyline of the year, or the best graphics, or the best technical design, etc., but the sum total is still one of the top titles of the year so far.

I have a Core i7 965 EE waiting for a home (mobo and RAM), and then I can fully explore the differences (to a point: the RAM will be faster, and more than likely, as someone noted in earlier comments, I'll run into the same issue as here, namely that there is more RAM on the i7 system than on the current C2Q system).

But as it is I experience no lag whatsoever, and that makes the game extremely fun to play. It is also a game that I am going to replay a few times, and that is the true test of whether I like a game or not: replayability. This one is going in my cache of recent games like Mass Effect (going through it for the 4th time) and Toca's Race Driver G.R.I.D. (going through it with my 7th profile). A definite keeper.
 
Very nice benchmarks in this article although it might have been nice to see something a little more powerful than a 4870 just for a comparison.

Good work though.
 
"Wow. Another goof who can't wrap their heads around an i7 processor being faster than a Core 2 or Phenom II. *yawn* "


Wow, resorting to childish name-calling, very professional. The people who reviewed PhenomII vs. i7 when it first came out pretty much concluded that it wasn't much faster. Then you showed up late to the party with your sloppy Cyberpower review to enlighten us to just how much faster i7 really is. Now we have this article, with a CPU dependent benchmark spread unrivalled in the history of gaming. Perhaps you just have a rare talent for getting the most out of the i7, Intel should put you on their payroll if they haven't already.
 
I understand some benchmarks (3DMark Vantage) still run their tests at 1280x1024, but is this still a mainstream resolution? I would think 1680x1050 would be a good mid-pack scale, since widescreen is "where it's at" these days; 1920x1080, even. A lot of us run 1920x1200, and not a few run higher than that because they can deal with larger LCD screens in their setup (I'd have to mount that 30" on a wall or something to game with, lol, at least if I'm having to react fast to anything on the screen).

It does seem to make a significant difference in how well CPUs and graphics adapters perform; they seem to have different "sweet spots".

😉
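The resolution comparison above comes down to pixel counts, since the GPU's fill-rate load scales with the number of pixels per frame. A quick sketch of how much more work each step up demands (the "base load" framing is just for illustration):

```python
# Rough sketch: total pixels per frame at common resolutions, and the
# relative GPU fill-rate load versus the old 1280x1024 baseline.

resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1200": 1920 * 1200,
    "2560x1600": 2560 * 1600,
}

base = resolutions["1280x1024"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x base load")
```

A 2560x1600 frame has over three times the pixels of a 1280x1024 frame, which is why the GPU-vs-CPU "sweet spot" shifts so much with resolution: at low resolutions the CPU limits framerates, while at high resolutions the GPU does.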
 
[citation][nom]propaganda_payz[/nom]Intel should put you on their payroll if they haven't already.[/citation]

Brilliant observation!

Only a keen mind like yours could have seen the conspiracy; when the *Intel* Core 2 Quad demonstrates poor performance, *Intel* must be pleased as punch!

/sarcasm

Yeah, the Core 2 Quad I have is a Q6600, and I mention that it's only running at 2.4 GHz. I also mention that it's quite playable, and that if you can find a Core 2 Quad or Phenom II closer to 3 GHz it'll likely perform much more satisfactorily.

You're going to have to stretch even further outside of reality to drum up a conspiracy theory this time. But thanks for coming out! Watching you try to perform mental gymnastics is always a treat, and your posts make it pretty obvious that your motivation is a personal vendetta. 😀
 