StarCraft II Beta: Game Performance Analyzed


doomsdaydave11

I hope they include AA in the final version. I can't stand those jaggies!
Performance of this game looks great so far. I think I'll be able to max it out at 1680x1050 with my two-year-old E8400/HD4870 setup. Very happy about this :).

3Ball

I would like to point out that I played the beta yesterday for about 2-3 hours and never dropped below 60 FPS on my system.

The game was maxed out with every setting available to me, at 1920x1080. This is on a 3.8 GHz Core i7, currently 4 GB of DDR3 memory, and a slightly overclocked 1 GB 5870 on Windows 7 64-bit.

The numbers here seem odd to me.

Best,

3Ball

buddhist2k

[citation][nom]lljones[/nom]Nice review, never played the original, will have to give this a try. I'm tired of run and gun.[/citation]

I have to agree... I'm getting bored with FPSes. A little strategy is great. StarCraft I was awesome... I'm looking forward to the new one. I wish they would develop a World of Starcraft! That would be interesting.

For now... it's back to CoD: MW2.

algorian

Correction:
"... Protoss warp in their structures with drones"
Not with Drones, with Probes! Drones are the Zerg workers.

"Zerg creep no longer grows out of most buildings and must be constantly replenished by overlords."

Actually, it does grow out of every building except the Extractor.

Marcus52

WOOT! A BIG thumbs-up to Tom's Hardware for covering "Radeon vs. GeForce" image quality! While I'm sure the static images in the article don't tell the whole story, I consider this a major leap in video adapter reporting.

Keep up the awesome work!

;)

Marcus52

Lol, I made my previous comment before I read the article. I see now you compared ATI's current-generation GPU with Nvidia's last generation.

I've actually made this comparison myself. I had a GTX 280 fail on me, and while it was (is) in for warranty repairs, I decided to give two CrossFired 5770s a shot (because that was about the same price as one GTX 285). The game is World of Warcraft.

What I noticed (other than higher frame rates from the CrossFired cards) was that transparent objects looked better on the Radeon cards. Frankly, I put this down to DX11 and WoW's ability to take advantage of at least that much of it (but little more). However, the AA on the 5770s was terrible, and this is on a CRT (Sony FW900, 1920x1200) at the same settings as the GTX 280. I haven't tried it on my LCD yet; since LCD panels have even more trouble with aliasing than CRTs, I'm thinking the AA might not be fixable in WoW with the 5770s. I'd have to live with jagged lines that should be straight.

Otherwise, both pictures are excellent, except the GTX 280 produces a better-blended overall background. I'm talking about things like large fields of grass, walls, and trees; these look more real because the individual segments they are rendered from are better blended.

Ambient occlusion makes a bit of a difference, too. AO gives current games a bit more depth. Last-generation Nvidia cards have this; current-generation ATI cards don't.

My conclusion (from obviously very limited experience): if frame rates and DX11 are the most important things to you, get two 5770s in CrossFireX for about $300-350. If overall visual quality is what matters most, go for the GTX 285... scratch that, you can get a 470 for that money; there's no reason to buy a 285 anymore unless the price drops way down.

WoW's graphics are described as "CPU intensive"; I imagine that games that rely more heavily on the video adapter will show up their visual-quality differences even more.

ATI, in my opinion, has spent a couple of years following AMD's recent philosophy of competing on price rather than on technology. I am glad to see a shift in their thinking with the Phenom II series and hope that trend continues in both their CPU and GPU product development. We consumers do need price competition, and I don't want AMD to go unrecognized for helping keep prices down as much as they have, but Intel and Nvidia respectively need to be technologically challenged. Without AMD and ATI striving to overtake them rather than just keep up, they have no real competition.

Ah, for the good ol' days when Matrox was in the mix.

;)

cleeve

[citation][nom]Marcus52[/nom]Ambient Occlusion makes a bit of a difference, too. AO gives current games a bit more depth. Last generation Nvidia cards have this, current generation ATI cards don't.[/citation]

Not quite true. Radeons have supported ambient occlusion since the first DirectX 9 cards came out, if I'm not mistaken. Radeons support AO in all sorts of games, StarCraft II included.

HOWEVER, Nvidia has a proprietary AO algorithm that can be forced through its drivers in certain games that don't have AO support in the engine. The caveat is that Nvidia has to code support into the driver on a game-by-game basis, so there aren't many titles this works on.
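For anyone wondering what AO actually computes: it estimates how much nearby geometry blocks ambient light at each point, darkening creases and contact areas. Here is a deliberately crude 1D-heightfield illustration of the idea (purely illustrative; this is neither vendor's actual screen-space algorithm):

```python
def ambient_occlusion(heights, i, radius=2):
    """Fraction of nearby samples that rise above the current point.
    Higher values mean the point sits in a crease, so a renderer
    would darken its ambient lighting term there."""
    h = heights[i]
    neighbors = [heights[j]
                 for j in range(max(0, i - radius),
                                min(len(heights), i + radius + 1))
                 if j != i]
    occluders = sum(1 for n in neighbors if n > h)
    return occluders / len(neighbors) if neighbors else 0.0

terrain = [5, 5, 5, 0, 5, 5, 5]   # a pit in the middle of a plateau
print(ambient_occlusion(terrain, 3))  # 1.0 -> pit floor, fully occluded
print(ambient_occlusion(terrain, 0))  # 0.0 -> open surface, no occlusion
```

Real screen-space AO does the same comparison against the depth buffer with many hemisphere samples per pixel, which is why it can be forced at the driver level without engine support.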
 
Guest
@ lljones: I am pretty sure the game is DX9. Blizzard has never been one to require monster graphics cards.
 

raj072

I have the StarCraft beta and 3 keys. I have 3 PCs networked.

All PCs have 2 GB of RAM:
1. Single-core A64 Venice 3.2, ATI X800 XT PE (256 MB), AGP
2. Celeron 1.6 GHz (a single-core Core 2 based chip), 8600 GT, PCIe
3. Dual-core E5300 with ATI 5770

The game runs butter smooth on the dual core.

It can run smoothly on the single cores, but the game tends to lag or freeze completely for 5 seconds before returning to normal speed.

It seems to be more CPU dependent than graphics dependent.

Something is messed up with their programming, as there is no way it should do that on a fast single core with the visuals you are getting.

It's like they optimized for dual core and forgot about single cores.

Then I read a thread where another guy noticed the same thing: weird lags during the game on a single core. He said the problem was with Nvidia PhysX, and that installing the latest version fixes the single-core lag issue. This is to be done on ATI or Nvidia systems.

He also said: "If you are still having this problem, add -skipopenal -nocpubinding to the Target field of the StarCraft 2 shortcut and try again."

I'll try the PhysX fix, but I don't think I even have that installed on the single-core systems at all. Have to give it a shot and see.

The game needs serious optimizations for single cores; it's not that demanding a game, yet it sucks CPU power like it's Crysis. Poor or lazy programming is my guess.
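For anyone trying the workaround quoted above: the switches go at the end of the shortcut's Target field (right-click the shortcut, Properties), after the quoted executable path. The install path below is only a plausible default, not necessarily yours:

```
"C:\Program Files (x86)\StarCraft II Beta\StarCraft II.exe" -skipopenal -nocpubinding
```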

 
Guest
How about benchmarking it on even older-generation graphics cards and mobile graphics chips? I spent nights playing StarCraft on a Sony Vaio with a Pentium III 600 MHz, 96 MB of RAM, and whichever ATI Mobility Radeon (or was it still called ATI Rage?) was common then. I would love to know if I can get any kind of playable performance on my current laptop, an HP 8530p with a Core 2 Duo 2.4 GHz, 2 GB of RAM, an ATI Mobility Radeon 3650, and a 15" 1680x1050 LCD. Given the benchmarks on the (desktop-class) Radeon 5570, I know I can't expect to play with High or Ultra details, and it's doubtful I could with Medium, but maybe Low? Thanks!
 

raj072

It will play fine on a dual-core laptop with 2 GB. It does not need a strong graphics card; I can max it on an 8600 GT. I'm going to buy a cheap E5300 for that system to get it playing better.

I'm going to install it on my 1.4 GHz, 1.25 GB RAM Dell Inspiron 600m laptop to see if it will play.

 
Guest
As for people accusing Blizzard of lazy programming: this is an RTS, and it is subject to far more CPU demand than any FPS.
Hell, if you know which problems are CPU-intensive, you can lag even a modern CPU at 4 GHz on the OLD StarCraft, despite that game being quite playable on a 233 MHz processor.

For example: movement of 1000+ units with conflicting pathfinding.

FPS games are simply not subject to the same kind of combinatorial explosion.

I have a feeling the difficulty with utilizing more cores is keeping gameplay deterministic (for the sake of replays, and to avoid sending keyframes over the network).
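The determinism point is worth unpacking: classic RTS netcode sends only player commands and has every machine run the identical simulation, so unit state never has to cross the wire; replays are just the recorded command stream. A toy sketch of that lockstep idea (all names hypothetical, nothing here is Blizzard's actual code):

```python
import random

class LockstepSim:
    """Toy deterministic simulation: every peer constructs this with the
    same seed and feeds it the same command stream, so all peers stay in
    identical states without ever exchanging unit positions."""

    def __init__(self, seed):
        self.rng = random.Random(seed)      # seeded RNG: same rolls everywhere
        self.units = {1: [0, 0], 2: [5, 5]} # unit_id -> [x, y]
        self.tick = 0

    def step(self, commands):
        # Commands must be applied in a fixed order (sorted here) so every
        # peer processes them identically within the tick.
        for unit_id, dx, dy in sorted(commands):
            x, y = self.units[unit_id]
            self.units[unit_id] = [x + dx, y + dy]
        # Even "random" gameplay effects stay deterministic given the seed.
        self.rng.choice([-1, 0, 1])
        self.tick += 1

# Two peers, same seed, same commands -> identical state every tick.
a, b = LockstepSim(42), LockstepSim(42)
cmds = [(1, 1, 0), (2, 0, -1)]
for _ in range(100):
    a.step(list(cmds))
    b.step(list(cmds))
assert a.units == b.units and a.tick == b.tick
```

Because every peer must produce bit-identical results each tick, spreading the work across cores without changing the order (and thus the outcome) of operations is genuinely hard, which supports the guess above about why extra cores are underused.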
 

ezareth

I play on max Ultra settings at 2560x1600, 32-bit, and never see any latency or choppiness whatsoever, even during massive battles. FPS is irrelevant in an RTS.

I only have a 32-bit XP Pro machine with a 2.66 GHz i7 and a 4870X2, with 6 GB of memory (less than 3 GB available to the OS).

If my high-end two-year-old machine runs this game completely maxed out, people really shouldn't worry about its performance.
 

kiza12

# Operating system:Windows 2.6.1.7600 ()
# CPU type: Intel(R) Pentium(R) 4 CPU 2.66GHz
# CPU Speed (GHz):2.68
# System memory (GB):1.999
# Graphics card model:ATI Radeon HD 4550
# Graphics card driver:atiu9pag.dll
# Desktop resolution:1280x1024

I have this. The game runs perfectly on low/medium, but in big battles it lags like hell and the frame rate drops to 0-5 FPS. Is there any way to improve SC2's performance other than buying a new CPU?
 

zeroabg

"I play on max ultra settings at 2560X1600X32 Bit and never see any latency or choppiness whatsoever even during massive battles. FPS is irrelevant in RTS.

I only have 32Bit XP Pro machine with a 2.66Ghz I7 and a 4870X2 with 6 gigs of memory (less than 3 Gigs available to the OS).

If my high end 2 year old machine runs this game completely maxed out people really shouldn't worry about the performance of it."

You just called a Core i7 old... I'd say that makes your opinion irrelevant, but of course your FPS is always relevant; maybe not to you, since you have a nice system, but it is to most people here. On a side note: that expensive a system, and you have a 32-bit OS?
 

zeroabg

"My son has been playing and getting occasional

"Display driver nvidia windows kernel mode driver version 197.45 stopped responding and has successfully recovered"

error message. Anyone else ? .... used to get this occasionally in other games and even programs but it all went away after an i-Tunes update. Suspecting the most recent new i-Tunes update as the cause again but figured I'd check to see if anyone else experiencing same problem before trying to isolate."

That sounds like an overheating video card. Get SpeedFan and check your temperatures.
 

togusa9

Did anybody try to run these tests on a laptop?

How would these benchmarks on an i7-920 translate to a laptop with an i5-430M and an HD 5650 or GT 330M (4 Gpixel/s, 8 Gtexel/s), please?
 

cleeve

[citation][nom]togusa9[/nom]How would these benchmarks on a i7-920 translate to a portable computer with a i5-430m with hd5650 or gt330m (4 gpps 8gtps) please ?[/citation]

The CPU is probably comparable to a Phenom II X4 @ 2.5 GHz as benchmarked with a Radeon HD 5850.

Of course, those graphics cards will be slower than a 5850, but the benchmark was taken at Ultra detail. An i5-430M with an HD 5650 or GT 330M should be able to handle 1280x800 at Medium details without a fuss.
 