World Of Warcraft: Cataclysm--Tom's Performance Guide

Page 3

scrumworks

Distinguished
May 22, 2009
[citation][nom]punnar[/nom]Thank you, Tom's. Now I know what has to be done. Nvidia, you win this time around.[/citation]

So the GTX 580 is 22% faster than the HD 5870. I guess you can call that a "win".
 

haftarun8

Distinguished
Jun 20, 2006
Is it still possible to add ambient occlusion on Nvidia cards via the control panel? If so, how does the game look with it on, and how does it affect performance? I would imagine the pathetically low poly count of the game would mean little benefit from ambient occlusion, but it depends on how it's implemented.

In other thoughts, I wish they would add CryEngine-like motion blur as well as depth of field (and not just blurring the background, but actually narrowing focus onto targeted objects during combat for a nice foreground bokeh blur, when you zoom in close to your character, etc.; more how a real camera works). And how about higher-res textures, a way higher poly count, and perhaps tessellation? Blizzard, are you listening? It's not 2004 anymore...
 
enzo matrix

[citation][nom]Wheat_Thins[/nom]What's the explanation behind the AMD deficiency?[/citation]
An overclocked Core i7-980X at 3.73 GHz vs. a stock Phenom II X6 at 3.3 GHz? Take IPC into account and there is a pretty huge difference.
Still, a single high-clocked Intel core matching six Phenom cores?
Perhaps they did a good job of programming for Hyper-Threading.
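To put rough numbers on the clock-plus-IPC argument, here is a back-of-the-envelope sketch; the IPC ratio is purely an illustrative assumption, not a measured figure:

[code]
# Back-of-the-envelope per-core throughput comparison.
# The IPC ratio below is an illustrative assumption, not a measurement.
intel_clock_ghz = 3.73    # overclocked Core i7-980X
amd_clock_ghz = 3.30      # stock Phenom II X6

assumed_ipc_ratio = 1.4   # hypothetical: Intel retiring ~40% more instructions per clock

intel_per_core = intel_clock_ghz * assumed_ipc_ratio
amd_per_core = amd_clock_ghz * 1.0

print(f"Relative per-core throughput: Intel {intel_per_core:.2f} vs. AMD {amd_per_core:.2f}")
# If the game is bound by one or two heavy threads, extra cores can't close that gap.
[/code]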
 

Mr_x

Distinguished
Oct 9, 2010
Throw in a Phenom II X4 and then let's talk, shall we?
I'm not biased, but the Phenom II X6 s**ks; in every benchmark I have seen it takes a hit, and that just isn't right.
 

dgingeri

Distinguished
[citation][nom]enzo matrix[/nom]An overclocked Core i7-980X at 3.73 GHz vs. a stock Phenom II X6 at 3.3 GHz? Take IPC into account and there is a pretty huge difference. Still, a single high-clocked Intel core matching six Phenom cores?[/citation]

Clock for clock and core for core, AMD's current chips are slower than the older Core 2 chips. It really shouldn't be a surprise that they lose out.

However, there is also a significant problem with cache. The Core i7 chips have 8 MB or 12 MB of cache, and the Core 2 Quads have either 8 MB or 12 MB, compared to the 6 MB of L3 on the Phenom II X6. This is a big problem with WoW: it needs a lot more data in cache to perform at its optimum.
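A crude way to see this working-set effect on your own machine is to time sums over arrays that fit in cache versus ones that spill to RAM; a minimal sketch using numpy, where the sizes are arbitrary and the absolute numbers are machine-dependent and noisy:

[code]
# Crude working-set probe: effective bandwidth drops once the array
# no longer fits in the CPU caches. Sizes and results are machine-dependent.
import time
import numpy as np

for mib in (1, 4, 64, 256):
    a = np.ones(mib * 1024 * 1024 // 8, dtype=np.float64)  # mib MiB of data
    t0 = time.perf_counter()
    for _ in range(20):
        a.sum()
    dt = time.perf_counter() - t0
    print(f"{mib:4d} MiB working set: {20 * a.nbytes / dt / 1e9:5.1f} GB/s effective")
[/code]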

On top of that, there are certain instructions that the AMD chips just don't support, and others that they just don't support well.

AMD is just on the losing end until they get their chips redesigned and get a decent amount of cache on them. They ruled the market when they had superior caches (remember the P4's 16 KB L1 cache compared to the Athlon's 64 KB?) and now they don't. They need a redesign badly. I hope Bulldozer is enough, and not just a server-level chip.
 

darkblade6

Distinguished
Dec 6, 2010
Just wondering where you were benchmarking, because I can tell you some places that will drop this benchmark's average FPS by a ton... like Orgrimmar at 7 PM, or the Ashenvale entrance (between the Barrens and Ashenvale).
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]darkblade6[/nom]Just wondering where you were benchmarking, because I can tell you some places that will drop this benchmark's average FPS by a ton... like Orgrimmar at 7 PM, or the Ashenvale entrance (between the Barrens and Ashenvale).[/citation]

The benchmark sort of had to steer clear of other players and unpredictable mob spawns, or the runs wouldn't be repeatable - so busy places like Orgrimmar or Dalaran aren't an option.
 

nekromobo

Distinguished
Jul 17, 2008
[citation][nom]Nekromobo[/nom]What a horrible article. Tom's has gone down, down, down, I say. Any idea to check CPU utilization behind the FPS? That WoW seems to be horribly coded is the only conclusion from this article. I bet enabling DX11 would help any GPU out there with its fixes to multi-core support (check this?). I bet Blizzard would want to help a high-profile site like Tom's HW.[/citation]

So the benchmark barely pushes 50% CPU utilization (as dgingeri points out), and then we compare Intel and AMD CPUs on it; that's not a very valid bench.
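For anyone who wants to check per-core utilization behind the FPS themselves, here is a minimal sketch using the third-party psutil package; run it in the background while the game is in the foreground:

[code]
# Minimal per-core CPU utilization logger (requires the third-party psutil package).
# Run alongside the game to see whether one core is pegged while the rest idle.
import psutil

for _ in range(30):  # one sample per second for ~30 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| busiest: {max(per_core):.1f}%")
[/code]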

Sid Meier commented on Civ V's multithreading: "The internal architecture of the Civilization V graphics engine, however, is heavily multi-threaded and users will see multi-processor benefits even with drivers that are not threaded (including DX9)."

If there's any comparison to WoW, then DX9 is also handicapping multi-core CPUs. I am generally wondering why we are throwing CPU cycles at a problem that isn't a CPU/GPU-cycle problem (what game uses 100% of the GPU besides FurMark? And that's not even a game :)).

I think I have valid points; let the flames go higher.
 

nekromobo

Distinguished
Jul 17, 2008
Oh, and here's the URL for the full Sid Meier piece on Civ V and DX11: http://www.pcgameshardware.com/aid,776086/Civilization-5-Tech-Interview-What-DirectX-11-is-good-for/News/
 

agnickolov

Distinguished
Aug 10, 2006
It looks like the performance of the memory controller is the second-biggest factor after the graphics card - not the number of cores or the CPU cache (though the latter might help a bit). Core i7-9xx beats Core i7-8xx due to triple- vs. dual-channel memory. Core i7-8xx beats Core i5-6xx due to its on-die vs. on-package memory controller. This is also the answer to AMD's poorer performance - its memory controller is not up to par with the dual-channel controller in Intel's Core i7-8xx. Extrapolating from this, the Intel Core 2 Duo/Quad with its FSB should be decimated here. My advice for Core 2 Duo/Quad owners is to go for a platform upgrade.
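To put rough numbers on the channel argument, peak theoretical bandwidth scales with channel count; a quick sketch assuming DDR3-1333 (real-world throughput sits well below these peaks):

[code]
# Peak theoretical DDR3 bandwidth: transfers/s * 8 bytes per 64-bit channel.
# DDR3-1333 is assumed purely for illustration; actual kits and timings vary.
mt_per_s = 1333e6       # DDR3-1333: ~1333 million transfers per second
bytes_per_transfer = 8  # one 64-bit channel moves 8 bytes per transfer

for channels in (2, 3):
    gb_s = mt_per_s * bytes_per_transfer * channels / 1e9
    print(f"{channels} channels: {gb_s:.1f} GB/s peak")
# 2 channels: 21.3 GB/s peak; 3 channels: 32.0 GB/s peak
[/code]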

Finally we have a game that makes a strong argument for a Core i7-9xx over a Core i5-7xx / Core i7-8xx!
 

agnickolov

Distinguished
Aug 10, 2006
Adding to my previous post, it'd be interesting to see how memory overclocking on a Core i7-980X affects the frame rate, and also how the Core i7-980X compares to an entry-level 9xx part like the Core i7-920.
 
Guest
I have just switched my wife's WoW computer from an AMD X2 4800+ to a Core i3-540 with 4 GB of DDR3-1333 memory. I hope it raises her 43 FPS to something more usable. While a Core i5 seems to be your test bed, I couldn't afford one of those, and certainly not a Core i7. I am running an older AMD X2 6000+ at ultra settings and still get up to 60 FPS. Waiting for that i7 price drop, or for Cata to become unplayable.
 

futureyes

Distinguished
Dec 9, 2009
I hate to point out what might be painfully obvious, but was vertical sync turned on for the AMD system? It sounds fishy to me that both GPUs hit the same "60" FPS, which is very typical when vertical sync is on...
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
[citation][nom]futureyes[/nom]I hate to point out what might be painfully obvious, but was vertical sync turned on for the AMD system? It sounds fishy to me that both GPUs hit the same "60" FPS, which is very typical when vertical sync is on...[/citation]

Check out the logged frame-rate chart for the AMD machines--they peak at 90 FPS or so, but the lower minimums are what drag the average down to 60-ish FPS.
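To illustrate how a few slow frames drag the average down even when peaks hit 90 FPS, here is a small sketch with made-up frame times; average FPS over a run is total frames divided by total time, so long frames weigh in disproportionately:

[code]
# Made-up frame-time trace: 90 fast frames (~90 FPS) plus 10 slow ones (20 FPS).
frame_times_ms = [11.1] * 90 + [50.0] * 10

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s  # frames / seconds

print(f"peak ~{1000 / min(frame_times_ms):.0f} FPS, "
      f"min ~{1000 / max(frame_times_ms):.0f} FPS, "
      f"average ~{avg_fps:.0f} FPS")  # -> peak ~90, min ~20, average ~67
[/code]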
 

Phoenixlight

Distinguished
Sep 2, 2009
Well, since 96% of people are using monitors with a 60 Hz refresh rate, the fact that the Intel CPUs get over 60 FPS doesn't matter. 25 FPS is the minimum needed to play smoothly, so anything over that is just great.
 

Kodiack

Distinguished
Oct 8, 2010
[citation][nom]phoenixlight[/nom]Well, since 96% of people are using monitors with a 60 Hz refresh rate, the fact that the Intel CPUs get over 60 FPS doesn't matter. 25 FPS is the minimum needed to play smoothly, so anything over that is just great.[/citation]

25 FPS is terrible, especially on higher-end systems. With WoW, I start to notice camera movement being a bit jerkier in the 45-50 FPS range. Those small dips are very noticeable.

I must say, I'm extremely disappointed with how CrossFire/SLI scaling shows up in this article. I'm a Radeon 5970 owner, and it's been great. I haven't been able to reproduce results like that, though. With D3D11 running in fullscreen mode, CrossFire seems to boost FPS more than it hinders it. I'd be interested in some information on that. I ran some benchmarks between D3D9/11 and CrossFire enabled/disabled back in the alpha, but ATI's drivers weren't agreeing with the game engine overhaul at that time.

An interesting article that shed quite a bit of light. I'm still surprised at how Nvidia seems to beat out ATI in almost everything. A GTX 460 outperforming a 5870? That just doesn't seem right. Hopefully some driver updates even things out a bit more.
 
Guest
I wish the "anything over 24 (or 30) FPS doesn't matter" myth would die, die, die, already. That only works for movies because of things like motion blur, repeated frames, etc. Pretty much *anyone* can see the difference between 60 and 30 FPS in a real-time rendered video game.
 

JustinHD81

Distinguished
Mar 31, 2009
I'm a little disappointed that the "AMD mainstream" graphics cards are all HD 4xxx-series cards. Why not an HD 5670, HD 5570, etc.?
 

dgingeri

Distinguished
[citation][nom]bugleyman[/nom]I wish the "anything over 24 (or 30) FPS doesn't matter" myth would die, die, die, already. That only works for movies because of things like motion blur, repeated frames, etc. Pretty much *anyone* can see the difference between 60 and 30 FPS in a real-time rendered video game.[/citation]

I don't know about other people, but I definitely notice the difference when I'm watching a 120 Hz TV. It looks so much smoother and more lifelike. Higher frame rates help everything. Those complaining about "not noticing over 24/30 FPS" are just fools, no matter the arena.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
[citation][nom]Kodiack[/nom]25 FPS is terrible, especially on higher-end systems. With WoW, I start to notice camera movement being a bit jerkier in the 45-50 FPS range. Those small dips are very noticeable.I must say, I'm extremely disappointed with how CrossFire/SLI scaling shows up in this article. I'm a Radeon 5970 owner, and it's been great. I haven't been able to reproduce results like that, though. With D3D11 running in fullscreen mode, CrossFire seems to boost FPS more than it hinders it. I'd be interested in some information on that. I ran some benchmarks between D3D9/11 and CrossFire enabled/disabled back in the alpha, but ATI's drivers weren't agreeing with the game engine overhaul at that time.An interesting article that shed quite a bit of light. I'm still surprised at how Nvidia seems to beat out ATI in almost everything. A GTX 460 outperforming a 5870? That just doesn't seem right. Hopefully some driver updates even things out a bit more.[/citation]

The official word from AMD is that CrossFire works fine (it clearly doesn't). The more plausible explanation comes from Nvidia, which says that SLI scales in SOME of WoW's environments, but not others. This is a bug that should be addressed in an upcoming driver. If AMD is encountering something similar, it would make sense that our particular benchmark sequence results in negative scaling.

I actually shared my results with both AMD and Nvidia prior to publishing. AMD was unable to explain the CrossFire results and the poor CPU performance. If I hear anything more from them, I'll absolutely update the piece!

Thanks for the feedback everyone,
Chris
 

psiboy

Distinguished
Jun 8, 2007
It would have been interesting to see an AMD Phenom X3 or X4, to determine whether the L3 cache has more of an effect than the number of cores... a gross oversight, perhaps?
 

gamedev

Distinguished
Mar 19, 2010
I'm pretty confident about a few things regarding the AMD versus Intel results:

1) There will be some Intel branding program with this game, i.e., a co-marketing program.
2) There was work done at the optimization/compiler level to favor Intel CPUs.
3) An update will eventually come out that closes the gap by a fair margin.

I am not an AMD fanboy, but it just makes no sense that a six-core 3.3 GHz part (even when overclocked) cannot match a dual-core i3 part.

It must come down to some optimization, or lack thereof, that leans on Intel's superior memory controller/bandwidth rather than raw code-execution ability.
 