Guild Wars 2: Your Graphics Card And CPU Performance Guide

Guest

Guest
Why no GTX 680? Not including a GTX 680 makes this article useless for me. (Well, it's useless anyway since it's a beta build.)
 

lordtoki

Honorable
Aug 29, 2012
3
0
10,510
Great game! Got it with my new gaming laptop from http://www.getthatpc.com. I didn't expect it, but I'm glad I have it. It runs flawlessly on max settings. You gotta love the no monthly fees! Playing with the Guardian. Can't wait to see what's next.
 

DX9 has nothing to do with consoles. They're using a heavily modified engine from GW1, which was created in the DX9 era. That said, there's still a handful of people running Windows XP, and they don't want to alienate that audience by going with DX11.
 
[citation][nom]haplo602[/nom]folks, stop the DX10/DX11 nonsense ... this is an MMORPG (sort of) that needs to run on many hw configs with most somewhere average and below average by todays standard. they'd work against themselves with DX10/11 since graphics engine is one of the last parts that make a good MMO.[/citation]

Also, even if they add DX11 support, it doesn't necessarily mean they'll use the graphical enhancements available in DX11, such as tessellation. One of the major points of DX11 is performance improvement and optimization.
 
[citation][nom]leongrado[/nom]Arenanet is still working on optimizing the game (I think/hope). I know I know the game was very very clunky in the first two beta weekends but at headstart the game is much smoother.[/citation]
I noticed this too. It is now MUCH smoother for me than it was in the beta. With all due respect, I've become highly skeptical of these results; perhaps not the relative placement of the various cards and CPUs, but definitely of absolute performance. Please retest at least a few combinations (perhaps using a Phenom II and a C2Q) to see how much things have changed.
 

James0224

Distinguished
Dec 3, 2008
1
0
18,510
I see a lot of AMD Phenom II X4 965 BE users in this forum. I currently run AMD Eyefinity at 5040x1050, playing Rift on High in PvE and Low in raids, and I want to move to GW2. I have an ATI HD 5850 1GB GPU. I was going to start over with a new system, but perhaps I can upgrade the GPU before I hit a bottleneck? Looking for opinions on which video card would be a viable upgrade.
 

oxiide

Distinguished
[citation][nom]Cryio[/nom]That really makes you wander, if games/programs really know to put Bulldozer to work. I think it just sits there, idling at least 50% of processor raw power.[/citation]

I don't know if you meant that as a defense of Bulldozer or not, so this may not be directed at you personally. I don't think it makes the chips any more desirable to know that the architecture was designed such that modern applications often can't properly put it to work. If it doesn't perform in the workloads users need, it's still a poor product in practical terms.
 

luxer_91

Honorable
Sep 1, 2012
1
0
10,510
Here are my specs:
Intel Core 2 Quad @ stock 2.5 GHz
4GB RAM
Asus GTX 560
Win 7 64-bit
What FPS would I get with these specs, and at what settings?
Thanks
 
Guest

Guest
Anyone have any thoughts on how the GTX 660 Ti might fare in these tests?
 

alextheblue

Distinguished
[citation][nom]serhat359[/nom]It will be cpu limited, you'll get around 35-40fps[/citation]
Actually we don't know that. This was a beta test and the developer said they made some modifications and tweaks to the final version. They'll probably implement further optimizations (in addition to many, many bugfixes) over the course of the first 6 months, as well.
 

alextheblue

Distinguished
[citation][nom]whyso[/nom]Where is the HD 3000 and 4000 review? I know these are only integrated graphics but given the sheer number of casual players (on laptop users-since the desktop and mobile integrated graphics are virtually the same) they should be there if only as a baseline.[/citation]I think they'll show up in the big Angry Birds benchmark. This isn't a casual game for a casual gamer. This is a demanding modern title. I don't mean to sound elitist (I play games on consoles and phones too) but seriously, those chips are NOT meant to run a demanding game without assistance from a discrete GPU. You're asking for too much. I know a lot of laptops use them, but you should research what you're buying if you play games.

If you want a budget laptop that can play games a little, you'd better invest in something with either a very low-priced discrete card (I know you can configure some HP laptops with a 1GB 6670M for $50) or an A8/A10 Trinity APU. Or both, and use Dual Graphics in games that have a working CrossFire profile.
 
Guest

Guest
"Clearly, it appears that AMD's best shot at catching Sandy Bridge at 3 GHz is a quad-core Bulldozer-based chip at 5 GHz or so. Sorry, couldn't resist"
Clearly, the guy who did the benchmarks is such a dumb Intel fanboy. I mean, what's the point of making people believe a Pentium is better than an FX CPU in a chart where the Pentium is overclocked and all the FX chips are downclocked? Just look at the CPU frequency chart: a quad-core FX takes a hit of about 70% (or more) when downclocked by only 33%; something is totally wrong there. It's obvious FX chips like very high frequencies; their architecture is made for that!
 

ElMoIsEviL

Distinguished
[citation]Clearly, the guy who did the benchmarks is such a dumb Intel fanboy. I mean, what's the point of making people believe a Pentium is better than an FX CPU in a chart where the Pentium is overclocked and all the FX chips are downclocked? Just look at the CPU frequency chart: a quad-core FX takes a hit of about 70% (or more) when downclocked by only 33%; something is totally wrong there. It's obvious FX chips like very high frequencies; their architecture is made for that![/citation]

^ Look at this guy :)

May I suggest you have your eyewear checked? Maybe removing those tears from your eyes will help you see more clearly. The accusations you just made are completely unfounded.
 
[citation][nom]EzioAs[/nom]Interesting. The less-than-$100-without-external-power-connector Radeon 7750 is a balance card providing appealing visual while still runs good framerates at 1080p. I imagined if you tinker with the Best Appearance preset a little bit you can get better image quality without framerates dropping below 30. I mean let's face it, who plays on their PC without tinkering the settings here and there, that's just stupid. Great review as always! Really appreciate it[/citation]

I have tested it on a 945BE @ 3.8GHz + a single HD 5670 on medium detail @ 45-60FPS.
 
[citation][nom]His Nibs[/nom]How does it perform with a pair of HD 6950's in Crossfire?[/citation]

I had a pair of 6850's in crossfire and it would do 1080p on High details @ 50-60FPS. A pair of 6950's would be more than enough GPU.

A fast quad core CPU will also make a big difference.
 
[citation][nom]stingstang[/nom]I'm disappointed that this neglects the post processing bar when determining if the best appearance setting is enabled when taking in to account processing ability. I have an fx4100 and an hd 7950. How will that do at high grahhics settings?Duh, we want to know this stuff.[/citation]

Currently playing on an X6 1090T @ 4.0 GHz + a single 7950 at stock clocks. The 12.8 drivers are flaky, though. With V-Sync on at the highest details, gameplay averages 50-60 FPS, and 80%+ of the time it stays at 60 FPS, locked to V-Sync. It only dips into the 50s in large towns and merchant areas with other players running around. HUGE events like WvW battles or big boss fights with 20-30 players on screen and a lot of particle effects/fighting going on will dip into the 40s occasionally, but it's still MUCH better than beta.
 

williewonka

Honorable
Aug 29, 2012
3
0
10,510
[citation][nom]alextheblue[/nom]I think they'll show up in the big Angry Birds benchmark. This isn't a casual game for a casual gamer. This is a demanding modern title. I don't mean to sound elitist (I play games on consoles and phones too) but seriously, those chips are NOT meant to run a demanding game without assistance from a discrete GPU. You're asking for too much. I know a lot of laptops use them, but you should research what you're buying if you play games.If you want a budget laptop that can play games a little, you'd better invest in something with either a very low price discrete card (I know you can configure some HP laptops with a 1GB 6670M for $50) or a A8/A10 Trinity. Or both, and use dual graphics in games that have a working crossfire profile.[/citation]

I am just asking for some benchmarks; I know what to expect on the performance scale. I played the beta with an HD 3000 and had no problems playing it. I am just wondering if they made any improvements in the release version. Don't take this game so seriously, and stop thinking everything you play is only for a small group.

Not going to buy another laptop for just one game. This one fits all my needs just fine, and I can play a lot of great games on it. If Guild Wars 2 isn't one of them, then tough luck.
 

fulle

Distinguished
May 31, 2008
968
0
19,010
Updates to GW2 and to AMD drivers have improved performance on my system by a pretty noticeable amount. In my case, FPS has increased by over 20% if I compare my day-one FPS in GW2 to my current FPS. It could be an even larger difference, but I play with V-Sync, and that little bump put me at a steady 60 FPS.

Anyway, the point is that performance has improved a lot since day one, so the numbers presented here aren't going to be very accurate. I saw significant gains on an AMD GPU, but certain Nvidia GPUs were more problematic as far as drivers go.

So, what good are these numbers now? The GPU hierarchy comparison is greatly compromised. The benchmarks also don't reflect real-world performance. The CPU benchmarks lack common processors like the Phenom II and Core 2 Duo.

Unfortunately, this article is in serious need of a LOT of updates. In its current form, it gives a pretty blurry picture.

 

AMD put out GW2 drivers?
 