Battlefield 3 Performance: 30+ Graphics Cards, Benchmarked


nosleep

Distinguished
Nov 12, 2006
3
0
18,510
Sadly, this is not a good scene to gauge performance as it is not very demanding.

With that said, here are my results with a 4890 CrossFire setup using the same scene and time frames:

1680x900 High : 68 fps average
1680x900 Low : 97 fps average
1920x1080 High : 68 fps average
1920x1080 Low : 92 fps average

Now, I say the benchmark scene isn't a good gauge of performance because my fps can drop to 45 in other scenes. Specifically, there's a scene in mission one, when ascending the stairs and sprinting into the small room right before you reach the roof with the sniper. I'm not sure what makes it so demanding, but my frame rate drops like a rock every time.
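
For what it's worth, averages like these can hide exactly that kind of drop. Below is a rough sketch (in Python) of how average and worst-frame FPS could be pulled out of a frametimes log from a capture tool such as FRAPS; the two-column "Frame, Time (ms)" layout with cumulative milliseconds, the fps_stats helper, and the frametimes.csv path are all assumptions, so adjust them for whatever your capture tool actually writes.

[code]
import csv

def fps_stats(path):
    # Average and worst-frame FPS from a frametimes CSV
    # (assumed columns: "Frame", "Time (ms)" with cumulative milliseconds).
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        times_ms = [float(r[1]) for r in rows if len(r) > 1]

    # Per-frame durations are the gaps between consecutive timestamps.
    frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]

    avg_fps = 1000.0 * len(frame_ms) / (times_ms[-1] - times_ms[0])
    worst_fps = 1000.0 / max(frame_ms)  # the single slowest frame
    return avg_fps, worst_fps

avg, worst = fps_stats("frametimes.csv")  # hypothetical log path
print(f"average: {avg:.1f} fps, worst frame: {worst:.1f} fps")
[/code]

The gap between those two numbers is what shows up as the drops I'm describing.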

Regardless, great article.
 

Ephebus

Distinguished
Apr 14, 2008
61
0
18,630
While ATI's driver coders are at it, they might as well finally IMPLEMENT proper OpenGL support in their drivers, instead of the sluggish and buggy emulation layer we've had to put up with for years.
 

call3z

Distinguished
Oct 29, 2011
1
0
18,510
You say that single-player doesn't care what processor you have. Is it the same in multiplayer? Or will, let's say, an i5-760 handle it with GTX 580 SLI?

Please respond!
 

nebun

Distinguished
Oct 20, 2008
2,840
0
20,810
[citation][nom]upgrade_1977[/nom]Not quite right. How ports work: games ported from console ---> PC = looks bad, plays bad. Games ported from PC ---> console = looks great on both PC and console, better than most games on console. BF3 was designed on PC first and then ported to console; that's why it looks better than any game I've ever seen, and it runs very well for the amount of visuals it has. I really can't believe it runs as good as it looks. It's very well optimized; my friend has it running (on low settings) on an old Dell XPS quad core with dual 9800 GTs in SLI, and it runs decent.[/citation]
If the game was designed to take advantage of most high-end PCs, then why does it still look and feel like a console game? Trust me, it was designed with consoles in mind... crappy consoles, if you ask me.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795


An i5-760 is a little unbalanced next to a pair of GTX 580s, but that config should be plenty capable of yielding good performance.
 

bigoldbrute

Distinguished
Oct 29, 2011
5
0
18,510
Nice pile of graphics cards :).
Btw, those post effects make the picture hard to enjoy. What's wrong with those developer guys, turning such a feature on by default!?
 



The best I can do is show you some old benchmarks of mine from Unreal Tournament 3, from when I went from the 6400+ to the Phenom II X3 710. IT WAS A HUGE UPGRADE. My FPS with 2 x HD 3870 more than doubled. I had no idea that my CPU bottleneck was so severe at the time: http://forums.epicgames.com/threads/680572-HOC-UT3-Benchmark-utility The very first post shows my 1280x720 comparison of both CPUs, and I also threw in the 1920x1080 individual result with the X3 710 Phenom II. The result was higher at 1080p with the new (X3 710) CPU than 720p was with the old (6400+) CPU, lol. So IMO a CPU upgrade is needed first. BTW, UT3 is very reliant on the CPU, so this should give you a pretty good representation of what you were looking for.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
Chris! This is what I found in all three game modes today with an overclocked 9600GT.

No problem with shadows, and I get between 30 and 55 fps.

I play at 1024x768, however, so I could crank up the detail a bit:

Settings
Textures: High
Everything else: Medium
Motion Blur: Off
Ambient occlusion: Off
MSAA: OFF
FXAA: High
Anisotropic filtering: 16x
VSYNC: ON

In the Nvidia control panel, I turned Triple Buffering on and left the rest set to "Application settings".

Cheers!
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]Breen Whitman[/nom]The 5970 beats the 6970 hand over fist. Is this a labeling error?[/citation]

Nope, it's AMD's confusing naming scheme. The 5970 is a significantly faster card.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]ojas[/nom]Chris! This is what I found in all three game modes today with an overclocked 9600GT. No problem with shadows, and I get between 30 and 55 fps. I play at 1024x768, however, so I could crank up the detail a bit. Settings: Textures: High, Everything else: Medium, Motion Blur: Off, Ambient occlusion: Off, MSAA: Off, FXAA: High, Anisotropic filtering: 16x, VSYNC: On. In the Nvidia control panel, turned Triple Buffering on, rest set to "Application settings". Cheers![/citation]

None of that blockiness? Wonder if it has something to do with the High preset and a DX 10-based card, specifically?
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]cangelini[/nom]None of that blockiness? Wonder if it has something to do with the High preset and a DX 10-based card, specifically?[/citation]

Or the low preset :p The only things I had on high were textures and FXAA.

Nope, no blockiness; it looked pretty good considering what was running it. Like Crysis 2 in DX9 on the high preset.

I used custom settings. If I remember correctly, you used the default auto settings? That sets textures to medium and everything else to low... which is what you tested at?

You mentioned shadows causing problems. Maybe turning the shadows up a notch would help? I know partially applied ambient occlusion causes odd shadows at times (the last I tried was with NFS Shift), so maybe turning that off helps? (I think its low-quality setting was turned on with the "low" preset.)

I was actually surprised when I read your results; I didn't have any problems in the beta either. I think I used the same settings then too, either these or the medium preset. High was sort of killing my card :D
 

fausto

Distinguished
Jan 26, 2005
232
0
18,680



Custom settings minus MSAA is not Ultra... that's custom. MSAA is expensive to run in this game. Without it, you can't claim to run Ultra.



What are people using OpenGL for these days? I can't tell you the last OpenGL game I bought.
 

tychot

Distinguished
May 21, 2009
4
0
18,510
[citation][nom]fausto[/nom]what are people using opengl for these days? i can't tell you the last opengl game i bought.[/citation]

id Tech 5 and Rage.
 

tychot

Distinguished
May 21, 2009
4
0
18,510
I would be interested to know a little more about CPU performance. For instance, what CPU does it take to power dual 6990s running Ultra at 2560x1600 in a 64-player game? Or how does the i3 handle 64-player games?

The reason I'm interested in CPUs is that my mate's looking to get CrossFire 6970s to run high quality on his 2560x1440 screen in 64-player games, and we need to find out if his current Phenom II X6 1090T is going to cut it.

Overall, this has been an amazingly informative benchmark.
 

fausto

Distinguished
Jan 26, 2005
232
0
18,680
[citation][nom]tychot[/nom]I would be interested to know a little more about CPU performance. For instance, what CPU does it take to power dual 6990s running Ultra at 2560x1600 in a 64-player game, or how does the i3 handle 64-player games? The reason I'm interested in CPUs is that my mate's looking to get CrossFire 6970s to run high quality on his 2560x1440 screen in 64-player games, and we need to find out if his current Phenom II X6 1090T is going to cut it. Overall, this has been an amazingly informative benchmark.[/citation]


Informative about the single-player... I put zero minutes into that, and only because I couldn't get into the multiplayer at first :)
 

schkorpio

Distinguished
Jun 8, 2007
7
0
18,510
I think it would have been a better idea to benchmark a more realistic representation of the game, rather than walking through a corridor 99% of the time. How often does that happen in BF3 multiplayer? Heck all, hence the name battleFIELD.

This smacks of Tom's Hardware being paid to make video cards look better than they are, to generate sales for their site sponsors.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
BTW, I noticed 800MB of VRAM usage, even at my medium DX10 settings. How much VRAM did the game use at high/ultra DX11? Will >1GB cards finally make sense now?
 

nerrawg

Distinguished
Aug 22, 2008
500
0
18,990
Great article, Chris. The fact that you guys bother to go into such depth to test a whole range of configs, and then manage to get this article out in a timely fashion as well, is definitely what puts Tom's a notch above a lot of other hardware sites!
 


Guest

Guest
Your test systems must suck. I get well over 70 FPS at Ultra quality at 1080p with an ATI Radeon HD 6970.
 