Far Cry 3 Performance, Benchmarked

Page 5
Status
Not open for further replies.

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
army_ant7

I remember a rumour once that, due to L1 cache misses, Bulldozer kept going back to main memory. The L1 cache is stupidly fast, but ineffective. Steamroller will increase the amount of L1I cache, but I don't think there are any plans to increase the L1D cache, which is only 16KB per core (the L1I is what's shared between both cores within a module). Piledriver did improve on L2 latency, however.
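The "stupidly fast, but ineffective" point can be illustrated with the standard average memory access time (AMAT) formula; a minimal sketch, with purely illustrative latencies and miss rates (not measured Bulldozer figures):

```python
def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time, in cycles: every access pays the
    hit time, and the fraction that misses also pays the penalty."""
    return hit_time + miss_rate * miss_penalty

# Illustrative numbers only: a fast L1 that misses often can be slower
# on average than a slightly slower L1 that actually catches accesses.
fast_but_leaky = amat(hit_time=4, miss_rate=0.10, miss_penalty=200)
slow_but_effective = amat(hit_time=5, miss_rate=0.02, miss_penalty=200)
print(fast_but_leaky)       # 24.0
print(slow_but_effective)   # 9.0
```

The miss penalty dominates, which is why a small, low-associativity L1D can hurt even when its raw latency is excellent.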

Steamroller should significantly help with sorting out the poor gaming performance, not that it's particularly poor here.

king smp

I get what you're saying and, to be honest, I'm not really sure why the i7-3960X is overclocked. I'd be more tempted to overclock the lower processors, though the 955 is unlikely to be one of them considering its age. I'd have thrown the 980 in here in its place, however.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]tourist[/nom]WRONG WRONG WRONG a6 is a 4 core cpu with 6530d graphics and it does support dual graphic with the 6570 6670[/citation]
The A6 has 2- and 3-core variants as well; maybe that's his source of confusion.
 

dscudella

Honorable
Sep 10, 2012
892
0
11,060


A6-3500 HD 6530D Tri Core
A6-3600 HD 6530D Quad Core
A6-3650 HD 6530D Quad Core
A6-3670K HD 6530D Quad Core
A6-5400K HD 7540D Dual Core

As you can see, EVERYONE is correct. Depending on the A6 model, you can get a dual-, tri-, or quad-core.

The Radeon models that can be CrossFired with the A6 are the same as for the A8 & A10: the HD 6450/6550/6570/6670.
 

dscudella

Honorable
Sep 10, 2012
892
0
11,060


Alright Tourist, this is going to be my last reply to you. Drop it. That was a discussion that turned ugly. It's over. You're attempting to carry this over into a different day and I will have no part of it. The point was that an A6 alone could not play modern games at decent framerates and that point was proven. If you Crossfire'd the A6 with a 6670 then yes, you would see playable framerates. You didn't even attempt to make that point. Which if you did, I would have given it to you.

It's over, the conversation is dead. My offer was not a farce and I meant it. I've been wrong before and I cop to it. I don't hide behind vulgar language or obnoxious behavior, I simply admit I was wrong.

Enjoy the rest of your day and weekend.
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
About the L1 cache misses... Wow! Talk about the "not so obvious." Thank you for sharing that; I never thought of that as a possible way memory bandwidth could be compromised, and based on my understanding of the matter, it sounds reasonable (did I understand you correctly? I mean, was this in fact one of your points?), aside from, of course, the added latency from the L1 cache misses themselves. :)

About the i7-3960X being overclocked, I believe they always do that to try to ensure that the CPU would not, or would at least barely, serve as a bottleneck. :)

Come on now, tourist... Don't mind the previous posts anymore. I think the debate about whether that build could game is productive, but to help your points, you could present benchmarks or form logical comparisons with (i.e. relate) data from the article or elsewhere that may prove or disprove whether it would game. Just a suggestion of mine... :)

Addition: And don't play the "troll accusation" card quite yet... I sincerely don't feel that dscudella is trolling. If ever you feel that though, you could provide data that would disprove him/her. :)
 

taiso

Distinguished
Aug 7, 2008
48
0
18,530
I'm surprised no one has mentioned this in the article or in the comments so far, but the thing that really seems to affect performance is the POST FX setting. You can achieve good frame rates with everything else set to Ultra on a 6950 and a Phenom II @ 3.2 GHz if you simply lower this one item. Personally, I don't find that it makes a big difference in image quality, but with everything set to Ultra with 4x AA and POST FX lowered, I get around 45-60 FPS depending on different firefights and situations; it's very smooth overall. Once I change POST FX back to anything higher, the FPS takes a plunge. Just a thought. Oh, and the new beta AMD driver + CAP also seemed to help the frame rate compared to when I first installed the game.
 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
[citation][nom]army_ant7[/nom]About the L1 cache misses... Wow! Talk about the "not so obvious." Thank you for sharing that, because I never thought of that possible way memory bandwidth could be compromised in such a manner and based on my understanding of the matter, it sounds reasonable (Did I understand you correctly? I mean, was this in fact one of your points?) aside from of course the added latency from L1 cache misses. About the i7-3960X being overclocked, I believe they always do that to try to ensure that the CPU would not or at least barely serve as a bottleneck.[/citation]

The L1D cache is far too small and needs more associativity, at least according to various opinions that I've seen. A fast L1 is only good if it's accomplishing something. From what I've read here, there's a lot of cache writing going on. However, fixing Bulldozer this way would look like a complete redesign.

For a second, let's say that FC3 is limited to four cores (logical or otherwise). HT would be a major disadvantage here, but it seems the Intel CPUs are able to hit the GPU limit. Judging by the i3-2100's performance, overclocking the 3960X here is meaningless. Dropping to low details - or overclocking the graphics card - would alleviate the GPU bottleneck (you'd hope, considering this is a 7970), and as such we could potentially see a situation where the i3's results only improve a little and the i5 takes the lead thanks to its four physical cores, versus the i7's HT setup, which would make more sense to disable, especially considering the software can see 12 cores but will only work with four, HT or no HT. Even overclocking heavily wouldn't bring the 3960X on par with the i5-3550. So the 3550 and an HT-less 3960X would be far in front of the i3, which barely leads the 8350.
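The GPU-limit reasoning above can be sketched as a simple min() model; the frame-rate caps below are hypothetical, chosen only to show why overclocking the CPU stops mattering once every CPU clears the GPU's ceiling:

```python
def observed_fps(cpu_cap, gpu_cap):
    """Whichever component is slower sets the measured frame rate."""
    return min(cpu_cap, gpu_cap)

GPU_CAP = 75  # hypothetical ceiling imposed by the graphics card

# Hypothetical per-CPU caps: once each exceeds GPU_CAP, the benchmark
# shows them all as identical, hiding the CPU differences.
for name, cpu_cap in [("i3", 80), ("i5", 120), ("i7 OC", 150)]:
    print(name, observed_fps(cpu_cap, GPU_CAP))
```

Dropping to low details effectively raises `gpu_cap`, at which point the hypothetical i5 and i7 would separate from the i3 again.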

Any chance of a low details test please, Toms? :)
 

mohit9206

Distinguished




I was specifically talking about the Trinity A6 APU, not the now-defunct Llano variants.
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
Dang... I've forgotten what associativity (and n-way) means for cache. I'll need to read up on it again. Thanks for sharing that link and info, though. :)
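As a refresher on associativity, a rough sketch: an address maps to one cache set, and an n-way set-associative cache can hold n different lines in that set before a conflict forces an eviction. The geometry below is hypothetical (a 16KB cache with 64-byte lines and 4 ways), not any specific CPU's:

```python
def cache_set_index(address, line_size, num_sets):
    """Which set an address lands in: strip the within-line offset,
    then take the result modulo the number of sets."""
    return (address // line_size) % num_sets

# Hypothetical 16KB, 4-way cache with 64-byte lines:
# 16384 / 64 = 256 lines total, 256 / 4 ways = 64 sets.
NUM_SETS = 64

# Two addresses 4KB apart land in the same set...
print(cache_set_index(0x0000, 64, NUM_SETS))  # 0
print(cache_set_index(0x1000, 64, NUM_SETS))  # 0
# ...so a direct-mapped (1-way) cache would evict one to hold the
# other, while a 4-way cache can keep up to four such lines at once.
```

More ways means fewer conflict misses for the same total size, at the cost of a wider, slower lookup.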

Here's the thing with overclocking the i7-3960X... They wouldn't have known, going into benchmarking the graphics cards, whether there would be a CPU bottleneck, so it's pretty reasonable that they took extra precautions to ensure the CPU wouldn't interfere with the results (much). :) Also, I didn't pay particular attention to whether the i7-3960X was still overclocked for the CPU test (if it was, it saved them from retesting at stock clocks, hehe...), but they might have tested CPUs from "top to bottom," in which case they wouldn't have known that the i3 was enough to reach the graphics bottleneck. Even if they started from "bottom to top," it was still worth testing, as anomalies can sometimes be discovered this way. At least this is what I think... :)

I would've liked to have seen the CPU test done on low details as well, as I've expressed in a previous post.

Also, BTW, what makes you guys suspect that it's limited to (effectively) using just up to 4 CPU threads?
 
G

Guest

Guest
Ah, good to know that my FX8350 is slow, but thankfully the Far Cry people have screwed up their game to where it's somehow competitive with the i3, which apparently should always win every benchmark no matter what... Somehow I hadn't figured out that it was slow yet, don't know how I missed that one.

Obviously, any PC built to run Far Cry 3 wouldn't be used for anything else either. For that matter, even though nearly all games are GPU-limited, we should still pretend that it matters if the Intel CPU gets 110 FPS and the AMD CPU only 100 FPS. Let's also pretend that TDP = constant power draw, and that idle and average load power use aren't similar.
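The power-draw point can be made concrete with back-of-the-envelope arithmetic; the wattage gap, play time, and electricity rate below are hypothetical, chosen only to show the scale involved:

```python
def yearly_cost(avg_watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost of a given average power draw."""
    return avg_watts / 1000 * hours_per_day * 365 * price_per_kwh

# Hypothetical: a 60 W average gaming-load gap, 3 h/day, $0.12/kWh.
gap_cost = yearly_cost(60, 3, 0.12)
print(round(gap_cost, 2))  # 7.88
```

Even a fairly large load-power gap amounts to only a few dollars a year under these assumptions, which is the commenter's point about TDP headlines versus real usage.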

Maybe it's because I use Linux and open-source software compiled with GCC instead of Intel's compiler:

http://openbenchmarking.org/result/1210227-RA-AMDFX835085
 
G

Guest

Guest
Really, Tom's, where are the GTX 680/690 tests with 30" 2560x1600 display graphs? Some of us game at the top level.
 
G

Guest

Guest
Processor performance? WOW, bullshit. This is a GPU test, not a CPU test. It's impossible for an i5 at 3.3 GHz to be faster than an i7 at 4.25 GHz.
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
[citation][nom]BigMack70[/nom]You must be able to spit a longggggggggggg way... because AMD's best chip can't match one of Intel's budget offerings. The fact that you would have to overclock AMD's best CPU in order to match a $100-110 offering from Intel says it all. There's no reason AT ALL to buy an AMD chip for a gaming rig unless you're a fan of AMD/hate Intel. Worse performance + significantly higher power draw. There's still some reasons to buy an AMD CPU for multipurpose or non-gaming rigs, but this article is about gaming performance, and for gaming performance, AMD is a no go on the CPU side of things.[/citation]


I couldn't help but comment on such an idiotic post. AMD is a no-go? So you say... What article are you reading? Because what I see is a dual-core chip besting both AMD's AND Intel's flagship chips. This tells me the game makes no use of CPU power beyond 4 cores (whether those are actual cores or virtual via Hyper-Threading). To say that AMD is a no-go is a rather enormous stretch, since the 8350 IS within spitting distance of the i7, which also got beaten by the i3 in these benchmarks.

Also, it should be noted that ALL 3 CPUs delivered more-than-playable frame rates in this game, so again, saying AMD is a no-go just sounds like pure fanboyism on your part, as the CPU quite clearly handles the game at more than decent frame rates.

... freaking fanboys ...
 

mikenygmail

Distinguished
Aug 29, 2009
362
0
18,780
[citation][nom]sugetsu[/nom]"The good news for folks with Piledriver-based processors is that the FX-8350 is nearly as quick as Intel's Core i3-2100 (never mind the fact that the Core i3 costs $90 less)."My God... Are the reviewers of this website paid to make AMD look bad? Any person with a minimum hint of common sense can clearly see that there is virtually no difference between FX 8350, the i3, the i5 and i7. This is a big disservice to the community.[/citation]

Yes, they clearly are.
 

eklipz330

Distinguished
Jul 7, 2008
3,034
19
20,795
So let me get this straight... Crysis is said to be highly unoptimized.

This game looks almost as good as, or as good as, Crysis,

but runs a lot worse? We're taking steps backwards, it seems.
 

dozerman

Honorable
Nov 14, 2012
94
0
10,630
[citation][nom]eklipz330[/nom]so let me get this straight... crysis is said to be highly unoptimized.this game looks almost or as good as crysisbut runs a lot worse? we're taking steps backwards it seems.[/citation]

Yeah. It's like the whole PC crowd complained to the game industry that games weren't pushing graphics hardware anymore, and they responded by making less-optimized code as opposed to using the GPU power to draw better graphics.
 

dozerman

Honorable
Nov 14, 2012
94
0
10,630
It really is pathetic that I get better multithreaded support when playing backyard monsters on FaceBook than a big-name title like FarCry 3. WTF? Seriously, I push all 8 threads of my 8150 when playing backyard monsters, but Far Cry 3 only uses two or three. That's pathetic.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]eklipz330[/nom]so let me get this straight... crysis is said to be highly unoptimized.this game looks almost or as good as crysisbut runs a lot worse? we're taking steps backwards it seems.[/citation]
Well, it's a variation of the same Crysis engine... and BTW, what this game does on Ultra, Crysis probably did on "very high".

[citation][nom]dozerman[/nom]It really is pathetic that I get better multithreaded support when playing backyard monsters on FaceBook than a big-name title like FarCry 3. WTF? Seriously, I push all 8 threads of my 8150 when playing backyard monsters, but Far Cry 3 only uses two or three. That's pathetic.[/citation]
FC3 uses 4, at least.

Backyard Monsters doesn't seem to be an appropriate example. It's probably a Flash/Java game running in a browser, so that's already more than one process, and who knows which process is using how many threads? Also, does Backyard Monsters use your GPU?
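On "who knows how many threads": for your own Python process you can at least count threads with the standard library; a minimal sketch (inspecting another process, such as a browser, needs an OS tool like Task Manager or a third-party library such as psutil):

```python
import threading

stop = threading.Event()

def worker():
    # Stand-in for game/engine work; blocks until told to finish.
    stop.wait()

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()

# Main thread + 3 workers (the runtime may add housekeeping threads).
print(threading.active_count())

stop.set()
for t in threads:
    t.join()
```

Note that a high thread count says nothing about how busy those threads are; a game can spawn eight threads and still saturate only two or three cores.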
 
G

Guest

Guest

:eek:
I can't explain just how wrong that is. You're comparing 3 different CPUs with 2 different L3 caches (actually, 2 of them have none) and architectures, and saying that throwing a discrete card in makes them the same?
It doesn't work that way.

OK, I get it; you're happy with what you got for the money you spent. Good for you.

But to come into a discussion and announce, "hey guys, I got game!" when most have spent more $$ on a graphics card than you have on your whole system, well, don't be surprised to get negative responses... even if it is "ignorant" epeen waving.

But I will look you in the eye and tell you that you have a glorified HTPC that can get away with some gaming at low settings and resolutions. That A6 is a fantastic iGPU with a mediocre CPU. Truth be told, throw in a discrete graphics card, since you mentioned it, and even a Celeron will spank that A6:
Build It: Picking Parts For Your Kid's Entry-Level Gaming PC


Now, I highly suggest that if you care to continue the discussion, you take it off this thread and open your own thread in the forums.

I do like your tenacity :)
 
G

Guest

Guest
Nice buggy, except that the spare tire is completely out of scale with the actual tires on it....

I wonder what other technical inconsistencies there are?

No thanks...
 
G

Guest

Guest
Looniam: I can't explain just how wrong that is. You took the only CPU-limited game on the planet as the basis for claiming Intel is competitive with AMD on budget IGP gaming? Why not include any 5 other games on the planet to help balance out that graph?
 