Far Cry 3 Performance, Benchmarked

Page 6 - Tom's Hardware community discussion.
Status
Not open for further replies.

pxl9190

Distinguished
Jan 27, 2011
21
0
18,510
I wish the author of this article had tested a 670 in SLI, and a 680 / 680s in SLI. This article is NOT AT ALL indicative of how Ultra at 1080p or 2560x1600 would look using various configs.

Indeed, I had hoped to use Far Cry 3 results at 2560x1600 to decide whether I would get a single 670 or 2 x 670 in SLI. The 1080p Ultra results seem to suggest that at 2560x1600 on Ultra, even 670 SLI is not enough!!
 

cleeve

Illustrious
[citation][nom]don_small_a_cock_si[/nom]Ah, good to know that my FX8350 is slow... Somehow I hadn't figured out that it was slow yet, don't know how I missed that one.[/citation]

Why do you think the 8350 is slow? It gets almost the same result as the Core i5 and Core i7. Look at the charts.

By the way, I love the username. It's flattering to meet someone who is obsessed with my genitalia to the point of designing their alias around it. :D
 

chaosbyte

Honorable
Dec 16, 2012
1
0
10,510
i3 2100 user here, and yeah, it maxes out all 4 threads in this game if it so requires (paired with a GTX 660 OC; game settings are tweaked, mostly at high). And HT not used in games? Not true; I tested with HT on and off and there's an increase in performance (not as fast as a real quad, obviously), and it sometimes depends on how the game is coded.

This bench, however, shows the i3 struggling against the quads, maybe because it was tested on Ultra settings vs. the medium settings used by Tom's.
http://gamegpu.ru/images/stories/Test_GPU/Action/Far%20Cry%203%20v.%201.0.2/fc3%20proz.png
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
@looniam
Even though I have an idea why people thumbed down intel_pr_alert, there is something somewhat related to his/her main point. Whether or not SC2 is generally CPU-limited is irrelevant because the chart you showed was with a discrete GPU, but I remember reading in a TH article before that SC2 does favor a (good) L3 cache or something.

What I'm just getting at is, knowing this, the gap between the Celerons and the APU's might be of the more "isolated-incident" nature (because the latter ones don't have L3 caches). Just sharing my thoughts... :)

Addendum:
I also think how tourist compared the A6-3670K to the Phenom and Athlon IIs was alright. Feel free to correct me, but I vaguely recall a lot of games not caring for L3 cache. (I remember a time when an Athlon II X3 was a recommended budget gaming CPU.) Also, the Llanos have twice the L2 cache (whatever that's worth), and I forget if the Husky cores exhibited very mild performance (per clock) improvements. The point, though, is that they may be generally comparable to quad-core Athlon and Phenom IIs. I say "may" because I have no benchmarks right now to prove so, and I don't want to make claims based on old memories, and I'm not willing to go and look for some (sorry). :)
 

Phyrexiancure

Distinguished
Mar 28, 2011
316
0
18,810
[citation][nom]army_ant7[/nom]@looniamEven though I have an idea why people thumbed down intel_pr_alert, there is something somewhat related to his/her main point. Whether or not SC2 is generally CPU-limited is irrelevant because the chart you showed was with a discrete GPU, but I remember reading in a TH article before that SC2 does favor an (good) L3 cache or something.What I'm just getting at is, knowing this, the gap between the Celerons and the APU's might be of the more "isolated-incident" nature (because the latter ones don't have L3 caches). Just sharing my thoughts... Addendum:I also think how tourist compared the A6-3670K to Phenom and Athlon II's was alright. Feel free to correct me, but I vaguely recall a lot of games not caring for L3 cache. (I remember a time when an Athlon II X3 was a recommended budget gaming CPU.) Also the Llano's have twice the L2 cache (whatever that's worth) and I forgot if the Husky cores exhibited very mild performance (per clock) improvements. The point is though that they may be generally comparable to quad-core Athlon and Phenom II's. I say "may" because I have no benchmarks right not to prove so and I don't want to make claims based on old memories, and I'm not willing to go and look for some (sorry).[/citation]

L3 cache doesn't always matter, but in many games there can be as much as a 20% difference in performance between comparable CPUs (Athlon II vs. Phenom II). From my experience, the games that depend more on L3 cache are AI-heavy games like RTSes and Left 4 Dead, and also MMOs.
 
G

Guest

Guest
@army_ant7
Sometimes I get a kick out of people crying/complaining "that is an unfair benchmark!" It's not like I scoured the internet to find any skewed data; I had that specific article in mind while watching the "debate." It's a good read and is complimentary of the Llano.

But the bottom line is:

SC2 is a game, no? And the claim was made that an A6 would game, and game better than Intel, since Intel had no configuration at the same price point. The former could be debated based on favoritism, and the latter is entirely incorrect, especially concerning Intel motherboards being more expensive. Features dictate a motherboard's price, not platform (unless you account for an "old" system).

To address any L3 cache "issues": well, how can anyone call their rig a "gamer" when it can play these games but not those games?

They can't.

Now, as I suggested before, let's drop the off-topic discussion unless someone cares to open a thread in the forums. Yeah, now THAT is a challenge . . . :lol:
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
@looniam
Well, if you put it that way (trying to disprove someone's claim that something is absolutely better than another thing), then I don't see anything wrong with what you did. Not that I was saying you were wrong before, but I was looking at the matter in a more general manner, since, as you probably know, even high-end systems can be "impaired" in performance depending on the application (and its settings). Even a high-end AMD graphics-equipped system may be impaired when PhysX effects are turned on. (Actually, I'm not sure if a huge performance drop is still the case with a system with the latest high-end Intel CPU and AMD GPU.) But anyway, it's a little unfair to compare an Nvidia-equipped system to that one, just using one game with PhysX effects on as the test. Again though, just in the context of general gaming capabilities. :)

Also, I think tourist was pointing out that you can't create a comparable (gaming-wise) Intel system, if you're absolutely limited to that budget (range) he/she mentioned. Though, that's an "ideal" situation, I guess tourist was just proving a point that AMD can still cater to some people. :)

Yeah, I guess it would be best to put this in another thread, but hey, sometimes you can't help but deviate from the main topic... Hehe... :p Was just helping in setting things straight since the topic's already out there. :)
 

williehmmm

Distinguished
Apr 7, 2010
33
0
18,530
Cleeve - if you're still about, could you have a little look at this.

The FX 4170 at 4.2GHz is a 16% overclock of an FX 4100 at 3.6GHz, but the same chip.

The FX 4170 delivers a 44% frame rate increase over the Phenom II 955, yet the 955 is a tier higher on the CPU Hierarchy chart than the FX 4100.

By my reckoning, the FX 4100 would deliver something like 28% higher frame rates than the 955.

As I've said from day 1, the FX 4100 is in the wrong tier. The FX 4170 is 2 tiers higher and the only difference is a 16% overclock.

The i3 2100 is 3 tiers higher, but previous more comprehensive benchmarks show an average of 10% higher frame rates.

My own first-hand experience saw a Core 2 Duo E8500 (same tier as the FX 4100) replaced by an FX 4100 produce a 50% frame rate jump in Dirt 3 (GTX 470, 4GB DDR3, 1280x1024): 40 fps for the E8500, 60 fps for the FX 4100.

Yet if I based my upgrade route on the CPU hierarchy chart, I would have expected no significant increase and avoided that CPU.

I appreciate this is 1 game and 1 configuration. The FX 4170 seems to be in the right place, but should the same chip with 16% less performance really be 2 tiers lower?

Is it really worth upgrading from an FX 4100 to a 3 tier higher i3 2100?

Is 10% average performance really an upgrade or is it somewhat parallel?

http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-10.html
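A quick sanity check on the arithmetic in the post above (a sketch that assumes frame rates scale perfectly linearly with clock speed, which real chips rarely achieve): subtracting 16% from 44% suggests 28%, but taking the ratio puts the implied FX 4100 advantage closer to 24%. Either way, the conclusion holds, since the FX 4100 would still land well above the 955.

```python
# Back-of-the-envelope check of the FX 4100 vs. Phenom II 955 gap implied
# by the figures above, assuming linear scaling with clock speed.

fx4170_vs_955 = 1.44    # FX 4170 delivers 44% more fps than the 955 (from the article)
fx4170_vs_4100 = 1.16   # FX 4170 is a 16% overclock of the FX 4100

# Implied FX 4100 performance relative to the 955:
fx4100_vs_955 = fx4170_vs_955 / fx4170_vs_4100
print(f"{(fx4100_vs_955 - 1) * 100:.0f}%")  # ~24%
```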
 

cleeve

Illustrious



The hierarchy chart will never be accurate at the level of a specific game. It's based on averages.

Having said that, I'm about to put together another sub-$100 gaming CPU roundup and I'll have a closer look. :)
 

williehmmm

Distinguished
Apr 7, 2010
33
0
18,530
Cleeve - your attention is appreciated sir.

My FX purchase was back when motherboard prices made the AMD solution cheaper, and before the reviews were out; otherwise I never would have. I just never thought it was as big a dog as it was made out to be, just not quite as good as the Intel offering.

Looking forward to your sub-$100 roundup.

Thanks.

 

doubletake

Honorable
Sep 30, 2012
1,269
1
11,960
[citation][nom]sayantan[/nom]This game can be really demanding on CPU depending upon the environment. In a firefight that involves flame throwers and explosions along with some AIs , you can see the framerates drop from 60 to 40 in no time. Also I would like to mention that game stutters like hell with anything below 60 fps . Even 57 -58 fps is unplayable and gives me headache. So it is essential to tweak the settings such that the fps is above 60 most of the time. The good thing is if you have a decent system you can maintain 60fps without loosing too much visual fiedelity. I can run the game at 0x AA @1080p with all other details maxed out using OCed 7970(1060,1575) and 2500k(4.0Ghz).[/citation]

I found that this stuttering is due to the pre-rendered frames setting you choose. When I set it to 0 frames, I get that terrible stuttering across the whole 40-60 fps range. Setting it to at least 1 frame eliminates the stuttering, and mouse input is greatly improved.
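The effect described above can be illustrated with a toy pipelining model (a sketch with made-up numbers; real render queues are more complicated). With zero pre-rendered frames, CPU and GPU work serialize, so CPU-time spikes add directly to the frame interval; with one frame of render-ahead, the GPU can render frame N while the CPU prepares frame N+1, hiding any CPU spike that still fits under the GPU's render time.

```python
# Toy model of frame pacing with and without one frame of render-ahead.
c = [6, 6, 14, 6, 6, 14, 6, 6]   # CPU prep time per frame (ms), with spikes
g = [10] * 8                      # GPU render time per frame (ms), steady

serial = [ci + gi for ci, gi in zip(c, g)]          # 0 pre-rendered frames: no overlap
pipelined = [max(ci, gi) for ci, gi in zip(c, g)]   # 1 pre-rendered frame: CPU/GPU overlap

print(serial)      # [16, 16, 24, 16, 16, 24, 16, 16] -> 8 ms swings (visible stutter)
print(pipelined)   # [10, 10, 14, 10, 10, 14, 10, 10] -> 4 ms swings (much smoother)
```

The absolute frame rate barely changes, but the frame-to-frame variation halves, which matches the perceived smoothness the poster reports.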
 
G

Guest

Guest


A) No drivers since then have given a 20%-30% increase; ~10% at the most, and that's for the 7950/7970. Other AMD cards get less, and it's very dependent on the game benched. You are talking utter nonsense. BTW, that would also apply to the discrete card paired with the Celeron, so that in itself invalidates your argument (ahem, SPIN).

B) Make all the claims you want about unlocked this benching evenly with locked that. Unless it has been tested by a reputable site under controlled conditions, it doesn't exist.

C) Go into the build section of the forum, post your build, and find out how many Intel builds will be able to GAME at the same price point.

D) Trolling would be butting in between a discussion between two different people and stating unproven claims. Hmmm, let's see: I laughed at your build, showed why I laugh at you, and provided a venue to further the discussion . . . yep, looks like a troll . .


I am done wasting my time unless you do follow up with a discussion in the build section:

http://www.tomshardware.com/forum/forum-31.html

Because as much as you want it to be about you, this discussion thread isn't . .
 

Phyrexiancure

Distinguished
Mar 28, 2011
316
0
18,810

But this is in single-player scenarios; multiplayer gives totally different results. Since the game scales well with up to 6 cores, AMD's quad cores will tie with Intel's duals, while 6 cores and up will be behind or equal to Intel's i5 CPUs. I personally think that using old games that don't take advantage of more than a dual core shouldn't be seen as the standard for gaming benchmarks. For the most part, AMD CPUs are good enough in those scenarios, with exceptions like StarCraft II's worst-case scenarios. We should be using newer games like Battlefield 3, because they accurately describe the performance differences between Intel and AMD CPUs in near-future games.

 
G

Guest

Guest

CONGRATULATIONS!

As much as you have been prodded to put your build up in the build forum, to compare it to other budget-oriented builds and take it off this Far Cry 3 discussion thread, you still want to continue discussing your glorified HTPC that can play some games at low settings and resolutions. And obviously you fail to see I was being sarcastic and not admitting anything.

It would surely appear to an outside observer that you may be suffering from high-density matter within your cranial region. Fortunately, as with advances in technology, there have been phenomenal advances in the medical field for people suffering with such a challenge, so that they may experience a better quality of life. It would even be financially beneficial in allowing medical professionals to study such causes and effects so they could better serve people with such an affliction.

With that in mind, please read this several times until you understand: your build is junk.

A) You picked a motherboard with an already-EOL socket from a far-from-stellar manufacturer.
B) A PSU that isn't good for much but eventually roasting wienies when it blows.
C) A case that isn't worth the box it was shipped in; you would have been better off using the box for the build.
D) An APU that has a great iGPU but a mediocre CPU side, and without a decent heatsink to compensate it is destined to be nothing more than competitive with a netbook.
E) And average RAM that can only perform well when overclocked, which that POS motherboard will limit.

And you spent $188 for that?! You got ripped off. I see better systems on my local Craigslist for $150.

And now I am the third person to bid you a fond farewell.
 
[citation][nom]sayantan[/nom]The good thing is the game doesn't scale up with intel CPUs making the 8350 really look good in comparison.[/citation]

If it scaled up on Intel, then it would undoubtedly scale up on AMD as well. Increasing effective CPU utilization works pretty much the same way on any CPU that has enough threads for the game. If even up to the i7-39xx CPUs were well-utilized like they probably are in the multi-player mode, then the FX-8350 would probably hang around an unlocked LGA 1155 i5 or unlocked LGA 1155 i7. Sure, power consumption would still be high, but that doesn't stop performance from being good.
 
[citation][nom]tourist[/nom]I would say don't let the door hit you on the way out but based on your reply's. But You can't even figure out which way you are going considering your spin about how an apu cannot game with a discrete, the image above proves your wrong. http://www.tomshardware.com/forum/375787-31-intel[/citation]

It depends on the situation. Sure, in single player BF3, the CPU is usually almost irrelevant, but once you get into multi-player, it becomes much more important. For example, you can bet that even that i5-2400 from that review can muster up a 30-50% lead over the A8-3870K at stock for both. Not even overclocking covers the distance because the i5-2400 can overclock to about 4GHz despite not having an unlocked multiplier and none of AMD's APUs can overclock far enough to meet that level of performance in situations that can use it.

However, that doesn't mean that performance would be bad with a good APU even with high end graphics in BF3 multi-player or similarly CPU-intensive situations; it just means that it won't be as good as some other CPU options, primarily more expensive options at that. For some cheap builds, I most certainly do like to use some APUs, although I generally prefer to use Trinity over Llano because it is supposed to have socket/platform compatibility with its successor and has some other advantages over Llano such as decent performance advantages and power efficiency advantages.
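The "about 4GHz despite not having an unlocked multiplier" claim above can be sketched with some assumed figures: Sandy Bridge non-K parts allow the turbo multiplier to be raised up to 4 bins over the rated maximum, and BCLK can typically be nudged a few percent before stability suffers. The exact headroom varies per chip and board, so treat these numbers as illustrative.

```python
# Rough arithmetic for a limited-multiplier overclock on a locked i5-2400.
# Assumed figures: 34x rated max turbo, +4 turbo bins on non-K Sandy Bridge,
# and a mild ~5% BCLK bump from the stock 100MHz.

max_turbo_mult = 34   # i5-2400 rated max turbo multiplier (3.4GHz at 100MHz BCLK)
extra_bins = 4        # limited-multiplier headroom on non-K chips
bclk_oc_mhz = 105     # modest BCLK overclock (assumed)

oc_ghz = (max_turbo_mult + extra_bins) * bclk_oc_mhz / 1000
print(f"{oc_ghz:.2f} GHz")  # ~3.99 GHz
```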
 

roghero

Honorable
Sep 11, 2012
27
0
10,540
I have an MSI GeForce GTX 660 Ti PE and I am getting a constant 60 fps with everything on High, at least according to Fraps.
 

cleeve

Illustrious
[citation][nom]roghero[/nom]I have an MSI GeForce GTX 660Ti PE and I am getting constant 60 fps with everything on high at least according to fraps.[/citation]

Probably not in the same area we benchmarked at 1080p. :)
 
G

Guest

Guest
The newest Nvidia drivers, released 12/17, claim a 38% performance increase in FC3...

I installed them and played for the first time last night. I have everything cranked at 1920x1080 with a single 670 and an i7 3GHz Intel quad core (LGA 1366) with 16GB DDR3... plays just fine for me. Sorry I didn't measure my FPS, but I saw no reason to, as the game plays very smoothly.
 
G

Guest

Guest
I bet this game runs terribly on both the PS3 AND 360! Both are very underpowered compared to my setup, and I only get 40 fps on medium settings for this fab game!
 

Lord Captivus

Distinguished
Sep 13, 2011
139
0
18,680
Tom's should use other GPUs; I mean, the difference between some of them is minor.
Why not try "older" GPUs and see what happens? Maybe the 200/400/500 series, the high-end ones.
This game had better get patched to increase FPS... less than 40 is unplayable.
 

atminside

Distinguished
Mar 2, 2011
134
0
18,680
I have a Gigabyte 790XTA-UD4 + Phenom II 955 X4 + HD 6850 (1GB) + 8GB RAM + 27" Samsung at 1920x1080. Would getting an extra 6850 for CrossFire give any noticeable improvement? I only ask because I heard that CrossFired 6850s have some bad stuttering issues and bad frame drops. Obviously, I would like to run FC3 at the highest settings possible. Or should I forgo my current 6850 and get a 7850 or GTX 6/560? The FC3 charts didn't list a 6850, so I am not sure how even one would perform at my desired settings.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
My usual RPS link; this time I held back for some (unknown) reason. But here's a different take on the game, and a chat with the game's writer:

http://www.rockpapershotgun.com/2012/12/19/far-cry-3s-jeffrey-yohalem-on-racism-torture-and-satire/
 