Is the AMD FX 8350 good for gaming?



Thanks. I'm just saying: technology changes rapidly, but adoption is slow. HD technically happened in the 1960s and 1980s. There were prototypes. It was released to the public in the '90s. But most content today still isn't in HD. If you get a 4K TV now, you're looking at at least another 8 years before viable, true 4K content is available.

I understand the logic of getting an FX 8350 for its cores, but it's ironically limited by AMD itself. Until AMD officially has its APUs shipping in the Xbox One and PS4....the industry is going to have to wait.

Consoles seem to be at the forefront of pushing technology. Nintendo with 64-bit (not true 64-bit processing, but you know). HD (the Xbox 360 being one of the first, even though the original Xbox came out supporting 720p). Online gaming (Sega Dreamcast). I mean, these started on PC first, but consoles 'modernize' it for the masses. Just my theory.
 


I think the people running E8400s and Q9550s would argue against that logic...Also, the people running Athlon II products would probably argue against that, as would the people who still own and run Phenom II X4 CPUs. Here they are, at least 4 years old...and still going strong in modern games that are using their capability more now than games in their day did.

Now, that doesn't mean they're the best for newer games and applications...but they're hardly obsolete either.

Future proofing is a very valid approach if you have a longer upgrade cycle. It also means that instead of doing nickel and dime upgrades, you can do full system overhauls.

In terms of looking forward, I want the most capable technology available, and something that looks like it will still be relevant in 2+ years. I have 4 PCs in my house, and about every 3 years or so I do a complete scratch build and repurpose the others. When one gets old enough that it can't keep up anymore, I just toss it, donate it, or give it away to someone who needs a PC, period.

With things shifting more and more toward more cores, I think in the next 2-3 years we will see the final death knell for dual-core CPUs in mainstream desktops. They will still be around in web browsing rigs, etc., though they won't even be mentioned in conversations about gaming CPUs.

If technology became obsolete that quickly, why would so many people still run 4-5 year old architectures? By your logic, that shouldn't happen...yet a great majority of the world is running last-generation or older hardware.

Future proofing is highly relevant.
 


Sites like TechSpot have a tendency to produce reviews that favour Intel and Nvidia. They usually run the memory underclocked on the AMD chips or leave out the FX hotfixes, or when they do an Nvidia vs. AMD comparison they use beta drivers for the AMD cards...

I already explained that to you:

There are some scenes in the game where the FX-8350 performs worse, but that is not because those scenes are less threaded; it is due to some issues with the floating-point computations.

What you say about games scaling poorly with increasing core count should be corrected to: games using few threads scale poorly when going above their optimal number of cores.

But I will concede that I was wrong about one thing. Above I wrote:

The game is not optimized for AMD; the advantage comes exclusively from multithreading, which helps use 100% of the FX chip, unlike older games.

I was wrong: Crysis 3 is not using 100% of the potential of the FX chip. The game is not optimized for six or eight cores. The game is better than older games and can use "moar cores", and that is why the FX outperforms the i5-3570k and the i7-3770k, but Crysis 3 is still not showing the real performance of the FX chips.

In fact, Crysis 3 is loading only two cores of the FX-8350 above 90%, whereas four cores are loaded slightly above 60% and the remaining two cores sit somewhere between those two extremes. That is, Crysis 3 is using between 2/3 and 3/4 of the FX-8350. Maybe Crysis 4 will be able to load all eight cores at 94%; then we will see the real gaming performance of the FX-8350. Next-gen games will be heavily multi-threaded thanks to the consoles using 8-core chips.
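A quick back-of-the-envelope check on that 2/3 to 3/4 figure, as a minimal Python sketch; the 0.75 I use for the middle two cores is only my own reading of "between those two extremes":

[code]
# Rough estimate of overall FX-8350 utilization in Crysis 3 from the
# per-core loads described above. The two 0.75 values are an assumption.
loads = [0.90, 0.90, 0.60, 0.60, 0.60, 0.60, 0.75, 0.75]  # one value per core

average = sum(loads) / len(loads)
print(f"average utilization: {average:.0%}")  # ~71%, i.e. between 2/3 and 3/4
[/code]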

Finally, as you say, people care about the platform that they use. Therefore Windows users don't care about Linux benchmarks and Linux users don't care about Windows benchmarks.

Moreover, on the desktop there are about as many Linux users as Windows 8 users. There is no reason to give benchmarks for one and not for the other, unless you only want to bash AMD chips because Linux can actually use the performance of AMD chips.
 


Because the triple-A game developers have said that they will code for eight cores?

Because the same developers have said that they recommend the FX-8350 as the best CPU for future gaming?

Although the PS4 and the Xbox One are not shipping yet, some of those developers have already presented a demo of their future games, and the demo is already using six cores. This is the CPU profiling:

[CPU profiling screenshot: profile.jpg]


The game will be released with the PS4.
 


And don't forget the future Steam Box consoles, which will be based on Linux (with Windows as a second choice).
 
There are benchmarks that show the 8350 outperforming the i7 in Metro 2033 when the 8350 was using 1866MHz RAM. I don't know where you guys are getting this 2% number from.

@dirtyferret, PS4 - One = 3

HL3 confirmed 😀 lol jk.

@jaunrga, nice finds. It was indicated in another thread that Crysis 3 prefers GHz as opposed to IPC at certain times.

The point everyone is missing is that for less money, you can have more cores, better future proofing, and a playable experience with an FX 8350. I refuse to buy another quad core, even an i7, because I know that in the near future that isn't going to cut it. I'm going to run my Phenom into the ground before my next purchase. Lol.
 


I swear to god in another thread you knocked those hotfixes for improving gaming performance by almost nothing but increasing power consumption by 7%.

Fact is, games are hard to scale. The more threads you break your game into, the more often one thread has to ask another thread for a piece of information. That information transfer takes time and reduces efficiency. Furthermore, there is a limit to how far you can break a game up into pieces (limited both theoretically and by time/money constraints).
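Here is a toy Python sketch of that hand-off cost (nothing like a real engine's job system, just an illustration): shuttling trivial work between two threads through a queue costs far more than doing the same work in a single loop.

[code]
# Toy illustration of cross-thread communication overhead: every item that
# crosses the queue is a hand-off between threads, and the hand-offs cost
# far more than the trivial work itself.
import queue
import threading
import time

N = 100_000

def consumer(q, results):
    total = 0
    for _ in range(N):
        total += q.get()        # each get() waits on the other thread
    results.append(total)

q = queue.Queue()
results = []
t = threading.Thread(target=consumer, args=(q, results))
start = time.perf_counter()
t.start()
for i in range(N):
    q.put(i)                    # hand each item to the other thread
t.join()
print(f"two threads via queue: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
total = sum(range(N))           # same trivial "work", no hand-offs
print(f"single thread:         {time.perf_counter() - start:.3f}s")
[/code]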

Crysis 3 is probably one of the best multithreaded games on the market; I think it'll be a while before it becomes mainstream to do that much multithreading.

And I'll also bet that the number of computers running Windows 8 and capable of running Crysis 3 is higher than the number of computers using Linux that can run C3.

Not to mention that C3 doesn't run on Linux.
 
If anyone here owns Crysis 3 and has an FX-8350, run this test to see how many cores Crysis 3 uses.

test:
clock the FX-8350 down to its minimum (say 1GHz)
and then run Crysis 3 and a CPU monitor like Task Manager
and watch the total CPU usage and post a chart/picture of Task Manager (a small logging script is sketched below)
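If you would rather log the numbers than eyeball Task Manager, here is a rough sketch using the third-party psutil package (pip install psutil); run it alongside the game and stop it with Ctrl+C:

[code]
# Prints each core's load once per second, plus the overall average,
# while the game is running. Requires psutil (pip install psutil).
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        overall = sum(per_core) / len(per_core)
        cores = "  ".join(f"c{i}:{p:5.1f}%" for i, p in enumerate(per_core))
        print(f"{cores}  |  avg: {overall:5.1f}%")
except KeyboardInterrupt:
    pass  # Ctrl+C to stop
[/code]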
 


The hotfixes improve performance by between 2% and 10% depending on the game; 5% is rather common. That alone is enough to close the supposed gap between the i7-3770k and the FX-8350 in the 'review' that you gave. Not to mention that using the correct RAM at stock speed gives an extra boost to the FX.

One does not split code into threads that all depend on each other; one splits the code into parts that are almost independent. The game developers chose the eight-core design of the PS4 after several technical meetings held before the hardware was designed. Developers such as Crytek also ask for "moar cores".

First-generation games for the PS4 and the Xbox One are using more than four cores. We already saw a demo of a future PS4 game using six cores. This is only the beginning.

And I'll also bet that the number of computers running Windows 8 and capable of running Crysis 3 is higher than the number of computers using Linux that can run C3.

Really?
 


RAM = no performance gains for gaming (maybe in encoding or WinRAR, but for games the average is 0-2%; yes, there are cases where it's higher, but the average is 0-2%). The hotfixes bring no appreciable performance gains. You cannot take the absolute best-case scenario and apply it as if it were the guaranteed minimum.

You cannot split a game into fully independent pieces.

Given the choice between 8 or 4 Jaguar cores, devs will gladly take 8. They are weak cores, and only 4 would be a huge bottleneck for a 7850-class GPU.

 


Post a link to a technical expert backing up your claims...or quit posting nonsense. Any expert I have ever read concedes the FX8350 is better than just about any Intel when the cores are loaded (any 4-core Intel). The issue is getting software threaded enough to load all 8 cores.

If I hear you compare an i3 to the FX8350 one more time without posting relevant information from a credible source...my head might explode.

You post more misinformation than anyone I have ever encountered in my entire life...and that's really saying something.
 
If I hear you compare an i3 to the FX8350 one more time without posting relevant information from a credible source...my head might explode.
http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-5.html
now argue with Tom's reviewer/tester about why they place the i3 and the 8350 in the same slot 😉

so no one has dared to post results of this test :heink:
test
clock the FX-8350 down to its minimum (say 1GHz) and then run [strike]Crysis 3[/strike] any game that you have and a CPU monitor like Task Manager
and watch the total CPU usage and post the chart/picture of Task Manager
this test will show whether the game is merely multithreaded or really uses all the cores
 


They also put the 1st-gen Core i7 series in that bracket too, and the FX8350 obliterates those...what's your point? Tom's isn't spot on about everything, every time. As I said before, they are objective in most of their benchmarks, at least as much as you can expect. However, when it comes to "rating" situations...they have human bias like anyone, and they try to hide it. The FX8350, and I would even argue down to the FX6300, really deserves to be in the top bracket with the newest Core i5s. They will lean slightly toward team blue when they "freestyle" their ratings...

I have even stated this before. There are far too many benchmarks that show the AMD FX8350 is WAY better than i3s and gets comparable performance to i5s and i7s for them to be this naïve for this long and lump the FAR WEAKER CPUs in with the most potent AMD models.
 


Nope. It is well known that Anandtech reviews are biased pro-Intel. They select hardware and/or software configs that favour Intel chips. For instance, they still continue using SYSmark and other biased benchmarks that give fake scores.

http://semiaccurate.com/2011/06/20/nvidia-amd-and-via-quit-bapco-over-sysmark-2012/
 


I believe that you should also post credible information. Tek Syndicate is biased towards AMD CPUs and doesn't test in a fully correct manner. Find another, better source, and make sure that it can back up your biased username.
 


Man... Tek Syndicate? Their OC benchmarks were done 500MHz for 500MHz as opposed to clock for clock. And you want credible information? YOU'RE THE ONE WHO CLAIMED YOUR FRIEND RAN AN i7 WITH A REGULAR COOLER, UNDER LOAD, AT 1°C!!! Not to be mean, but come on man...
 


Proof?



LOL, if that were half true they would not continue using cheating benchmarks with the cripple_AMD function, nor would they select unfair hardware settings.
 
Runner up? We know that for gaming the i7 and i5 get more FPS. But the 8350 isn't as far off as people claim. That's all we're trying to prove. Not that our product is better. However, the next generation, called Steamroller, will be better than Haswell, I'm sure. :)
 


Yes, the consensus is clear: all triple-A game developers say that the FX-8350 is better than the i5-3570k for future games.

For old games the consensus is again clear: the i5-3570k is better.

For the rest of the games both are good enough and you can select either. The FX wins in some games, the i5 wins in others.
 
As I have been saying all along...90-95% of the time, the benchmarks show both CPUs within margin of error of each other. What does that mean? You can't go wrong...

Now, will this change with 8-core consoles hitting the market around Christmas time? Certainly. How long will it be before we see the results of the influence these consoles have? I would bet about 6-12 months before the results begin to be really prevalent...and probably 12-24 months before the majority of games developed are 8-core aware. Considering game development cycles...1-2 years is likely how long it will take before the vast majority of games use 4+ cores predominantly.
 


If Steamroller is as potent and popular as many are hoping, there will be a great many Steamroller CPUs taking advantage of the games that are really harnessing that potential. Additionally, considering the Steamroller FX CPUs will likely not be out until at least when the consoles hit, if not the following quarter...they may do something like the Phenom II series or Richland and simply do a product refresh with some minor performance tweaks if it sells and performs well enough.
 


Skipping to this part intentionally because...

I am going to write this down on my calendar...you and I agreed about anything!

Just curious...are you feeling well? (Jesting)
 
I don't view the status as negative, but what irks me is the people who portray it as a HUGE difference in performance and then quote a benchmark with a 3 FPS difference between the two, with both framerates well over 100 FPS...

That's absurd...and realistically, it's exaggerating a slight edge in some areas by making it appear to be much more than it is. There are a few edge cases where the performance difference between the two is fairly dramatic...but those are the exception, not the rule...(looking squarely at iTunes for Intel, and encryption for AMD).
 