Is the AMD FX 8350 good for gaming

Page 20 - Tom's Hardware community forums
Status
Not open for further replies.


+1
 


Reasonably future proof; the only thing you miss out on is the games right now that don't run so well on FX CPUs, like Skyrim or Starcraft 2 for example, although they still might run "well enough" depending on your expectations. http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/5
Assuming future games will be well multithreaded, the 8350 is a good CPU.
Here's one of the most recent CPU comparisons you will find: http://www.anandtech.com/show/6934/choosing-a-gaming-cpu-single-multigpu-at-1440p/5
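On that multithreading point: how much an eight-core chip like the 8350 gains over a quad depends on what fraction of a game's frame work actually parallelizes. A minimal sketch using Amdahl's law (the 80% parallel fraction below is a made-up illustration, not a measured figure for any real game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only `parallel_fraction`
    of the work can be spread across `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical engine where 80% of frame work parallelizes:
print(round(amdahl_speedup(0.8, 4), 2))   # 4 cores -> 2.5x
print(round(amdahl_speedup(0.8, 8), 2))   # 8 cores -> 3.33x, not 2x more
```

Even under generous assumptions, doubling cores is far from doubling frame rates, which is why "well multithreaded" matters so much to the 8350's value.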
 
Well, greatly done, internet, you had me really confused -_- , I'm changing my mind on a daily basis 😵

Anyway, I'm not interested in MMO games; I think I'm going with this:

-ASUS DDR3 1800 AM3 Motherboards Sabertooth 990FX/GEN3 R2.0
http://www.amazon.com/gp/product/B00C0RCKY0/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A4UCFL9LU89NR

-AMD FX-8350
http://www.amazon.com/gp/product/B009O7YUF6/ref=ox_sc_act_title_3?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

-Sapphire Radeon Vapor-X HD 7970 GHz OC 3GB
http://www.amazon.com/gp/product/B008PQAE98/ref=ox_sc_act_title_7?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

-Samsung Electronics 840 Pro Series
http://www.amazon.com/gp/product/B009NB8WRU/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

For future: another 7970Ghz as Xfire

what do you think?
 


That looks solid...get something like a 500GB/1TB HDD for storage and you're golden. Also, don't skimp on PSU, get something from Corsair/Antec/Seasonic/PC Power & Cooling.
 
Hello everyone again ^_^,

I would just like to put out there that most games today do NOT use more than 2 cores, 3 at most. Yes, they will utilize all the cores (for the most part), but the workload is usually less than about half of what a core can support. There's also the fact that most of the top-selling games people play today are GPU-bound. Meaning, I could have a Core 2 Quad running at 2.5 GHz playing Crysis 3, but for the most part, as long as I have something like a GTX 680 or better, the game should run no lower than 30-40 frames on max settings.

On the other hand, if I have an i7-3960X clocked at 4.5 GHz and an Nvidia FX 5700 running at stock, the game will run no better than 15-20 frames on low.

Please keep in mind that, when testing CPUs, running games that are more GPU-bound won't prove anything. Games that would really work are CPU-heavy ones like Batman: Arkham Asylum, League of Legends, WoW, Starcraft, etc. You get my point.

So if you are posting GPU-bound benchmarks with only one GPU, or even one each from AMD and Nvidia, they still won't be valid for this kind of forum.
 


LOL... Crysis 3 is one of the most CPU-bound games on the market right now... it shreds older quad cores and brand-new dual cores... you need to read about this stuff first, man. Seriously...
 


So basically what you are saying is that you can run a 3960x with the worst GPU on the market and still get 60 frames?

Edit: *On max settings?
 


No, but you can on minimum settings with a current-generation entry-level GPU...

If you pair an i7-3960X with an HD 7990 and run it on max settings, you'll probably get over 70 FPS.

On the contrary, any i3 with an HD 7990 won't get past about 25-35 FPS (on max settings) because the CPU is the bottleneck. If you drop to lower settings the frame rates don't improve dramatically either, because frankly, the i3 is not enough CPU for Crysis 3.
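The bottleneck argument in the posts above amounts to a simple model: the delivered frame rate is roughly capped by whichever of the CPU or GPU is slower at preparing frames. A toy sketch (all the FPS numbers here are invented for illustration, not benchmark results):

```python
def effective_fps(cpu_fps_cap, gpu_fps_cap):
    """Toy bottleneck model: frame rate is limited by the slower
    stage, CPU simulation/draw-call work vs. GPU rendering."""
    return min(cpu_fps_cap, gpu_fps_cap)

# GPU-bound case: lowering settings (raising the GPU cap) helps.
print(effective_fps(cpu_fps_cap=140, gpu_fps_cap=45))   # 45
# CPU-bound case: lowering settings barely moves the result.
print(effective_fps(cpu_fps_cap=30, gpu_fps_cap=120))   # 30
```

This is why dropping settings helps a GPU-bound rig but does little for a CPU-bound one, as described above.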
 
8350, to your last post:

That is true in theory, but when you look at actual tests, the only difference between the i3 and, say, an i5 with a 7990 is no more than ~10 frames on max, yet those are two completely different processors. It's funny people even mention i3s for gaming; they're about as good as a Core 2 Quad running stock.

However, back on our topic: between really anything at or better than the 8350 or i5-3570K, there's not much noticeable difference.

The problem with testing CPUs by running games on the best GPUs with the best CPUs is that there won't be any CPU bottleneck in that case. Thus, instead of using max settings, either use games that are at least 75% CPU-dependent, or turn the settings down.

For example, in a response video to Tek Syndicate, there was a test between the 8350 and the i7-3770K, since many people were claiming the 8350 dominated, or even did 50% better than, the i7, which is also what you're claiming: "The i7 can beat the 8350 in some tests." Basically what you are saying here is that it's better, when it's not.

I've posted this video many times before and people ignore it because it's not "a known company that everyone loves."

They used rigs that were exactly the same, with a GTX 670 and motherboards that were almost 100% alike (obviously an AM3+ chip can't fit in an LGA 1155 socket, but everything else was the same).

The test proved that the i7 dominated the 8350 (at stock settings) in every test except the megahash test, which is why people get AMD for Bitcoin mining and video editing.

These tests also included CPU-bound games, and even multiple AMD synthetic benchmarks and AMD-biased tests. The i7 even beat the 8350 in Batman: Arkham Asylum (mostly CPU-bound) (also, obviously, at lowest settings to prevent a GPU bottleneck, leaving everything up to the CPU): 320 FPS vs. 470 FPS.

Keep in mind, every variable was kept as similar as possible, and they didn't just use a few games on highest settings, which are mostly GPU-bound. Also keep in mind what CPUs can do for GPUs and vice-versa.

Tek Syndicate, by contrast, used different RAM, SSDs, motherboards that were completely different (other than sharing a socket per platform), PSUs, HDDs, and coolers.

That's the main reason Tek Syndicate is not a good candidate for benchmarks: highest game settings and different setups.

Last thing to keep in mind: posting over 1400 times in two months, and having a badge saying you were selected best answer on FIVE threads, doesn't make you an expert.
 


First, you haven't read a single word I have said evidently.

Second, I couldn't care less about whatever you dug up to put Intel in a good light.

Third, just because some moron ran a test you claim was "unbiased" does not mean things were done that way just "because he said so."

Fourth, do you buy a PC to play games at the lowest resolution on the lowest settings, or do you buy it to play the games at 1080p/1440p on max settings?

Fifth, I have actually been selected as the best answer 19 times... and the forum is set up that way; if you have a grievance with it, contact the forum admins. I notice you've posted several hundred times and haven't been selected more than once or twice... do you not realize there's a reason for that?
 
I have read everything you have been saying, and understanding what you are saying, I'm giving you the truth about your madness. There's a man on this forum who posted here in regard to what you were saying. He only posted about 400-500 times in the two whole years he's been on Tom's Hardware, and he has a CPU Legends badge, your CPU Expert badge, and a few other badges that come with respect, not popularity.

And yes, I buy games to run them at max settings; who wouldn't?

Also, it's not even just what he said; the man in the video posted SCREENSHOTS of the applications and their results. How can you deny those?

And as a matter of fact, I will do something about you.
 


The no-name source is the well-known Eurogamer article that ran a poll among triple-A game developers. Moreover, I don't know if you are being deliberately dumb or what, but please explain to us how your professional reviewers are reviewing games that are not yet released and are still under development...




Ahhh, TechSpot, that professional site that does not always give the hardware setup of the FX used in its reviews, uses SP1 without the FX hotfixes, likes to use Catalyst beta drivers in Nvidia vs. AMD comparisons and, I am almost sure, runs the FX with underclocked RAM.

You can compare their 'review' of Crysis 3 with this one:

[attached benchmark chart: qSNrpeA.png]
 


Any valid source on the Tek Syndicate reviews? I once read a blog entry against them, but I closed it after reading three pieces of nonsense.
 


In what world is what I'm saying nonsense? I'm telling the naked truth, which anyone else can compare and agree with; look for yourself!

"but please explain how your professional reviewers are reviewing games not released still but under development.."

Juangra, you basically contradicted yourself there. How can we all be sure if there's no proof?

Also, look at your chart:

1. It's over 1 year old

2. It's only at 720p and there's no AA/AF

3. It doesn't specify which GPU it used; all it said was GK110.

And yes, I'm asking, "Why isn't it on highest settings?", but only because that's what you think is right. You guys are so dense you can't realize that max graphics is GPU-bound, unless you use CPU-bound games, which you are not providing.
 
G0M3R, I know you got your results from Lavo and Price. They benchmark in their garage, and they messed up their review of the H100 a while back; in fact, they removed the video. Tek Syndicate at least uses real-world benchmarks. And the benchmarks used by Lavo and Price were OC benches run clock for clock, not increasing the FX 8350 by 500 MHz and then the i5 by 500 MHz. I will give you the benefit of the doubt and say they didn't mention whether or not they bumped up the IPC on the FX chip either. Honestly, that could make a huge impact on performance, along with the RAM running overclocked.

You don't understand that an FX chip can reach higher speeds with the same cooling.

And G0M3R, you're not posting any sites. How do we know you're telling the truth? At least some of us actually own the product or have used it.
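On the clock-for-clock point above: a flat +500 MHz is a different relative bump on chips with different base clocks, which is one reason equal-delta overclock comparisons can mislead. A quick sketch (4.0 GHz and 3.4 GHz are the stock FX-8350 and i5-3570K base clocks, used here only for illustration):

```python
def pct_overclock(base_mhz, delta_mhz):
    """Relative size of an overclock as a percentage of base clock."""
    return 100.0 * delta_mhz / base_mhz

# The same +500 MHz is a bigger relative boost for the lower-clocked chip:
print(round(pct_overclock(4000, 500), 1))  # FX-8350 base: 12.5
print(round(pct_overclock(3400, 500), 1))  # i5-3570K base: 14.7
```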
 

LOL! Hey man, If you can afford it, Buy it! That's a crazy rig right there! I'm Jelly!
 


Keep in mind he said "plug and play". There's no specification in this thread about what clocks we are talking about. What I'm showing you with this video is what each company can actually deliver to users who don't OC or touch anything.

And really, what I'm inferring from "You don't understand that an FX chip can reach higher speeds with the same cooling" is that you are basically saying, "There's a need to OC the FX chip."

And as a matter of fact, I have posted at least 2 videos and at least one website of information, so don't say I'm not proving anything. And just because you have resources doesn't mean you or the resources are right. This goes back to what I was telling 8350: just because you have one easy-to-get badge and over 1400 posts doesn't mean you're right.

You know, if I came on here with an account that was approved as official, from say "Tek Syndicate" or some 'high end' company/organization that everyone loves and exclusively pays attention to, and posted something without giving any resources, just text, would you believe them?

This world is based around people having power, and basically, in a broad sense, Power = Knowledge. In other words: "Look, I'm (fancy company name here), and the Core 2 Duo at stock is better than the i7-3960X at stock!"

That is a little exaggerated, but it's so you can understand, at least in a simple sense, what I'm telling you.
 


Would it be because they are using a 3770, not K, with a 7990? Just speculating, but, lol. ^_^
 


A: *Edit* The benchmarks you refer to are mostly old, and done with a different scheduler too.

B: Here's where you dun goofed! Most Intel fanboys complain about AMD limiting FPS at low settings. Ex: Earlier in this thread (I think) it was you who used a benchmark of Skyrim on low settings as an example that Intel gets 500 FPS and AMD gets 350 FPS, and that benchmark was meant to show the 8350 bottleneck. SO, what this benchmark shows is that neither CPU is bottlenecked, not that the i5 sucks or the 8350 sucks. It's impartial. Also, Lavo and Price played Skyrim and got the same FPS, except for a mini crash at the end of their run on the AMD system that was mainly due to bad coding.

C: It's clearly a GTX 680. You didn't go to the actual site, did you? Learn to read, bro.
 


A: Edit: (Oh.) Then what's the difference if you posted something from over a year ago, when there are bound to have been driver updates for the game and hardware, and updates to the game itself (performance/graphics)?

B: Bottlenecked or not, the results show it.

C: Then specify it, if you want results from people looking at the THREAD.

D: Letters, nice way to look smart!
 


Lol, I corrected your statement. And again, the FX 8350 is more future proof than an i3, so this Piledrives the i3 out of the picture, which makes the FX 8350 competitive with the i5. Games right now run well on an i3, no doubt, but a year down the line, your goose is plucked. Not to mention the next-gen games coming out this year will support 6 or 8 cores, which means you'll be able to play all the new games at around the same FPS as an i5 or i7 while getting better multi-threaded performance.
 