Is the AMD FX 8350 good for gaming?

You can actually run a dual-boot system with a Ubuntu partition and a Windows partition, and at one point you could even install it through Windows and run it that way, though I am not sure whether the latest release still offers that.

Also, if you get your hands on an install disk, you can actually "test drive" Ubuntu from the disk without installing it, which is pretty nice. The applications for Ubuntu are also set up a great deal like a Google apps page on a smartphone in terms of the way the software "feels", which is a very interesting mechanic. Plus, as a user you can register with Ubuntu One, which gives you 6 GB of cloud storage for music/movies/games etc. that can be accessed across platforms from Windows machines, smartphones, Macs, etc.

It's a very modular and simple OS, and very interesting. I tried Red Hat Linux years ago and wasn't thoroughly impressed at the time, but the features they've adopted have come a long way, and it really is at a point now where it rivals Windows for applications and development, albeit all through open-source communities, which is an idea I rather like myself. No proprietary code, and you can adjust the system to suit your needs...plus, the new UI for the OS is a lot better than the old GNOME interface. I simply didn't care for the way it looked before, and the functionality felt a bit clunky.

 


Baby steps...

I have no illusions that Ubuntu will replace Windows anytime soon...but wouldn't it be crazy to see it gather a sizeable minority share...like 25-30% in the U.S.?
 


I would have said you were crazy a year ago, but with Windows 8 being shoved down people's throats, I wholeheartedly agree with you.

The thing is, Windows is a lot more 'idiot-proof' than Ubuntu, which really matters for the average consumer.
 


Ubuntu is the most idiot-proof Linux out there...plus the UI is not drastically different from something like Windows 7.

"idiot proofing" is also what makes windows so bloated and slow too though.

Canonical provides free tech support on the LTS releases for 5 years too. Those come out every two years, and it's always a .04 release.

Their release schedule is .04 releases in April and .10 releases in October.
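In other words, the version number is just the release's year and month; a tiny illustrative sketch:

```python
# Ubuntu-style version strings follow a YY.MM convention,
# so the April and October 2013 releases are "13.04" and "13.10".
def ubuntu_version(year: int, month: int) -> str:
    return f"{year % 100:02d}.{month:02d}"

print(ubuntu_version(2013, 4))   # 13.04 (April release)
print(ubuntu_version(2013, 10))  # 13.10 (October release)
```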
 
We will definitely see Ubuntu gain prominence this year and in 2014, especially on pre-built systems designed to be gaming machines. I truly hope to see Ubuntu become more widely accepted and used in all areas of computing; I think it will take some time, but the market is opening up for it. We need more competition as it is. I personally am sick of using Windows on my home-built systems and would love to see more developers work on programs for Ubuntu so that I could keep using the software I need without being forced to use Windows. Now that some gaming setups are being sold with Ubuntu, we will see more of a drive for software built to run on that OS.
 


The day a blockbuster gaming title is released on Windows and Linux simultaneously will sound the death knell for Windows.
 
Um, no. Windows has a lot of compatibility and support. With Ubuntu you'd better know your way around a computer, because the fix can be quite difficult. Certain drivers, etc., can be a nightmare. Windows tends to either work or not work.

An idiot can use Windows. It's harder for that idiot to use Ubuntu.

Not to mention that as Ubuntu gains market share, there will be a ton more malware for it. It's much easier to develop malware for open source.
 


Thanks for the laugh :lol:
 


Ubuntu is already more secure than Windows; it ships with a firewall that can close individual ports and effective AV software. For example, a recent exploit discovered in the Windows kernel and patched in December 2012 had already been fixed in the Linux kernel...back in 2006.

Ubuntu has support from Canonical, so, if you have an issue, tech support can walk you through a fix.
 




I'm posting no benchmarks because there's no point unless I'm doing a huge slideshow or something. Is there really a difference between telling you what it did in text versus a fancy dolled-up picture?
 
OR, instead of embedding, you could post links, bro. But there is a much more serious problem at hand.

I'm convinced half your benchmarks don't even exist. :lol: And if they do, they'd just prove my point further.
 


I do not trust benchmarks unless they meet the following criteria:

1. The hardware has been tested with at least one comparable pair of AMD and Nvidia GPUs (e.g. GTX 680 and AMD 7970).
2. The benchmark settings are absolutely the same.
3. They use comparable AMD and Intel mobos, the same RAM, and the same HDD/SSD. Maybe even throw in a universal mobo.
4. The GPU and CPU are tested at stock settings, at similar clocks, and/or at max overclock (factory and absolute).
5. They cover the combos: Nvidia-Intel, Nvidia-AMD, AMD-AMD, and AMD-Intel.
6. There are at least 5 benchmarks and/or game tests.

* If you are doing an experiment, which a benchmark literally is, all the factors must be the same. For computers, you must use both AMD and Intel CPUs if testing GPUs, and vice versa with comparable GPUs if testing CPUs (see the sketch below). AND, if you're testing a CPU with a game, run max settings and lowest settings on both CPUs. The lowest settings don't bottleneck the GPU and really test the CPUs. *
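To illustrate how quickly those criteria multiply, here's a rough sketch of the test matrix such an experiment would have to cover (the parts list is just a hypothetical example, not a real test suite):

```python
from itertools import product

# Hypothetical parts and games purely to illustrate the size of the matrix;
# swap in whatever hardware is actually on the bench.
cpus    = ["Intel i7-3770K", "AMD FX-8350"]
gpus    = ["Nvidia GTX 680", "AMD HD 7970"]
presets = ["lowest", "max"]   # lowest stresses the CPU, max stresses the GPU
games   = ["BF3", "Crysis 3", "Skyrim", "SC2", "Metro 2033"]  # at least 5 titles

runs = list(product(cpus, gpus, presets, games))
print(f"{len(runs)} benchmark runs needed")  # 2 * 2 * 2 * 5 = 40 runs
for cpu, gpu, preset, game in runs[:3]:
    print(f"run: {game} @ {preset} settings on {cpu} + {gpu}")
```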

**HYPOTHETICAL**: The simplest way to do all this is to start a company where you get paid to do the pain-in-the-ass testing. Then people wouldn't mind doing it, because I'd guess it would take a minimum of 4 hours per benchmark experiment.

I would dig through a few sets of pictures to look for these kinds of things, but different people/companies do things differently, so accurately compiling all this information would take a little while, say an hour or so. I would love to do that, but I usually don't have the time, as I have a busy schedule, and if I were to post them, even just the links, it would fill up close to a whole page.
 


By your logic, then, none of your experiences matter under your own criteria, because you cannot produce 5 independent sources verifying your own findings.
 


I didn't literally mean that they all had to be in one test. And no, it doesn't mean all my experience would be false. A real benchmark would be set up in that manner: if it's a GPU, test it with your everyday Intel and AMD CPUs, and for CPUs, test each with one comparable GPU from each brand.

For example, most benchmarks out there are very informative, and about 90% of them are right most of the time. However, if only one test is compiled, say testing the FX-8350 to show how bad-ass it is, some people may take an i3, compare it to that with an AMD GPU running the games, and call it done. Even if the two CPUs are the same, if one GPU is used and the settings are only at max, then that would not be testing the full potential of the CPU, and it wouldn't be 'fair', so to speak. And yes, you can probably take that test and then find the exact same test but with an Nvidia GPU instead of an AMD one, but even though the CPUs are the same and the GPUs are on the same level, there can always be completely different HDDs, SSDs, RAM, and motherboards, so finding a test that is exactly the same is a pain in the ass.

I do like to look at benchmarks from people and get an estimate of where a part stands. Say they ran a GTX 680 benchmarking test with an i7-3770K; I'll look at that and say, "Okay, it got 42.6 (random) frames." Then I'll round to the closest number and just use it for reference.

HOWEVER, if you want to get way into detail with all of these tiny little things like "My AMD got 0.5 frames more, so it's better," or even something very general but used in a detailed way, it's better to have something from ONE person who has run multiple tests with the factors as close together as possible.

If you were to take a handful, say 15 or 20 benchmarks of the i7-3770K with the Nvidia GTX 680 running BF3, you'll see there will be at least a 40-frame difference, if not more, simply because of what is in the PC. The other major factor is what your settings are and what you are doing in the game that affects the FPS.

The truth is that there are hundreds of factors, and even one small factor, like cabling inside the PC, game file locations, or even the operating system, can make anywhere from zero to a 10-frame difference.

For me to definitively prove that the Intel i7-3770K is naturally better than the AMD FX-8350, down to a definite answer, would probably take a minimum of 12 hours to explain all of the testing and factors in detail. Now, I could do all this testing and just sum up that the i7 is better in a short 2- or 3-sentence paragraph, but no, the computer community as a whole wants a 100% definite answer, and sometimes not even the fanboys, and the increasing speed of technology doesn't really help with people's attention spans and patience.
 


TL;DR.

What I did read of it was basically an attempt to renege on your original statement. So the "scientist" is contradicting himself because he's been painted into a corner...?

Look, just drop it...you cannot prove your argument...because I can provide 5+ sources showing the FX-8350 is as good as or better than the i7-3770K, and that meets or exceeds your criteria for scientifically sound proof.

Your other criteria are not feasible under real-world conditions without extreme complexity added to each review/article/benchmark. So for the sake of expediency, drop the shenanigans. The real-world results have an acceptable margin for error...besides, you would likely find the i7-3770K and the FX-8350 even closer under such scrutiny than the other reviews have them.
 


So you're telling me that just because you have those sources, it's better? If you read the whole thing like a real forumer would, you would understand a little more of what I'm saying.

Also, you're saying that if I can find 5+ sources showing Intel beats the 8350, your argument will still remain valid even though I found sources against it? That is what I'm saying.

With that, I've noticed from your name and what you're saying that you come from a background of AMD fanboyism? Don't worry, I don't really care; I just care that you know right from wrong. You can't be the only one who's right, you know?
 


You can't expect people to read through 6 pages of a forum to find the points you want to make; present quotes. I have been very active in many of the forum topics 8350rocks posts in. Although, yes, he does take pride in the 8350, he is not a blind fanboy; he presents evidence and accepts facts from both sides of the board.
 


And you're basically saying I don't? I mean, yes, there are a couple of times where I mentioned a few things, like the 1ºC, which I would not lie about, and I corrected myself on it, as it was at idle. But otherwise, I'm giving valid reasons. And I meant for him to read that last post I made; it was a few short paragraphs, and any 5th grader could read through it in less than 3 minutes.
 


Honestly, I am not the one pointing at a 3 FPS advantage and claiming superiority. You are that person. I am not a fanboy; I call it like I see it...and $140 is not worth 4-8 FPS in my eyes when the samples are already well above playable FPS rates. By the way, screen names have nothing to do with the price of goats in Africa either.

Further, you're talking about taking benchmarks objectively; I do that anyway. If the benchmark sample differences are inside the margin for error, I consider that a draw, and I have been saying that for 4 pages in this thread...(in case you didn't read the rest of it). Most people want to take a benchmark difference of 2 FPS with a MoE of 10% @ 120 FPS and claim "Intel R dah kingz of Urth!!!". Reality dictates that the sample difference falls well within the MoE for the benchmark and dictates a draw. If you're trying to explain that to me, you're talking to the wrong man; talk to your Intel fanboy buddies who see that and falsely declare victory.
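To put numbers on that margin-of-error point, here's a small sketch (the 10% MoE and the FPS figures are just the example numbers from above, not measured data):

```python
# Sketch: treat any FPS gap smaller than the benchmark's margin of error as a draw.
def verdict(fps_a: float, fps_b: float, moe_fraction: float = 0.10) -> str:
    margin = moe_fraction * max(fps_a, fps_b)   # e.g. 10% of 120 FPS = 12 FPS
    if abs(fps_a - fps_b) <= margin:
        return "draw (difference is inside the margin of error)"
    return "A wins" if fps_a > fps_b else "B wins"

print(verdict(122, 120))  # a 2 FPS gap vs a 12 FPS margin -> draw
```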

Additionally, I am growing tired of debating the same points and providing scientific, well-documented examples only for you to come along and say, "Yeah, but...". I am generally a very patient person, but you are wearing me thin.

Additionally...I was likely building computers and diagnosing issues when you were in diapers...so I am pretty sure I have right and wrong figured out by now. The first PC I built had an AMD K6 266 MHz on a Gigabyte board, back when AMD and Intel fit into the same socket (pre-Slot-A). RAM in DIMM form was the "newfangled technology" and not at all common, IDE was the current HDD technology, and 4 GB was the size of a HDD, not a single stick of RAM (which was measured in MB then, not GB). When I went to school for EE, the term terabyte was thought of as some distant future prospect for when robots ruled the earth. No one could fathom needing to store that much data outside of a huge bank of servers. I went to church with a guy who worked for Motorola who told me that the "new K7" was going to blow the doors off of anything Intel was doing, before the project had gotten very far off the ground at all. (I could have made a fortune on AMD stock if I'd bought at $4 and sold at $60...I still kick myself for that one.)

So, please keep in mind the audience you're addressing. I know what scientific methods are, I have been around the block, and I have been fooling with computers in some form or another for as long as you've likely been alive or longer.

No, I am not always right; I concede when I am not. Though you have to understand, I have seen a lot of BS in the industry I am in, and frankly none of it alarms or surprises me anymore these days. So when I see someone spouting things that clearly don't make sense...I make an effort to correct it, because no one benefits from bad information. There's enough of it out there that I cannot fix; at least here I can make an effort to reduce myths, old wives' tales, and speculation...or at least reduce their time in circulation.
 


I know that this has been said before and by several posters (including myself), but I will try again.

Older games are poorly threaded, and the i5-3570K usually wins over the FX-8350. For instance, a single-threaded game will be using 25% of the i5 but only 12.5% of the FX chip.
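That 25% vs. 12.5% is just one busy core divided by the total core count; a quick illustrative sketch:

```python
# One fully loaded core shows up as 1/N of total CPU utilization.
def single_thread_utilization(core_count: int) -> float:
    return 100.0 / core_count

print(single_thread_utilization(4))  # i5-3570K, 4 cores -> 25.0 %
print(single_thread_utilization(8))  # FX-8350, 8 cores  -> 12.5 %
```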

Recent games are more threaded. Both the i5 and the FX are in a tie on average. See the Gaming CPU Hierarchy chart here at Tom's. They clearly state that if you own the FX and swap it for the i5 (one tier above), you will not see an average benefit in gaming, because you need a jump of at least three tiers.

Future games will be heavily threaded. That, and the close relation of the FX-8350 to the hardware in the next consoles, makes the FX a better choice than the i5. All game developers polled by Eurogamer recommend the FX-8350 over the i5-3570K.

Summary:

Old games --> i5
Recent games --> FX or i5
Future games --> FX


P.S.: You are confusing a fanboy with an enthusiast. He is an enthusiast. You are a fanboy. You have shown that you are one in several threads, including this one.
 
I have to say, you did a very fine job explaining yourself there, and I agree with what you're talking about. The only thing I am going to say is this: after looking over hundreds of benchmarks per GPU and CPU, which add up to tens of thousands in total, and after the hundred articles, along with my years of experience, I still see no way that the 8350 out-trims the i7-3770K. In 80 to 90% of the benchmarks I read, and after my own testing with the GPUs, I find that it is impossible unless you have very slim variables, in which case other hardware allows certain parts to work better than others.

I am not saying you're wrong, but how do I know that you're not just taking the AMD side and posting mostly things that benefit them? I'm not saying that for certain.

The other thing about this is that when I recommend CPUs, I go off of what they're doing, and that is the first question I ask.

After compiling this vast amount of memory and information from my head, it is true and definite that AMD computing is geared more towards video software, multi-core apps, and any kind of arcade-formatted game like CoD or Starcraft. On the other hand, Intel has computation geared for heavy workloads and a wider range of coded functionality, and benefits greatly when running more simulation-style games like ArmA, FSX, and even BF3.

Also, many people disregard the true meaning of threaded apps. There's a big dark side to CPUs, which is like the dark side of the moon. THREADING is the process in which a processor EXTENDS its total core value. Now these new hypothetical/virtual cores have more room to work with, and the simulation games (which are much more threaded than your average game today) BENEFIT from this. The thing HT will do is either a) create more cores that are 'virtual', or b) EXTEND each core as if it were much larger. Now, ArmA is a different story, as the software for it takes advantage of more cores, BUT the coding doesn't work when the CPU wants to extend the cores for even better performance than the separate virtual ones; the game refuses that, and that's where you get the frame decrease. However, when HT is shut off (which is not difficult to do), it runs more efficiently on the Intel, even though the AMD still kicks ass with it.

Everyone says that AMD will benefit from threaded apps, which cannot be considered definite as of yet. Same deal with Intel. The future will tell whether threaded games will use certain code that benefits one over the other, but right now it lies in the hands of time.

As personal opinion based on fact, the games that will really be heavily threaded are simulation games, since they need it, and given what the code for them looks like today, I can see Intel being favored there. It also depends on game type. E.g., Call of Duty works better on AMD because it doesn't require as much code as, say, ArmA, and also the code and architecture of the game are more simplified and fit into the AMD Kernel better. However, with the more complex units of code that come in larger chunks, that is usually where the Intel Kernel excels.

You can have a small wheel run at X RPM for Y speed, and you can have a big wheel run at less than X RPM but the same speed. Think of AMD as the small wheel and Intel as the large one. They can both go the same speed but at different RPMs. This is where AMD gets its high clock and great heat resistance from. They need the extra speed to get the extra workload done, which in turn makes more heat, which needs more heat resistance. Also, the fact that an 8-core 8350 can perform about the same as an i5 and not nearly as well as the i7 (in most gaming cases) instantly proves that each Intel core is much stronger than the AMD one and can sustain a larger workload per core. From a workstation, video, or Bitcoin-mining perspective, the AMD cores are stronger, by contrast. With those kinds of things, the code comes in more like a stream rather than in rapid bursts, and as Intel is trying to chunk up this code and get it out, that's where Intel decreases in performance, while AMD, with its extra cores, can easily stream it out just as it came in. This is what happens with the multiple cores, which are basically relaxing and minding their own business really.
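Roughly, that wheel analogy is just per-core throughput = clock speed x work done per cycle; a toy sketch with made-up numbers:

```python
# Toy model of the wheel analogy: per-core throughput ~ clock * work per cycle (IPC).
# The clock and IPC values below are invented purely for illustration.
def throughput(clock_ghz: float, ipc: float) -> float:
    return clock_ghz * ipc

small_wheel = throughput(clock_ghz=4.0, ipc=1.0)   # higher clock, less work per cycle
big_wheel   = throughput(clock_ghz=2.5, ipc=1.6)   # lower clock, more work per cycle
print(small_wheel, big_wheel)  # both 4.0 -> same "speed" at different "RPMs"
```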

Thus as explained above, if you are doing certain things, pick the cpu that goes with that. Also with tighter budgets, AMDs are the way to go in terms of performance for the more restricted amount of money.

This is what my mind had compiled for the most part. This took about twenty to thirty minutes to type as well, lol. You happy, can I carry on?
 


*sigh* The Ubuntu benchmarks, which are less biased, show the FX-8350 outperforming the i7-3770K in most categories.

I am not saying you're wrong, but how do I know that you're not just taking the AMD side and posting mostly things that benefit them? I'm not saying that for certain.

I could ask the same about you...?

The other thing about this is that when I recommend CPUs, I go off of what they're doing, and that is the first question I ask.

But your recommendations are often counterintuitive.

After compiling this vast amount of memory and information from my head, it is true and definite that AMD computing is geared more towards video software, multi-core apps, and any kind of arcade-formatted game like CoD or Starcraft. On the other hand, Intel has computation geared for heavy workloads and a wider range of coded functionality, and benefits greatly when running more simulation-style games like ArmA, FSX, and even BF3.

Your examples are backward...AMD runs a tight race in BF3 MP, Crysis 3, FC3, Metro 2033, Bioshock Infinite, etc. Intel wins SC2, Skyrim, Civ5, and CoD (except MP). Also, there is no code for Intel that AMD does not incorporate; AMD includes all Intel instruction sets. However, AMD has instruction sets which are not on Intel chips, because Intel does not integrate AMD instruction sets.

Also, many people disregard the true meaning of threaded apps. There's a big dark side to CPUs, which is like the dark side of the moon. THREADING is the process in which a processor EXTENDS its total core value. Now these new hypothetical/virtual cores have more room to work with, and the simulation games (which are much more threaded than your average game today) BENEFIT from this. The thing HT will do is either a) create more cores that are 'virtual', or b) EXTEND each core as if it were much larger. Now, ArmA is a different story, as the software for it takes advantage of more cores, BUT the coding doesn't work when the CPU wants to extend the cores for even better performance than the separate virtual ones; the game refuses that, and that's where you get the frame decrease. However, when HT is shut off (which is not difficult to do), it runs more efficiently on the Intel, even though the AMD still kicks ass with it.

AMD does not have HTT; HTT != real cores. Intel has to divide resources to use HTT, AMD does not have to do that to run multiple threads. So AMD does not "extend core value", whatever you're trying to say. AMD 8 Cores > Intel 4 cores.
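For what it's worth, here's a small sketch (using the third-party psutil package) of how to see the logical-vs-physical core counts the OS reports; on an i7 with HTT enabled the two numbers differ:

```python
import psutil  # third-party package: pip install psutil

physical = psutil.cpu_count(logical=False)  # physical cores as the OS counts them
logical = psutil.cpu_count(logical=True)    # includes Hyper-Threading "virtual" cores

print(f"physical cores: {physical}, logical cores: {logical}")
# An i7-3770K with HTT typically reports 4 physical / 8 logical;
# how an FX-8350's modules are counted depends on the OS.
```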

Everyone says that AMD will benefit from threaded apps, which cannot be considered definite as of yet. Same deal with Intel. The future will tell whether threaded games will use certain code that benefits one over the other, but right now it lies in the hands of time.

This is what bugs me: you concede above that AMD is better at multithreaded apps, but here you say it's not definite? As though the "thousands of benchmarks" are misleading? AMD + multithreaded apps > Intel + multithreaded apps.

As personal opinion based on fact, the games that will really be heavily threaded are simulation games, since they need it, and given what the code for them looks like today, I can see Intel being favored there. It also depends on game type. E.g., Call of Duty works better on AMD because it doesn't require as much code as, say, ArmA, and also the code and architecture of the game are more simplified and fit into the AMD Kernel better. However, with the more complex units of code that come in larger chunks, that is usually where the Intel Kernel excels.

AMD kernel? Intel kernel? AMD and Intel are operating systems now? Additionally, I would be willing to bet you the number of lines of code for CoD is more or less the same as the number of lines of code for ArmA. This is another issue I have: you're talking about things you don't know about, and you try to make it sound like you do...but you don't. What you said makes zero sense...ZERO. I cannot understand any of what you wrote, because it is all self-contradictory, with references to misused technical terms you clearly don't understand.

You can have a small wheel run at X RPM for Y speed, and you can have a big wheel run at less than X RPM but the same speed. Think of AMD as the small wheel and Intel as the large one. They can both go the same speed but at different RPMs. This is where AMD gets its high clock and great heat resistance from. They need the extra speed to get the extra workload done, which in turn makes more heat, which needs more heat resistance. Also, the fact that an 8-core 8350 can perform about the same as an i5 and not nearly as well as the i7 (in most gaming cases) instantly proves that each Intel core is much stronger than the AMD one and can sustain a larger workload per core. From a workstation, video, or Bitcoin-mining perspective, the AMD cores are stronger, by contrast. With those kinds of things, the code comes in more like a stream rather than in rapid bursts, and as Intel is trying to chunk up this code and get it out, that's where Intel decreases in performance, while AMD, with its extra cores, can easily stream it out just as it came in. This is what happens with the multiple cores, which are basically relaxing and minding their own business really.

AMD modules are stronger than Intel cores, by about 60% if the Intel has HTT; if it doesn't, they're about 80% stronger. Why do you keep referencing code? Code has zero to do with CPUs...other than instruction sets being processed. You're trying to tie two things together unsuccessfully and cannot talk about either one on a technical level.

Thus as explained above, if you are doing certain things, pick the cpu that goes with that. Also with tighter budgets, AMDs are the way to go in terms of performance for the more restricted amount of money.

This is what my mind had compiled for the most part. This took about twenty to thirty minutes to type as well, lol. You happy, can I carry on?

I honestly hope you understand how done I am with this conversation...every time you reply, I feel compelled to correct your bad information, and it's nearly driving me nuts trying not to do it. If I read another line talking about AMD's kernel, or Intel's code, my eyes might begin to bleed from the sockets...or my head might explode.
 
Buy the AMD FX 8350 and buy a better graphics card. If you buy an i5, your graphics card will have to be cheaper. And we all know what the most important part of a gaming PC is: the graphics card.
 