Is AMD FX Still Viable For a System Build? Rev. 2.0



The theoretical side wasn't an issue for me. It was the actual power consumption, which resulted in more heat and made it harder to cool the room in the summer. On the flip side, it sure was helpful in the winter. :lol: It was always noticeable when my FX 8320 was fired up. The room always seemed to get twice as hot compared to when my two i5's were running.
 


I hope that you are joking. The FX 8-core systems do run hot, but no more so than an overclocked i7. A processor like the i7 4790K has a TJ Max of ~100C, whereas the FX 8-core processors have a TJ Max of only 70C, and both run hot when overclocked. When overclocking an i7 4790K, temps of 70C or under are the target; when overclocking an FX 8-core system, temps of 60C and under are the target. In short, an overclocked i7 runs hotter than an overclocked FX 8-core.

Yes, the FX 8-cores tend to use more power than their Intel counterparts, and more power means more heat, but the extra power consumption costs roughly $2 a month at most. As for the heat, there is no way an FX 8320 can actually heat your room unless you are literally sitting in a closet with it, and even then your own body heat will raise the temp more than the computer will.
 
I am a little skeptical that you can feel the difference in the amount of heat generated in a room by 2 different processors (unless the room is very tiny and has poor air circulation). A 75 watt light bulb will give off more heat than a 60 watt bulb, but can you tell the difference? Plus, other components in the computer also give off heat, so isolating it to the CPUs in 2 different rigs would be difficult.

I'm sure that a physicist or electrical engineer could give us a formula to determine the therms created by 2 different wattage CPUs over a specified time interval, but my guess is the difference wouldn't be significant enough for the human body to sense.
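For anyone curious, the math is simple: essentially every watt a CPU draws ends up as heat in the room, so the difference in heat output is just the wattage gap over time. A rough sketch in Python (the 125 W / 55 W draws and the 4-hour session below are assumed example figures, not measurements):

# Rough comparison of heat dumped into a room by two CPUs.
# Assumption: essentially all electrical power a CPU draws becomes heat.

def heat_btu(watts, hours):
    """Heat released in BTU for a given sustained power draw and duration."""
    kwh = watts * hours / 1000.0   # energy in kilowatt-hours
    return kwh * 3412.14           # 1 kWh is roughly 3412 BTU

hours = 4                          # an evening of gaming/folding (assumed)
hot_cpu = heat_btu(125, hours)     # e.g. an FX 8320 under load (assumed draw)
cool_cpu = heat_btu(55, hours)     # e.g. a lower-power i5 under load (assumed draw)

print(f"Difference: {hot_cpu - cool_cpu:.0f} BTU over {hours} hours")
# ~955 BTU over 4 hours; for scale, a small 1500 W space heater
# puts out roughly 5,000 BTU every hour.

So the gap is measurable but modest; whether a body can feel it depends mostly on the size and ventilation of the room.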
 
I said i5, not i7. There was always a noticeable difference when the FX 8320 was fired up, compared to when I was running my i5 machines. I could run my main 3570K rig and my i5 2400 rig, and it was always cooler in that room than when I had my main and the FX running. I probably didn't explain what I meant very well earlier. SNB chips were known to run cooler because they were soldered, versus the crappy paste Intel used afterwards.
 
On the electric bill argument:

You won't notice a cost difference between a 55 W CPU and a 125 W CPU. Your toaster, oven, microwave, light bulbs, AC, and everyday appliances cost far more to run, yet no one complains about them the way they would about a 70 W difference between CPUs.
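To put a rough number on that (the 4 hours of load per day and the $0.12/kWh rate below are assumptions; substitute your own usage and utility rate):

# Rough monthly cost of the extra power drawn by the higher-wattage CPU.
extra_watts = 125 - 55             # wattage gap between the two example CPUs
hours_per_day = 4                  # assumed hours at load per day
rate_per_kwh = 0.12                # assumed utility rate in dollars per kWh

kwh_per_month = extra_watts * hours_per_day * 30 / 1000.0
print(f"Extra energy: {kwh_per_month:.1f} kWh/month")
print(f"Extra cost:   ${kwh_per_month * rate_per_kwh:.2f}/month")
# Roughly 8.4 kWh and about a dollar per month at these assumptions.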

On the heat argument:

Your GPU generates much more heat than your CPU. Switching to an iGPU would produce a more noticeable result.
 


+1 for the build quality of Sandy Bridge chips (soldered) vs. the later Intel generations. They were better built, usually overclocked better, and stayed cooler. Intel cheaped out on the later generations' overall build quality.

However, I'm still not convinced an FX 8320 can actually increase the temperature of a room. I have a more powerful chip at high overclocks that vents cool air out the exhaust fans unless it's under load, and even then it's only warm air (not hot like a hair dryer or something) - never enough heat to actually affect the ambient temp of a room.
 
I don't think it was the GPU causing the difference, as they were identical cards. I had 2x HD 5850s in CrossFire and moved them to my two other systems when I replaced them with an HD 7970. The i5 had better cooling too. I was running an old Thermaltake all-copper tower cooler; it was able to handle my 3.6GHz Kentsfield X3210, so the i5 2400 wasn't even a challenge. My FX 8320 was bumped to 4.0GHz using an old Zalman 9500 that a friend had given me. It was better than stock cooling, but that chip always was a hot one.

Granted my main is now folding, but it always warms up the room it is in. I sold my other 2 systems to my nephew.
 
Anything from a 970 or 390 up will use over 300 W at load. On the CPU end, however, even the most extreme FX 9590 tops out at 350 W, while the 8370 tops out at only 180 W*. Furthermore, the CPU is far less likely to run at maximum heat and power than the GPU, which makes GPUs the top contributor to heat and power consumption.

I would look at the GPU end of things if heat is an issue.

*correction: The 8370E tops out at 184 W, while the 8370 tops out at 204 W.
 


Actually, the GTX 970 uses about 150-160 W in a typical gaming scenario, and the R9 390 uses about 250-260 W. At full load it's about 200 W for the GTX 970 and 365 W for the R9 390.

 
My experience has been somewhat different. I have an FX-6300 @ 4.3 w/ R9 290 ref., an FX-8370E at stock w/ R9 280, and an Athlon X4 760K @ 4.6 w/ R9 280 in the same room, about 80 sq ft. There's no noticeable difference in temp with all 3 rigs on versus off.

The only time the room temp gets noticeably hotter is when the R9 290 starts blowing heat out while playing Witcher 3.
 


I am sorry, you were right. I misread where it says total system in watts 😛
 
I very much love my FX-8320 / R9 270 / 8GB DDR3 2133MHz system. The CPU was a hell of a steal when I bought it in 2013 for $135 CND, when the cheapest i5 was $185 CND. Now, sadly, there isn't a single i5 for less than $230 CND and even the FX 8xxx series is over $200 CND...

Anyhow, I chose the AMD FX processor as an upgrade to my Core 2 Quad Q8200 because it soundly trounced that CPU in every possible scenario. People tend to ignore how AMD includes every CPU feature, such as hardware virtualization, across their entire lineup, where Intel does not. The FX is a well-rounded, competent CPU for all workloads. Is it the best? Nope. Is it the worst? Nope.

Some of the things I do with my FX 8320:

Linux:
- Blender Game Engine, 3D modeling and C++/Python based game logic.
- Audacity audio editing.
- Sunvox multi channel synth and music tracking program (amazing program btw).
- C++ programming and compiling with 8 cores using GCC (more than twice as fast to compile the game I was building back in 2013!)
- GIMP graphics editing, textures, etc.
- Virtualbox running 2 or 3 SWGEmu servers (Debian 6 and 7) at times in a Windows 7 host. Not all the time, just when programming, but the experience was always smooth and uneventful.

Windows:
- Games such as Guild Wars 2, Planetside 2, Star Wars the Old Republic, and many others run quite nicely at 60 FPS with 4x MSAA or FXAA, with shadows on low or medium. Generally speaking, I am kind of snobby when it comes to graphics (if only EQ2 didn't look and play so awful...), but what I appreciate most is a smooth frame rate as much as possible. My setup, on my 20" 1600x900 monitor, delivers this for me and it did so on a tight budget. Sure, the frame rate drops when too many people are around in PS2 or GW2, but the same thing happens to overclocked i7s so... yeah.

I don't regret my purchase at all. Not because I am a "fanboi", but because the reality is that the system "just works". Given how I don't really play games all that often anymore, I probably won't upgrade for 3 or 4 more years at least, provided the system doesn't blow up, etc. lol...
 


My experience has been that the heat is barely noticeable at all. Walking from room to room in my house, my computer-building room (where my PC is) is always the same temperature as the rest of the house. I do a lot of video editing and rendering on my rig, so the CPU usually gets a good workout. My Sapphire R9 290 is overclocked heavily and I play an assortment of games including Witcher 3 and Fallout 4 on Ultra settings, so again the entire system gets a good workout. I have never walked into the room and felt it warmer than the rest of the house.

I think a lot of it may be due to the overall cooling setup as well. I have 8 120mm cooling fans in my case - 5 pulling cool air in from the bottom, front, and side of the tower and 3 pushing warmed air out the top and back. I also have an NH-D15S in a three-fan push/pull configuration. The NH-D15S with its dual large towers dissipates a lot of heat, and with the 8 case fans and the tri-fan setup on the cooler, the air blown out of the case is just barely warm and never feels hot, even at full load for extended periods of time.
 
Nice, I only have 3 120mm case fans in a 2-1 negative-airflow setup, a Cryorig H7, and the reference blower on my 290. I tinkered with the fan settings so that my 290 never throttles or even reaches 80C. I don't like the stock 95C target for the GPU.

Anyhow, heat isn't an issue in my room, and I live in the South.
 
 
I will say this about AMD FX CPUs in Canada these days:

http://www.newegg.ca/Product/ProductList.aspx?Submit=ENE&N=100007670%20600213781%20600436886%208000&IsNodeId=1&bop=And&Order=PRICE&PageSize=30

The regular, non-sale prices are at least $20 higher today than they were in 2013.

FX-4300 was $109.99 and is no longer available.
FX-4350 was $119.99 and is now $139.99
FX-6300 was $134.99 and is now $153.99
FX-8300 is currently $179.99 and was not available back then.
FX-8320 was $165.99 and is now $209.99!
FX-8350 was $189.99 and is now $244.99!

For reference, the cheapest Intel i5 was $184.99 and is now $264.99. An i7 was $299.99, but now we have to pay $404.99... and that's just Haswell, Skylake is even more expensive. Crazy, eh?

Honestly, at the current prices in Canada you really are better off to buy an Intel i3 or i5, because the FX CPUs just don't have the same budget value that they used to.

I know that our dollar is low, but hard drives, solid state drives, and RAM are all cheaper today than they were back then. Cases, power supplies, and monitors have stayed about the same. It's only CPUs and video cards that have gone up in price.

Note: I picked up my FX-8320 for $134.99 on "Black Friday" in 2013. The "was" prices I gave are from my nerdy research at that time. :)
 
He probably is and it certainly does.

That said, he's right inasmuch as 99% of the world's computer-using people are perfectly happy with standard systems. Only the tiny minority with little more to do than play games makes all this fuss over that extra FPS or squeezing another watt out of the electricity supply.

I so hope that doesn't sound insulting.
 


+1, not at all insulting, and I totally agree. I've never had a person for whom I've built an FX rig come back and say it's "garbage". I always interview a person before building a system; if they are the type that always needs the best of everything and the newest, latest toy, then I almost always build an Intel rig for them. If they are just your standard everyday computer user, I build the best performance/cost system, and that usually (but not always) ends up being AMD.

Most people who want to get into PC gaming ask for a PC with the same performance as an Xbox One or PS4. That always makes me laugh, as an FX 6300 or i3 6100 with proper dedicated graphics will far outperform either console. Right now the FX 6300 total build cost is usually $30-$40 cheaper, so it comes down to whether they will be doing things other than gaming, such as video editing etc. For nothing but gaming the i3 6100 is a great value, but for gaming plus multi-tasking and multi-threaded applications the FX 6300 is still better.

I also build a lot of FX 8370(E) systems for local college students who are on a tight budget. They are always impressed by the workstation performance of the processor and are usually surprised that their total build cost (even when overclocked) is on par with an i5. Although the i5 has great gaming capabilities, it doesn't have nearly the same workstation performance for heavily multi-threaded applications.

There are a lot of people out there who were talked into much more powerful rigs than they actually need and will ever use. Case in point: people I deal with all the time who have powerful Intel gaming builds on 720p and 1080p 60Hz monitors and are perfectly happy with the performance. What they don't know is that they would be getting the same performance (with their current monitor) on a usually much cheaper AMD FX build.
 
For 720p rigs, I generally say an i3 and a GTX 950 is the max really needed. The FX 6300 can be a pretty good choice too if the price is right, like in the case of having an MC close by. The FX 6300's stock all-aluminum cooler is horrible though, so a $20 Logisys Ice Wind Pro would be a must, or a Cooler Master TX3 if case depth is an issue. The problem is that puts it back within about $10-15 of an i3 6100 with an H110 board, versus the FX 6300 with a Gigabyte GA-78LMT-USB3 and cooler. Getting the cooler and the FX 6300 is still cheaper than getting the FX 6350, which comes with the better cooler that the FX 8320 also includes.

The 750 Ti is generally enough, but I prefer to go up at least one tier for longevity's sake, or in case they move up to 1080p. I really hope AMD's next gen makes the midrange more interesting again; rebadged cards from 4 years ago have become boring. I pretty much ignore all AMD cards below the 380, unless price becomes an issue, where the 370 makes sense.
 


Not at all. For one thing, there's nobody to insult, unless silicon has feelings. It was merely a joke about how the typical person just checks email and does all that stuff on their computer. It was a joke, lighten up :)

I wasn't calling AMD the "dookie of a horse" - I was referring to something even older and worse than AMD's FX line that would still work.
 
