AMD FX-8370E Review: Pulling The Handbrake For More Efficiency

I had the same thought. I'd like to think they didn't, so there's something left on the table to improve upon.

The -E processors are Piledriver with the power usage generally under control, assuming retail and review samples perform identically. It's sad that it's taken two years to get something that might perform close to an SB i5 at similar power usage in heavier workloads; still, it's no Steamroller FX, but would that even help outside of the one area where FX is already relatively strong?
 
"New" 32nm cpus in 2014... Honestly, what is AMD thinking? Broadwell will send them bankrupt if they continue with the now almost two and a half years old Piledriver microarchitecture. I know they brought some improvements to it, but it just can't compete with the 22nm and, soon enough, 14nm Intel products. Even the mobile chipsets are moving from 28nm to 20nm. Come on, AMD, we could really use the competition right now.
 
"New" 32nm cpus in 2014... Honestly, what is AMD thinking? Broadwell will send them bankrupt if they continue with the now almost two and a half years old Piledriver microarchitecture. I know they brought some improvements to it, but it just can't compete with the 22nm and, soon enough, 14nm Intel products. Even the mobile chipsets are moving from 28nm to 20nm. Come on, AMD, we could really use the competition right now.
 
While I don't mind the presence of AMD GPUs (though I still prefer Nvidia), AMD CPUs are not very good, and they should either start making major strides or just give up on them.
 
This was a very nice, well-rounded review.

I wish AMD had sent some variety in motherboards, both higher and lower end. But I think the motherboard chosen was a good middle-of-the-road pick; it did have its quirks.
 


No down vote, but I'd like to clarify a bit. I'm not a fanboy by any means, and if I sent that vibe out, that's not what I meant to do. I actually had some buyer's remorse before I had my rig all set up, as I opted for the cheap 8320 ($140 at the time) over an i5 3570K so I could step up my GPU and have a decent Asus board. Now that it's running I'm OK with it, as it's much faster than my old rig. If Intel had an i7 that was in my price range for a new build and it had good performance for its cost but a TDP of 140 watts, that would not deter me at all. The only CPUs I think are too much are the FX-93xx, as there is no affordable way to cool them lol.

All that out of the way, this is a reasonable CPU option for those who care more about efficiency or are motherboard TDP-limited and want to upgrade from a Bulldozer CPU, but I still think it costs too much for its performance. AMD (with Piledriver) will do the most damage right now on price-to-performance, not outright speed and efficiency. This one is approaching i5 territory on price, and the i5 is faster. In my humble opinion, they would have been better off retiring some of the older models and sending in the new ones.
 


The problem is that too many (or most) review sites only test or benchmark new parts with the highest-end system they have
(when they test a CPU they use the highest-end GPU they have, a 290/780 Ti; when they test a GPU they use only an i7).

Nothing wrong with this; their intent and method are correct
(to show how good the new part is without it being bottlenecked by other parts in the system).

But they seem to forget or neglect to benchmark with a mid-level system to promote balanced builds.
It could be that they lack the time or funds, or simply don't care...
It's fine for people who are already informed, but it's not educating for newcomers.

Edit: on another thought,
"maybe this is a reason why I trust Tom's reviews more than other sites' reviews 😀"
 
"""Depending on the game in question, AMD’s new processor has the potential to keep you happy around the AMD Radeon R9 270X/285 or Nvidia GeForce GTX 760 or 660 Ti level.

A higher- or even high-end graphics card doesn’t make sense, as pairing it with AMD's FX-8370E simply limits the card's potential."""

Wow, what a horrifically misleading oversimplification of the issue. I expect better from Tom's Hardware articles and reviews than this. This is the sort of poorly derived philosophy you can scrape from the bottom of the forum barrel filled with amateurs.

The CPU should be selected based on the FPS one wants in the types of games and conditions one wants to play, period. The GPU has nothing to do with this, as no amount of GPU, big or small, can solve a compute-side performance problem.

The GPU should be selected based on the VISUAL QUALITY (which includes resolution) one wants to play at, factoring in the desired FPS. This part of the component selection has nothing to do with the CPU.

Match the CPU to the compute workload. Match the GPU to the render workload. Any philosophy that attempts to match the CPU to the GPU, or vice versa, is fundamentally flawed and I am ashamed to be reading this sort of drivel on what is supposed to be one of the most highly regarded hardware review sites around.

The R7 250X, R9 270, and R9 290 will all achieve approximately the same FPS when connected to 720p, 1080p, and 1440p displays respectively, all other things being equal. Note that the size and cost of the GPU basically quadruples from the 250X to the 290, yet all three configurations produce the same FPS under otherwise equal conditions (same CPU). What changes with higher-end GPUs is the visual quality available at the desired resolution. Whether the goal is 60 FPS at 720p or 60 FPS at 1440p, the compute requirements of the game won't change much; you'll have to pick a CPU that can hit the desired FPS in the intended game and conditions regardless of which GPU is selected.

Any CPU can be an appropriate match for any GPU if the CPU has been selected to fulfill the compute workload presented by the game and the FPS desired, and the GPU has been selected to fulfill the visual quality and FPS desired. If the goal is to play on a 4K monitor at 30 FPS, then you'll need a flagship GPU for that. Fulfilling the compute side of that 30 FPS goal requires nothing more than a $75 CPU in 99% of games out there; a 750K makes a good match for an R9 290 with such a goal. Conversely, if the goal is to play a compute-intensive multiplayer game at a competitive 144 FPS, then an overclocked i5/i7 is the only CPU worth considering, regardless of which GPU is chosen. The render workload per frame is adjustable, so any competent GPU could hit the 144 FPS goal with the proper settings in most games. The GPU's "size" has nothing to do with the CPU selected; it has to do with the desired visual quality.
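To make the logic of that argument concrete, here is a minimal sketch (my own illustration, not anything from the review): the frame rate you actually see is roughly the minimum of the CPU-side ceiling and the GPU-side throughput at the chosen settings, so a bigger GPU paired with the same CPU buys visual quality, not FPS. Every name and number below is a hypothetical placeholder.

```python
# Rough model of the argument above: the FPS you actually get is limited by
# whichever side is slower. All figures are hypothetical placeholders.

def achieved_fps(cpu_fps_ceiling: float, gpu_fps_at_settings: float) -> float:
    """The slower of the two limits sets the frame rate."""
    return min(cpu_fps_ceiling, gpu_fps_at_settings)

# Hypothetical CPU-side ceilings for one game (compute workload only).
cpu_ceilings = {"$75 budget CPU": 70, "FX-8370E": 90, "overclocked i5/i7": 160}

# Hypothetical GPU-side throughput for the same game at different quality targets.
gpu_throughput = {
    ("R7 250X", "720p medium"): 65,
    ("R9 270", "1080p high"): 65,
    ("R9 290", "1440p ultra"): 65,
    ("R9 290", "1080p high"): 150,
}

for cpu, cpu_fps in cpu_ceilings.items():
    for (gpu, settings), gpu_fps in gpu_throughput.items():
        fps = achieved_fps(cpu_fps, gpu_fps)
        print(f"{cpu:>17} + {gpu:<7} @ {settings:<12} -> ~{fps:.0f} FPS")
```

Under that model, swapping a mid-range card for a flagship at the same settings only raises FPS until the CPU ceiling is reached, which is exactly the point being made: the CPU sets the FPS limit, the GPU sets the achievable visual quality.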
 

I agree. Couldn't they negotiate the following agreement with all the companies they do reviews for?
1) Buy a review sample at retail
2) Benchmark it
3) Send it back to the company to have ~80% of the purchase price reimbursed (can't expect the company to pay for retail margins), or have it reimbursed without sending it back.
 
Wow, what a biased review. The benchmarks don't show a lot of difference between the overpriced Intel dogshit and the AMD CPUs. Tom's Hardware does it again.
 
Is it me, or is Tom's Hardware full of Intel fanboys? They do it again in this article.

"Gee, AMD might have sent us rigged hardware to review" and "Gee, the CPU performs as well as overpriced i7 CPUs from Intel in most relevant scenarios, but gee, it's AMD so it's bad, 'cuz we've been saying this in every CPU article we write."
 
I have an i5 3570K that I bought in Sept. 2013. It runs at 4.5 GHz at 1.26 V, and it runs at that speed with as little as 1.22 V. I paid $180 for it and $30 for a XigmaTek Gaia 1283 cooler. At the time my only other option was to go Haswell, but it was new and the overclocks were not looking good at 4.2-4.5 GHz. I didn't worry about the 1155 platform giving way to 1150, as I knew DDR4 was around the corner with new chipsets.

I am not an Intel fanboy; I just buy what performs best. I have a desk drawer full of old AMD CPUs from before and after Barton, including a dual-core Athlon 64 X2 3800+ and a couple of others.

 
Old stuff on an obsolete chipset made efficient, at an out-of-its-mind price. Change the chipset, update the production node and architecture, return to four cores.
The i3 pwns it in gaming, and the i5 costs the same.
 
I really can't blame AMD for introducing it at $200 when there are so many out there who still insist (in defiance of all evidence) that the FX-8xxxs are comparable to Core i5s. May as well get that $200 for a few weeks or months while they can, but price drops are inevitable later on.

If those people want to pay $200 for FX-8370Es, I don't mind. Their sacrifice helps keep AMD around.
 
Well, considering Intel spends more money on R&D per year than AMD makes, and given Intel's tick-tock system, it will be a while before AMD is in direct competition unless they design something truly revolutionary. Intel is set on a path and buys up any company that develops something new. So AMD needs to strike while people are complacent.
 
The FX-8320E for $140 seems like a great deal to me (at least, that was the price recently on Amazon). It may not have the raw execution resources per core to compete in real-time workloads, but it will still make a fantastic workstation chip at that price point. Keep in mind that benchmarks of real-time workloads and benchmarks of application workloads can't be held to the same standard, as they affect us differently. Real-time workload performance is something we actually notice in terms of user experience (gaming), while application performance is harder to pin down in terms of how it "feels." Whether a task finishes sooner or later is actually not as relevant as whether the machine stays responsive to our real-time input while working on multiple things at the same time. In this regard, a $140 FX-8320E is better suited to a typical multitasking workstation environment than a competing i3-4350. Especially when you consider that the FX-8320E supports ECC memory (on many inexpensive motherboards) and IOMMUs.
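One way to see the responsiveness-versus-throughput distinction in practice: a rough sketch (my own illustration, not the poster's methodology) that times a tiny "interactive" task while every core is busy with a batch workload. Workload sizes and sample counts are arbitrary placeholders.

```python
# Separates the two things the post distinguishes: batch throughput (when the
# background work finishes) versus responsiveness (how quickly a tiny
# interactive task gets CPU time while that work is running).

import multiprocessing as mp
import time

def busy_work(n: int) -> int:
    """CPU-bound batch task standing in for a background application workload."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def interactive_latency() -> float:
    """Time a tiny task standing in for handling real-time user input."""
    start = time.perf_counter()
    sum(i * i for i in range(10_000))
    return time.perf_counter() - start

if __name__ == "__main__":
    cores = mp.cpu_count()
    with mp.Pool(cores) as pool:
        t0 = time.perf_counter()
        batch = pool.map_async(busy_work, [5_000_000] * cores)  # saturate every core
        samples = [interactive_latency() for _ in range(100)]   # measure while loaded
        batch.wait()
        throughput_time = time.perf_counter() - t0
    samples.sort()
    print(f"batch finished in {throughput_time:.2f} s")
    print(f"median interactive latency under load: {samples[len(samples) // 2] * 1e3:.2f} ms")
```

A chip with more (even if slower) cores can post a worse batch time yet keep the interactive latency low, which is the "feels responsive" quality the post is describing.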
 


A real-world benchmark is one where the results represent performance that is practical in some way; that includes games and applications. The opposite is a synthetic benchmark, like Cinebench, 3DMark/PCMark, Fritz, etc.

A $140 FX-8320E might represent better performance per dollar, but professional computation is not really a "value-oriented" market. Companies that need performance workstations are happy to invest in $2000 Xeons because they still easily make their money back. Even individuals who need beefy workstations for their livelihood are going to get a better return from a Core i7 than from faffing about with 8320Es.
 


Considering how many of those $2000 Xeons are slower in real-world single-client workloads than a $250 E3 Xeon, any company that throws that sort of money at the problem without assessing the workloads its users actually generate isn't going to last long.
 