AMD CPU speculation... and expert conjecture

moar fx 8370E reviews.
anandtech
http://www.anandtech.com/show/8427/amd-fx-8370e-cpu-review-vishera-95w
techreport
http://techreport.com/review/26996/amd-fx-8370e-processor-reviewed
pcperspective
http://www.pcper.com/reviews/Processors/AMD-2014-FX-Refresh-FX-9590-FX-8370-and-FX-8370e-Review

edit: radeon r9 285 reviews are out too.
http://www.tomshardware.com/reviews/amd-radeon-r9-285-tonga,3925.html
http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed
http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/
according to the tpu review, power use still jumps during blu-ray playback. i was hoping it'd go down this time.
 

8350rocks

Distinguished
Just a question...

Where are all the "PPW IS KING!!?!?!" people now that Intel is launching a 140W product? No one is complaining about space heaters and whatnot...but that is exactly what it will be. Especially considering Intel normally runs hotter than AMD...
 

vmN

Honorable
Oct 27, 2013
1,666
0
12,160

Because you actually get differences other than clock speed.

It is not just a Core i7-4790K clocked higher.
 

.....integrated pcie gen 3.0 controller with up to 40 lanes, quad-channel ddr4, 8 cores, 20MB max. L3 cache (2.5MB/core), a dubiously integrated embedded vr (die vr, an acronym i just made up), all within a 356mm^2 die, packing over 2x the transistors of the fx8350 (1.2B on 315mm^2 according to a.t.). according to tom's review, it seems to use the same 125w (on avg., max. 128w) as the fx8350 at 4GHz, discounting loss to the ivr. but the comparison isn't exactly apples to apples, as we don't have a 22-28nm amd cpu (SIGH, do something, amd.). if tom's data is right, then intel still holds the ppw lead in those benchmarks at 4.0GHz and lower.

hopefully intel won't backtrack on the transistor count. but who knows.... :whistle:
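the density and perf-per-watt arithmetic above is just division; here's a quick sketch of it. only the fx8350's 1.2B transistors on 315mm^2 and the 356mm^2 die are from the post; the "over 2x" transistor count and the benchmark score are placeholders, not figures from any review:

```python
# Rough transistor-density / perf-per-watt arithmetic for the comparison above.
# FX-8350 figures (1.2B transistors, 315 mm^2) are from the post (via AnandTech);
# the other chip's transistor count is a hypothetical "2x" placeholder.

FX8350_TRANSISTORS = 1.2e9
FX8350_AREA_MM2 = 315.0

NEW_CHIP_TRANSISTORS = 2 * FX8350_TRANSISTORS  # "over 2x" per the post (placeholder)
NEW_CHIP_AREA_MM2 = 356.0

fx_density = FX8350_TRANSISTORS / FX8350_AREA_MM2
new_density = NEW_CHIP_TRANSISTORS / NEW_CHIP_AREA_MM2

print(f"FX-8350 density:  {fx_density / 1e6:.1f} M transistors/mm^2")
print(f"new chip density: {new_density / 1e6:.1f} M transistors/mm^2")

# Perf-per-watt = benchmark score / watts. The score here is a made-up
# placeholder; the ~125 W figure is the package power quoted in the post.
def perf_per_watt(score, watts):
    return score / watts

print(perf_per_watt(1000, 125))  # hypothetical score at ~125 W
```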
 

truegenius

Distinguished
BANNED


my reaction as soon as the music started :mmmfff:


i should have read that it is from 1984 :whistle: , well before my birth date :p
 

Morales2142PL

Distinguished
Jan 8, 2010
37
1
18,540
So is AMD going to produce any CPUs in the future? Or are they done, and will they only produce APUs for now? I want to know because I have always had an AMD CPU and they were great, but if I upgrade from the FX-8350 in the future, then it will have to be AMD; I won't go Intel unless it is necessary.
 

anxiousinfusion

Distinguished
Jul 1, 2011
1,035
0
19,360


I believe Cooler Master supplies the box coolers for AMD, and I agree. They package 95W/100W TDP products with coolers I can only assume were designed for a fraction of that.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Oddly enough, Blender's awful performance on Windows has to do with MSVC. It is compiled with MSVC, and it does have all the flags set for generic SSE and AVX; yet if you get a mingw build on Windows, it will still blow the MSVC one away. I went over whatever the MSVC equivalent of makefiles is for Blender, and I had a nice discussion with a developer on IRC about it. There are loads of other builds on graphicall.org, as well as plenty of other users on the Blender forums, who see huge gains by ditching MSVC.

From my experience, you have it flipped: optimized MSVC for Blender is absolute trash, and GCC destroys it. And yes, I saw the same results on both an i7 920 and an FX 8350, so it's not just an "MS is crippling AMD" thing!

And oddly enough, glancing over the LAME source code, the Windows port is also MSVC-based. I saw huge gains going from the official MSVC build from the LAME website to a custom GCC build. From my experience, MSVC is not a good compiler at all, for Intel or AMD.
 

8350rocks

Distinguished


Yes, AMD will produce more HEDT CPUs in the future...that much is unquestionable.
 


The last time I saw significant GCC vs MSVC benchies was in early 2013. At that time, MSVC was about 10% faster than GCC when both were using optimum switches for a given processor arch. And this was across a couple dozen benchmarks, not just one. That's not to say the situation hasn't changed since, but it's not something that's benched often.

EDIT

I should note that a late version of GCC 3.x.x was tested, so I wouldn't be shocked if GCC caught up in the 4.x.x branch.

And it goes without saying, ICC wins even without optimizations.

So basically:

Performance: ICC
Portability: GCC
IDE/Debugger: MSVC

Whichever is important to you drives which one you use.
 

logainofhades

Titan
Moderator


It will just be awhile before we see them. AMD will be more difficult to recommend until they do.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


AMD is transforming itself into a SoC company

AMD_Transformation_575px.png


During the last reorganization, the former CPU and GPU business lines were fused. This seems to be a consequence of APUs, rather than CPUs, now being AMD's main business.

However, we would still expect a new line of high-performance CPUs for 2016--2017. There are unconfirmed rumors about 8/12/16-core CPUs based on the new K12/Zen architecture.
 

jdwii

Splendid


How so, when the FX 8370 series is competing with a locked i5 at 3.2GHz? I tested this over and over again: in newer games the 8350 wins, and in older games it loses, but both are playable. What makes more sense for a future gaming build?
Then the 6300 is going to be priced at $99, and for $30 you can get a 212+ cooler and easily OC that FX-6300 to 4.4-4.6GHz; you can do that for the same price as an i3, which is a horrible choice for gaming due to latency issues. Not to mention that for modern apps the 6300 easily wins anyway.
Then we have the FX-8320, which is priced against the mid-range i3 CPUs; I have little doubt in my mind that when a user plays modern games or uses modern applications, the FX-8320 will come out on top even before overclocking.
Now, I do agree with you about the FX 4300 and FX 9000 series; I would not recommend those to anyone.
 

No. According to the article, a larger number of Nvidia cards are compatible with G-Sync, while AMD is making only select new Radeons compatible with FreeSync.
 

jdwii

Splendid

"By contrast, Nvidia's G-Sync works with GeForce graphics cards based on the Kepler architecture, which include a broad swath of current and past products dating back to the GeForce GTX 600 series. "

I now see that, but still: more of today's Nvidia cards will work with tomorrow's technology. It sucks that so many people still own a 7970 GHz Edition or older series GPUs, while the 290/285/260X/7790 are not even that popular. Plus, isn't this technology even more important on weaker cards that can't hit 60FPS as easily?
I guess the end game is what will be cheaper in terms of monitors for new customers. If their video card already supports G-Sync, it might make more sense to just spend $100-200 more on a monitor than on a new video card, even more so since you keep your monitor longer than a video card.

Edit: the more I think about it, the more I think FreeSync will win it all, since it will become the next standard; it just sucks for AMD right now.
 

logainofhades

Titan
Moderator


For a price similar to an FX 8370, 990FX motherboard, and aftermarket cooling, I could get a Xeon 1231v3 and an H97 board. The same goes for an overclocked FX 6300 rig vs a locked i5. The only time I really ever recommend FX is if you're on a budget and near a Microcenter.
 

jdwii

Splendid


The truth of the matter is you don't need a 990FX board or an 8370 when the 8350 is $180, or a $100 cooler when a 212+ will do just fine. I can't even find that CPU, but Intel claims it's $240-250, WAY more than the $180 CPU or the $150 FX-8320. And it's LOCKED at that speed as well. That extra money can easily be put into a video card, where it matters more. An FX 6-core will not bottleneck many, if any, new titles coming out, and if it does, an i3 would bottleneck the build even more. Again, the $100 CPU is not competing with a $240 CPU or a $190 CPU; it's really that simple.
Perhaps you don't live in the United States?
It's rare that I meet someone who doesn't have a budget for a gaming build; most of the time it's around $600-800, with some at $500. If, however, they had unlimited money to spend on a build, I would recommend the best.
 

jed

Distinguished
May 21, 2004
314
0
18,780


Don't get beside yourself thinking you can snatch victory from the jaws of defeat. The FX8350 or FX9590 is on par with the i7 2600K, not the 3960X, as you see here.

http://www.headline-benchmark.com/results/d2e23042-10b6-47ed-94c3-a0fee6e31b15/c309c1f4-6f00-4def-8127-b8d679fe5989

and here

http://www.headline-benchmark.com/results/d2e23042-10b6-47ed-94c3-a0fee6e31b15/403b1068-88fd-42e7-a0a7-8c9ae4d57e7f

and also here

http://www.headline-benchmark.com/results/9a411141-526d-4452-8e78-1c9891c2efa0/c309c1f4-6f00-4def-8127-b8d679fe5989
 

8350rocks

Distinguished


This.

You can buy an FX8350 ($160.00) + M5A97 R2.0 ($90.00) + CM 212 EVO ($30) for what a mid-range i5 + MB costs, and the 8350 will be a much better buy.

Now, does anyone want to talk about the FX 8320 @ $120.00? An 8-core overclockable CPU w/MB + HSF for ~$240.

How much was a mid-range i5 again, without buying any other components? ~$200-220?

Then you get into the FX6300, which will literally get you great performance for $100+.

That should seriously move some silicon.

@jed: my 9590 is not overclocked yet....I have literally done nothing to it at this point.
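The price stacking in the post above is simple addition; here it is as a sketch. All prices are the US figures quoted in the thread (circa 2014), treated as assumptions, and the i5 figure is just the midpoint of the "~$200-220" range mentioned:

```python
# Build-cost totals sketched from the prices quoted in the thread.
# All dollar figures are the thread's quoted US prices, treated as assumptions.

amd_8350_build = {
    "FX-8350": 160.00,
    "M5A97 R2.0": 90.00,
    "CM 212 EVO": 30.00,
}

amd_8320_build = {
    "FX-8320": 120.00,
    "M5A97 R2.0": 90.00,
    "CM 212 EVO": 30.00,
}

intel_i5_alone = 210.00  # midpoint of the "~$200-220" mid-range i5 quote

def total(build):
    """Sum the component prices of a parts list."""
    return sum(build.values())

print(f"FX-8350 CPU+MB+HSF:      ${total(amd_8350_build):.2f}")
print(f"FX-8320 CPU+MB+HSF:      ${total(amd_8320_build):.2f}")
print(f"Mid-range i5, CPU only:  ${intel_i5_alone:.2f}")
```

The FX-8320 stack totals $240, matching the "~$240" figure in the post.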
 

jdwii

Splendid


Well, I always use TechPowerUp; I find them the best since they use a LOT of games and average the results. Check this out, man:
http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/23.html
It uses around 250 watts, while the 770 uses 220 and the 760 uses 167 watts. This is a major issue for their APU series; it's not that big a deal for dGPUs, though (although it turned me away).
 