AMD CPU speculation... and expert conjecture

Page 558
Status
Not open for further replies.

8350rocks

Distinguished
@juan: We will see who is right when the time comes, though I do not know whether AVX3 will be in or not.

@szatkus: JK says they will be better by Skylake. My source says even if they fall short of predictions they will be "on par".

As for limited resources, they had fewer when JK designed K7/K8 and HTX... so what?

You are all so caught up in the R&D budget that you are forgetting the PEOPLE. These are not your average engineers. Every project leader at team green right now is the equivalent of a Michael Jordan-caliber player at what they do. If the Bulls had paid MJ less money, would he have been any less proficient at playing basketball? No. I understand your concerns, but these guys are BRILLIANT! You can spend less on R&D when you know what works to begin with...
 

vmN

Honorable

Remember who is "lending" it out to AMD. I doubt Intel would allow them to be on the same stage with their SIMD cluster.
 

szatkus

Honorable


You're completely right, except for one thing: we don't know what Skylake's performance will be (and probably neither does JK). I'm 90% sure it will be Haswell-ish +7-12% or less, but there's a chance it will be a huge jump in performance. Declaring a win two years in advance isn't a good idea; you could break the hearts of some AMD fan(boi)s :)


Yep, like I said before, I don't see anything wrong with properly implemented CMT.
 

blackkstar

Honorable


I never thought I'd say this, but thank you. Every time I argue against low-end Intel, I show the Tech Report FX 8350 review, where the Pentium G gets pretty much the same average frame rate as everything else in BF3, yet does absolutely horribly in frame latency. And then all the pro-Intel guys who think every Intel CPU is amazing post average frame-rate numbers and go "see how wrong you are!?"
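The average-vs-latency distinction is easy to show with made-up numbers (hypothetical frame times, not taken from the review): two runs can post the identical average FPS while one of them hitches badly at the 99th percentile.

```python
# Hypothetical frame times (ms) for ~1 second of gameplay on two CPUs.
# Both average 20 ms/frame (50 FPS), but one stutters with 65 ms hitches.
smooth = [20.0] * 50
spiky = [15.0] * 45 + [65.0] * 5

def avg_fps(frame_times_ms):
    """Average frames per second over the capture."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_99(frame_times_ms):
    """Rough 99th-percentile frame time: the worst 1% boundary."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(0.99 * len(ordered))))
    return ordered[idx]

print(avg_fps(smooth), avg_fps(spiky))          # identical: 50.0 50.0
print(percentile_99(smooth), percentile_99(spiky))  # 20.0 vs 65.0
```

Same average chart bar, very different experience: the spiky run spends 5% of its frames at 65 ms, which is what frame-latency plots expose and average-FPS tables hide.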

I've also been looking forever for a review that shows Intel Hyper-Threading really messing up frame times in a game on a somewhat modern HT implementation (I do realize HT got better after Nehalem). It's Chinese and I can't find it anywhere.

It's sort of funny how those sorts of things never make it to larger websites, huh?





Skylake doesn't have that much room left to improve performance. And the big elephant in the room regarding Intel is that with every generation since 32nm, they lose maximum overclocking potential. Before 32nm, they were pretty consistent at 4GHz. But it isn't really lost overclocking potential so much as chips built on processes that simply don't clock as high.

Meaning that if this trend continues, Intel will keep losing clock speed, and eventually it could get bad enough that they have to lower stock clock speeds on high-end products.

People don't want to admit this. When Haswell and IB (both 22nm products) didn't clock so well, every Intel guy jumped up and blamed the TIM. A few delidded and had good results (but not all of them), so everyone just assumed it was only the TIM.

And then Intel released Devil's Canyon and everyone was disappointed.

Unless my memory is wrong, 32nm SB was good for 5GHz, 22nm IB was good for about 4.5GHz, and 22nm Haswell is good for about 4.5GHz.

Are my numbers about right? I want to do some math on them and make some guesses as to where 14nm clocks would end up but I want to make sure we're all in consensus here.

source for SB needed!
Haswell: http://www.overclock.net/t/1411077/haswell-overclocking-guide-with-statistics 4.53ghz average
IB: http://www.xbitlabs.com/articles/cpu/display/core-i7-3770k-i5-3570k_9.html

If Intel loses 10% of maximum clock speed potential with each die shrink, we are looking at 14nm Intel parts that don't make it past 4.05GHz on average. If AMD could keep clocks where they are now, they could end up with a 10% to 20% clock speed advantage over Intel.
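The extrapolation above is just compound decay. A quick sketch, assuming the quoted ~4.5GHz Haswell average and a flat 10% loss per shrink (both are the post's guesses, not measured data):

```python
def projected_clock(base_ghz, shrinks, loss_per_shrink=0.10):
    """Project the average max overclock after `shrinks` die shrinks,
    assuming a constant fractional clock loss per shrink."""
    return base_ghz * (1.0 - loss_per_shrink) ** shrinks

# 22nm Haswell averages ~4.5 GHz; one shrink to 14nm at -10%:
print(round(projected_clock(4.5, 1), 2))  # 4.05
```

That is where the 4.05GHz figure comes from: 4.5 x 0.9. Two shrinks from the 5GHz 32nm SB days would land at the same place (5.0 x 0.9 x 0.9 = 4.05), which is roughly the trend being described.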

I don't have high hopes for 14nm Intel as far as HEDT goes. I do have high hopes to see what Intel fanatics blame the poor clock scaling on this time. Perhaps we are going to go back to a time when Intel breaking 4ghz is a rare treat?
 

Cazalan

Distinguished


I wouldn't be surprised by a clock speed reduction. They have means to counteract that though like doubling the L1/L2 sizes (rumored). I still wouldn't expect legacy code to run much faster. You really need to take advantage of AVX2/FMA or AVX512 to see the massive performance gains. The same was true for Haswell.
 

jdwii

Splendid


How wasn't it, again? On the server end, the 16-core BD could barely beat Magny-Cours while using more power.
 

logainofhades

Titan
Moderator


Which is the whole problem with Faildozer. Clock for clock, it was worse than K10. To be honest, Piledriver wasn't exactly a big leap over K10 either; its ability to hit higher clocks, and more cores, is what let it push ahead. A 4.2GHz FX 4350 is less than 10% faster than a Phenom II X4 965 @ 4.0GHz in games, whereas a 3.9GHz FX 6350 is nearly 20% faster. Most likely, the gap between an FX 6350 and a Phenom II X6 1100T at similar clocks would be as small as it is between the FX 4350 and the PhII X4 965.
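The "clock for clock" point can be made explicit by normalizing the overall speedup by the clock ratio (illustrative numbers from the post, using the ~10% upper bound):

```python
def per_clock_gain(speedup, clock_new_ghz, clock_old_ghz):
    """Fraction of a speedup that remains after removing the raw
    clock-frequency advantage, i.e. the per-clock (IPC-ish) gain."""
    return (1.0 + speedup) / (clock_new_ghz / clock_old_ghz) - 1.0

# FX 4350 @ 4.2 GHz vs Phenom II X4 965 @ 4.0 GHz, ~10% faster overall:
gain = per_clock_gain(0.10, 4.2, 4.0)
print(f"{gain:.1%}")  # 4.8%
```

So even granting the full 10%, roughly half of it is explained by the 4.2 vs 4.0 clock advantage, leaving only a small per-clock gain over K10, which is the argument being made.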

Faildozer and its successors are AMD's P4. The Pentium III was superior to the Pentium 4 as well at similar clock speeds. Hell, even a 1.2GHz AMD Duron was faster than, I believe, a 1.6GHz P4. :lol: Intel learned from their mistake and made Core 2, which shared more with the P3 and its mobile successors than it did with the P4. It sounds like AMD is also learning from its mistakes and starting from scratch.

I honestly hope they pull off a winner of a chip like Core 2 was. I do not hate AMD, but I have been horribly disappointed in them lately. I want to see true competition again, like we had before Core 2 spanked K8. The K7/K8 and PIII/P4 days were exciting, competition-wise. AMD got cocky and was caught with its pants down when Core 2 came out, and has never really recovered. Intel was just as arrogant before the P4 got spanked by K8.
 

8350rocks

Distinguished
If you want to see competition, stop telling people to buy ridiculously overpriced Intel garbage paired with bargain-bin everything else, when they could have a nice AMD system with much better components for the same money... or the same bargain-bin parts for significantly less...
 

vmN

Honorable

You should not favor one company over the other. You choose the hardware that best suits your needs and your budget.

You do realize AMD would increase their pricing as well if they got more market share?
If you do not like the build he suggested, you are always more than welcome to suggest one yourself, with reasoning for why he should go with it.
 

logainofhades

Titan
Moderator
Overpriced garbage, really? :lol: I don't spend my money on inferior products when I have the option to buy better. I don't need fluff like a modular PSU, or an SSD, or even a fancy case. They are luxury items and nothing more. I care about performance for my dollar; none of those things will increase my FPS at all. I don't desire a feature-filled motherboard. All I care about is whether I am getting the performance I desire.

I would be happier with a 1230v3 and an ASRock H81M-DGS R2.0 than with an overpriced space heater of an FX 9xxx series, or even an FX 8350 with a 970A-UD3P. So would my power bill in the summer, trying to keep the room cool.

I don't hold allegiance to AMD or Intel. I go where the performance/$$ and longevity are. For me, that is an Intel CPU with an AMD GPU. Granted, I want to replace my HD 5850s with GTX 750 Tis, but that is only because their power consumption is so low while still being decently faster. If AMD responds with a similar card, I would be more than happy to go with one. Power consumption matters to me as well, at least for my file server. It has to stay Intel due to the RAID 5, for now anyway.

I have had my share of AMD rigs; I still own a couple now. But I really cannot recommend FX anymore, unless you happen to live near a Microcenter. Their combo deals are impossible to ignore.






 

vmN

Honorable


I could only imagine it having a smaller impact on some games' gameplay (we are excluding loading screens).
 

con635

Honorable


http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1158
No Intel bashing for this, though?

 

Here's the problem with parrots:
http://wccftech.com/amd-launches-locked-kaveri-a10-7800-apu-30w-tdp-7850k/
The A10-7850K and 7800 have a GCN 2.0 iGPU, apparently.

Anywho, I am interested in how AMD will design the new GPU and the asynchronous compute engines.
 

logainofhades

Titan
Moderator



I called Haswell "Hasfail" for months, as it wasn't a significant leap over Ivy; hence why I am still running a 3570K two years after I bought it. I don't recommend Ivy much anymore, unless it is one of those too-good-to-pass-up deals, or it is an upgrade from, say, an i3 Sandy or Ivy chip; I almost always recommend a 1230v2 to those folks. I have often argued against starting over with Haswell for those still on a Sandy or Ivy platform that can be upgraded to a 3770 or a 1230v2. Prices are close enough now that buying Sandy/Ivy makes little sense for a new rig, though. When Haswell was first released, the $40+ premium wasn't worth it, but motherboard supply and prices have stabilized since then and are similar to what an 1155 board would have cost you when 1150 came out. I am still wanting my Kabini HTPC, even if I don't really need it. :lol:
 

8350rocks

Distinguished


I would like to point out, I was talking about i3s and Pentiums... Intel's bargain-bin junk is just that: junk. The FX 6300 is vastly superior to any Intel CPU within $20 of it, one way or the other.
 

8350rocks

Distinguished


The R9 280 is coming with a new GPU codenamed Tonga.
 

logainofhades

Titan
Moderator



FX 6300 is not superior to an i3 4150 without an overclock, which kills the value. Not to mention the limited upgrade options.
 