AMD CPU speculation... and expert conjecture

Page 206
Status
Not open for further replies.

8350rocks

Distinguished


You left off the caveat...I fixed it :)
 

GOM3RPLY3R

Honorable
Mar 16, 2013
I need not say it again and again: Intel is out to make their CPUs run better and more efficiently.

Yes, I know about Haswell ("Hasfail," as most call it), and I agree that it is kind of a step in the wrong direction. However, when is a new product ever released that doesn't require tweaking and tuning to get it right? If everything worked the first time, our economy wouldn't be in the shit hole, and there would be more world peace.

Along with this, yes, the APU is fantastic since it is "all this power for a low price," I agree. However, people need to realize: what is its power? Equivalent to roughly an FX-8350 underclocked to 3.5 GHz paired with an AMD HD 6750? Yeah, it will be good on price, even with the newcomers; similar to how duncan3303 (Austin) on YouTube made a $350 rig with the A10-5800K (which is like the 6800K but with HD 6570-level graphics). However, he did mention it will probably run Battlefield 3 on Medium settings (no AA), low HBAO, all at ~30 FPS.

Now take this into consideration: would you rather buy an FX-8350 with a Radeon HD 6750, or an A10-6800K? Obviously the APU, since it is MUCH cheaper. However, the power you have in that PC will probably last you no more than a year if you want to run games close to High settings.

This is where the gimmick is in place. People fail to realize the obvious.

And 8350rocks, that drug I'm taking? It's called knowledge; you should definitely not take that away from your kids.

EDIT: Also, as a last mention: I know it's a bit pricey, but who has the strongest CPU? Yeah, it's called the Intel Extreme series. I know it costs more money, but if you have the money, it's totally worth it.
 

hcl123

Honorable
Mar 18, 2013


There is no real-world ARMv8 64-bit chip yet, except Applied Micro's for micro-servers... and there, tests like the ones with IBM are for customers' "internal" consumption.

But I believe the ARM architectural licensees for 64-bit can do much better than ARM's simulations indicate, and this includes the server guys: AMD, Qualcomm, Nvidia, and Samsung (I think)... and perhaps Apple will join before much longer (depending on how many bags of money Intel puts on the table, lol).

I think it's inevitable that ARM will spill in force into the mobile world, including notebooks, but in this market sector benchmarks make it quite hard to conclude anything; there is too much difference in platforms and subsystems. So "real world" benchmarks of ARM 64-bit will be quite hard to extrapolate... the ISA is different, the software is completely different (a relaxed memory model)... and it is the software that "commands performance," NOT the hardware (for the hardware it would be enough to know the GIPS and GFLOPS potential and the clock); *believing* otherwise (it is a belief) is an abhorrent state of denial.

If you want REAL high performance, demand REAL high-performance software: multithreaded, vectorized, and so on. The hardware guys just can't do miracles.

An old example with the SAME EXACT HARDWARE and a 1500% gain purely from software-side tweaking of the SAME APPLICATION:

http://www.edn.com/design/systems-design/4314689/Autovectorization-for-GCC-compiler
EEMBC publishes two types of scores: out of the box and "full fury," or optimized. The organization obtains out-of-the-box scores by compiling unchanged source code, and it obtains full-fury scores by changing the source code to improve performance while still following the EEMBC rules. In most cases, the changes engineers make to the code enable use of vector/SIMD instructions that compilers were unable to use automatically. The full-fury scores show an average improvement of more than 1500% over out-of-the-box scores for the same processor running at the same speed.

This is the HPC world in action... but it wouldn't have to be that much. NOT everything is highly parallelizable or vectorizable in the client/desktop world, but I think 100 to 200% is quite possible over the next couple of years with more-core chips (yet people seem content discussing 10 to 20% instead :??: )

So, the benchmark concern: if you control the code, you can present any results you like... a benchmark is mostly representative of its own SOFTWARE, and most of the time, as with synthetics, hardly representative of any "real world" software. It is what it is while running that software, but don't make *absolute* extrapolations from it, because with *other software* running on the same processor(s) at the same speed, the results can be diametrically opposed.

And for the Windows world the outlook is worse: as shown by the Win8 and 8.1 releases, the infrastructure is moving into the "cloud," making powerful workstations moot in the usefulness department, to say the least.
 

jdwii

Splendid


Go by stock clock, not the turbo; also go by the competition, such as the 8-core FX vs. the i5.
 

GOM3RPLY3R

Honorable
Mar 16, 2013


The i5 vs. the 8350 at stock clocks is roughly equal performance. Just goes to show: with Intel at half the cores and a lower clock, that is a LOT of per-core power. Imagine that in an 8-core i5 (hypothetical); it would destroy. And the problem with the price of some of the processors is Hyper-Threading. The i7-3770K is literally an i5-3570K with HT on it. If it were just an 8-core CPU, it would be ~$400-500, but you would have insane performance, kind of like a dual-socket LGA 1155 with two i5s, but in one CPU.
 

Intel God

Honorable
Jun 25, 2013


My 4770K uses 179 W at 4.8 GHz under Cinebench, so I'd fully expect an 8-core version to use double that at the same speed. It can't get here fast enough.
 

Intel God

Honorable
Jun 25, 2013


It's not even going to be close.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
That's the thing: the Centurion, the 5 GHz AMD, probably takes 500-600 W from the wall with motherboard etc. The 8-core Haswell will probably score 16 points in Cinebench 11.5 and draw electricity similar to an FX-4300 or FX-6300.

Aren't you tired of writing the same off-topic anti-AMD nonsense in all the forums (including outside of Tom's)?
 

hcl123

Honorable
Mar 18, 2013



In truth, go see past posts: around 170 W average (extremesystems)... no chip, LOL, no chip at all will "effectively" draw 600 W without setting on fire and starting to smoke... power is a function of temperature and mobo VRMs... (gee, why do I waste my time!?)

In fact, it is just a little above the 3970X.

A better comparison is IBM's z196 successor: it's ~600 mm², that is, the size of two 3970Xs... it runs at 5.5 GHz and consumes on average <250 W, 24/7, all year, for many years (it can't burn out or fail; it's mainframe stuff)... so it consumes about 80% of what those two Intel chips do (less power than two Intel Extremes), yet it performs perhaps about 5 times better than those two Extreme chips (about 10x better than one Extreme on Windows), comparing with Windows software.

Why? ... Because everything is fine-tuned at compile/install time (at a million dollars per top-config machine, it ought to be, lol).

And an example... that title is hilarious, "comes close to the 4770K," LOL... the propaganda never stops; it must be a great business to fool the ignorant... he must mean like this:

http://www.phoronix.com/scan.php?page=article&item=llvm_clang33_3way&num=4
 

montosaurous

Honorable
Aug 21, 2012




Yes, I'm tired of fanboys spewing hate and lies too, but this is essentially their battleground.
 

GOM3RPLY3R

Honorable
Mar 16, 2013


I'm sorry, and no offense, but I cannot take you seriously:

a.) Please type in a legible manner.
b.) Reread and refine what you are talking about.

Totally agree. If a CPU and board took 500-600 W out of the wall, expect to call the fire department. I could see something like 200 W, maybe 250 W at the absolute maximum. If you're going to be a fanboy, please learn your facts. So far you seem like one of those Xbox fanboys who say, "XBOX IZ BETTER!111!111!1!!1!!1" Then I say, "Well, then tell me why?" Answer: "Uhhh, BECAUSE IT ISSSSZZZ!11"

This is a serious thread; I hope I don't have to give you another example.
 

GOM3RPLY3R

Honorable
Mar 16, 2013


You seem to reckon a lot of things. I agree that the FX draws much more power than the i5 or i7, but please fix your statements.

This is full load at stock clock and voltage (4.0 GHz):

[chart: full-load power draw at stock clocks]

However, this is full load @ 5.0 GHz with a voltage of ~1.5, like you said:

[chart: full-load power draw at 5.0 GHz]

You are right that it's higher, but it wasn't that high. Just like I predicted: "200 maybe, 250 maximum."
 

juanrga

Distinguished
BANNED
Mar 19, 2013


Be specific/unambiguous.



And after being shown to be completely wrong about power consumption, you now return with price for another anti-AMD attack...
 

hcl123

Honorable
Mar 18, 2013


Sometimes I wonder!..

My point, capable of raising the hair on the back of every "Intel fanboy's" head, is that the BD/PD chips are already better than the Intel ones. (edit)

OK! They will shove a lot of charts in my face... I accept ALL of them (and do the same with all I show)... only I point out that it is a SOFTWARE effect; the link I posted showing an FX-8350 >600% better than a 3960X is a "software effect."

A clear example: 1500% better due to software alone (the hardware doesn't count here) (edit)!...
http://www.tomshardware.co.uk/forum/352312-28-steamroller-speculation-expert-conjecture/page-104#11103697

People never appreciate the difference; they are content discussing 20% in the "WINDOWS WORLD," never giving a second thought to why it is so. In the old days the "benchmark industry" was never as developed as it is today; it sufficed to know the clock and the MIPS of a chip to know its potential, because people knew that if the potentials were higher, you could compile, install, and tune the software and the performance would usually be better.

Perhaps it was because there wasn't the social-network infrastructure there is now; discussions were much more technical in nature. Today most people seem to not have a clue what they are talking about, propaganda is rampant, and it's all about the "commercial" aspect... they are simply being duped.

So many have shown charts where Intel chips perform much better; I have shown charts where AMD performs much better... where do we stand? Is it the hardware? NO!

Technically it is what it is; only the discussions are polarized around emotions, not reason. If you had a clue, you wouldn't post what you posted. Perhaps all you care about, like most, is "WHO WINS" at something, as if it were a soccer/football game, never giving a second thought to who plays better and is the better team, or whether the referee was biased and influenced the result... in soccer/football, even the best team, the one that plays better, can lose, or lose a game against a competitor yet still be the clear champion. Draw the parallel with "software effects."

Hope this helps. And yes, if you didn't understand: I posted that AMD is better than Intel... so what? "Clock" matters, the number of real "cores" matters, the scalability of the uarch design matters... does that make me a fanboy? ... Or is it because of the verbosity, or because nobody here has ever compiled a piece of software in their lives?

And no, I don't use Windows... and if I ever did, I don't remember. Don't bother showing me anything from that awkward world again; it's NOT for me... understand?

 

8350rocks

Distinguished




*ba dum tish*

:rofl:

You guys are great...the jokes keep coming!!!
 

8350rocks

Distinguished


200-220 W CPU + 350 W HD 7990 + everything else = ~600-700 W peak power draw. So yes, a 1000 W PSU would be good to keep the PSU from running at full load at any point... but then again, an Intel machine with an HD 7990 would likely have a 1000 W PSU as well.

I fail to see your point...

Many high end gaming machines have 1000-1200W PSUs, and those typically have CF/SLI HD 7970/GTX 680 setups...most of the power draw comes from the GPUs.

Look, just because you run around concerned about battery life in ultrabooks and laptops doesn't mean the rest of the world is... in fact, you might be the only person I have ever met who is so concerned with power draw. I would think you must live in the Amazon rainforest, 1000 miles from anything, with a gas-powered generator for electricity, the way you're concerned about it.

Move on already, please... no one cares about your power-consumption argument. Many have already said as much multiple times in this thread alone.
 

hcl123

Honorable
Mar 18, 2013


And you thought I'm "EMOTIONALLY" polarized against Intel and MSFT (edit)... According to what I've shown (1500% better, remember?), those "game console" guys are usually able to extract much more performance from their hardware than in the PC world, because they sponsor coding "their games" much closer to the metal; and it's not because it's AMD hardware: if it were Nvidia + Intel hardware, it would be the same thing. ("Software effect" again; software rules.)

So don't be surprised if some of those consoles, whose hardware many "IGNORANTS" parrot is mediocre compared with the best of the PC world, end up having games that perform better, in fluidity, image quality, and FPS, than a PC with a 670 or a 7970.

(C'mon, do you think MSFT and Sony have engineers so stupid that they chose mid- to entry-level hardware, compared to the PC world, only because they are seeking a debacle? ... LOL... but yes, perhaps someone, the engineers or most people, is being very stupid about all the verbosity that circulates about the next crop of consoles!..) (edit)
 

8350rocks

Distinguished


Wow...I am somewhat amazed...an objective post. I didn't think you had it in you...

Now, I may have to readjust your position on my list, you come down one notch, you are now 1 step closer to the "light side of the force" than hafijur.
 

hcl123

Honorable
Mar 18, 2013


You forgot double or triple monitors for some of the more adventurous... and even dual SLI or CrossFire setups are not that uncommon...

And some even have non-LED monitors in dual and triple setups... those big monitors at ultra-HD resolutions, in action games, consume much more power than any CPU... yet since they bought a "green" CPU (which, no matter its TDP rating, is already drawing 200 W in games at those high resolutions), they think they are "green," LOL... what a bunch of s***** gits!

I mention multi-monitor setups because their draw doesn't pass through the box PSU; the "mind conditioning," the "brainwash," is such that they can't see anything else... "my electric bill went through the roof; it must be the CPU," LOL... (best to avoid such discussions; they couldn't be resolved even with a baseball bat; it's a case for straitjackets, lol).

[The same people play all night long during winter with the electric heating pumping high and the sound going through a surround "home theater system" (I have a friend with that)... and in the end the culprit for the power waste is the CPU, LOL.] (edit)
 

blackkstar

Honorable
Sep 30, 2012
Intel fanboys are completely ignorant of the integral role instruction sets play in a CISC ISA. I have always wondered what would happen if I did an FX-8350 review where I used only FOSS software, went full AVX/SSE/BMI/etc. for the AMD chip, and left the Intel chip running x87.

It is probably the easiest way to attract the attention of Intel PIE and rabid fanboys who don't read reviews (let alone understand anything beyond a bar graph).

However, if you want my opinion on the whole 4M/8T Steamroller FX: they're going to come. AMD's big Opterons have been bleeding market share left and right; they simply can't compete with Intel there because PD is just so power-hungry compared to what Intel offers.

But take a look at typical AMD market share in the general market, and then compare it to the Steam hardware survey.

16.7% general AMD CPU marketshare:
http://beta.fool.com/leokornsun/2013/04/02/this-underdog-is-just-another-dog/28650/

26.7% gamer AMD CPU marketshare:
http://store.steampowered.com/hwsurvey/processormfg/

5.5% Opteron marketshare as of 2011:
http://www.eetimes.com/author.asp?section_id=40&doc_id=1286510

From a business perspective, I would think that the best things to do would be:

1. Stop competing with Intel directly in the server space. Compete by offering things Intel can't (microservers, HSA APUs, etc) and abandon the fight for a strong x86 CPU

2. Create a good gaming CPU that synergizes with 8 core Jaguar APU to continue the higher than average marketshare in gaming PCs

3. Make the new CPU compatible with AM3+ to get the budget gamers who want a CPU upgrade and went with 3M/6T CPUs, without buying a new motherboard (these guys are budget gamers; they don't have tons of money to drop on new hardware all the time)

4. Abandon raw x86 performance fight in mobile and aim for perf/watt and GPU performance.

1 is what stands out most regarding AMD's server roadmap. Instead of trying to turn server chips into desktop chips (as they normally have been doing), it would make sense for them to start with gaming chips, as that's probably their strongest x86 market right now, and then, if there's demand, turn those into server chips.

2 is important, as I don't see AMD squandering a chance to have software optimized for their CPUs after a lifetime of ICC crippling AMD via software.

The rest aren't as important, but I do see the first ones making sense. AMD actually grew a little in the Steam hardware survey. Even though the growth is small, PC gaming is the only market AMD is actually growing in right now.
 

8350rocks

Distinguished


:rofl:

Now you're trolling...the GTX Titan draws 350W...the 4770k draws 138W...that's 488W.

Even the 9590 + HD 7750 (which runs on PCIe slot power, no additional power connector) would draw about 300-350 W (what the Titan draws alone).

Go back to your gas generator in the rainforest, we don't care.
 

Intel God

Honorable
Jun 25, 2013


Do you just make numbers up as you go? :lol:

Titan - 250w
 

jdwii

Splendid


Don't show that. I love AMD video cards, and that makes me think AMD sucks in performance per watt. Luckily, all I really care about is price/performance.
 