AMD CPU speculation... and expert conjecture


8350rocks

Distinguished
What BS are you posting now? Seriously...did you even look at the graph?

The Vertex and the Vertex 2 are nearly the same performance...there is slightly higher throughput on the Vertex 2, but it's nothing to write home about...at all.

When you can read a benchmark, come back. The write speed on the Vertex 2 is much improved, but that has nothing to do with reading data.
 




Let's throw the BS out and the AMD Steamroller speculation in. http://www.techpowerup.com/186793/engineering-sample-of-amd-steamroller-based-apu-spotted.html
 

8350rocks

Distinguished
A system making database calls has NOTHING to do with write speed...you probably have no idea how a database works. Will it affect the benchmarks some? Maybe...but not that much. Write speed deals with putting information on the disk, not recalling the information and doing calculations.
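For anyone curious, this is easy to see for yourself. Below is a rough sketch (using Python's built-in sqlite3 and made-up table sizes, not anything from an actual benchmark suite): the only write happens while setting the table up, and the timed loop is pure reads, so the drive's write speed never enters into it.

```python
# Rough sketch: a read-heavy "database call" benchmark. The timed loop only
# runs SELECTs, so write speed matters only for the one-time setup.
import sqlite3
import time

conn = sqlite3.connect("bench.db")   # hypothetical on-disk database file
cur = conn.cursor()
cur.execute("DROP TABLE IF EXISTS t")
cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val REAL)")
cur.executemany("INSERT INTO t (val) VALUES (?)", [(i * 0.5,) for i in range(100_000)])
conn.commit()                        # the only write happens during setup

start = time.perf_counter()
for _ in range(1_000):               # the part a CPU benchmark actually measures
    cur.execute("SELECT AVG(val) FROM t WHERE id % 7 = 0")
    cur.fetchone()
elapsed = time.perf_counter() - start
print(f"1000 read-only queries took {elapsed:.2f} s")
conn.close()
```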


 


Hafijur, you have made far more of a fool out of yourself. Go away, seriously; we are tired of dealing with people like you who ruin the purpose of our thread. The Vertex 2 should not really cause a massive performance boost from its write speeds; as 8350 pointed out, that is irrelevant here. If the mods are going to do anything, it will likely be to remove your posts. Read my rant and cowboy's rant and listen to the answers. Stop posting nonsense and understand the purpose of the thread.
 

8350rocks

Distinguished
THEN GO FIND AN ATOM BENCHMARK FOR POSTGRESQL THAT "BLOWS THE DOORS" OFF THE 8350!! DON'T COME BACK UNTIL YOU FIND IT.

EDIT: Let's take a look at the rules for this discussion, shall we?

We have had several requests for a sticky on AMD's yet to be released Steamroller architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Our previous sticky concerning Piledriver can be found here:

NOW PLEASE, GO AWAY!!! THIS ISN'T A HASFAIL DISCUSSION THREAD!!!
 


Unless you need to write the data to disk because some other thread needs access to RAM. So paging could be coming into play.
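Fair point. If you want to know whether paging actually kicked in during a run, here's a rough sketch assuming a Linux box: the kernel's swap counters in /proc/vmstat tell you whether pages really went to disk while the workload ran.

```python
# Rough sketch (assumes Linux): read the kernel's swap-in/swap-out counters
# before and after a workload to see whether paging actually hit the disk
# while the benchmark ran. The workload itself is left as a placeholder.
def swap_counters():
    counters = {"pswpin": 0, "pswpout": 0}
    with open("/proc/vmstat") as f:
        for line in f:
            key, value = line.split()
            if key in counters:
                counters[key] = int(value)
    return counters

before = swap_counters()
# ... run the database workload / benchmark here ...
after = swap_counters()

print("pages swapped in: ", after["pswpin"] - before["pswpin"])
print("pages swapped out:", after["pswpout"] - before["pswpout"])
```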
 

jdwii

Splendid
Hafijur is lucky I'm not a mod.

He keeps saying things that make no sense at all, and all he does is throw around nonsense about power consumption, when it would take 10 years to even make up the price difference between an i5 and an 8350; by that time most people would have already changed their board and CPU three times, at least the people who buy that class of CPU.

It's not that I'm in denial either; I know Intel is around 30% faster in performance per clock and that they probably have 15-30% better performance per watt when the CPU is not at idle. On laptops, APUs barely consume any power; my laptop sits at 15-20 watts most of the time (30-40 when gaming), and that is a Llano A8, and I'm sure Richland is better. Desktop gamers aren't too crazy about performance per watt; that is mainly a concern for the portable market.
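The payback math is easy to redo with your own numbers. Here's a back-of-the-envelope sketch; every input in it (the wattage gap, hours at load, electricity price, CPU price gap) is an assumption to swap for your own, not a measured figure.

```python
# Back-of-the-envelope sketch of the payback math. Every input below is an
# assumption to replace with your own numbers, not a measured figure.
extra_watts_under_load = 60     # assumed load-power gap between the two CPUs, in watts
load_hours_per_day = 1          # assumed hours of heavy load per day (the idle gap is tiny)
price_per_kwh = 0.12            # assumed electricity price in $/kWh
cpu_price_gap = 30              # assumed price difference between the CPUs in $

extra_kwh_per_year = extra_watts_under_load / 1000 * load_hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
years_to_break_even = cpu_price_gap / extra_cost_per_year

print(f"Extra running cost per year: ${extra_cost_per_year:.2f}")
print(f"Years to make up a ${cpu_price_gap} price gap: {years_to_break_even:.1f}")
```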
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


you should take your own advice.
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810
Ok, seriously this is getting out of hand. Benchmarks are basically little better than an expert's opinion in the first place. They can be influenced by so many variables that it's not even funny. I'm sure that, given enough time tinkering with a benchmark test to totally screw over product A and enhance the results for product B, I could produce a benchmark showing my smartphone "destroying" my Phenom II 965 BE. Does that mean that my smartphone is more powerful? Of course not. Anyone else like history? Interesting little tidbit: before WWII, France was "benchmarked" as having the world's most modern, best-equipped and best-trained army. Interesting that Germany didn't look at the "benchmark" and instead beat the living daylights out of the British and French. Good thing for the UK that Japan made the mistake of attacking us, or the UK and France would be speaking German right now.

A little off topic, but it proves my point. Synthetic benchmarks are unreliable at best. I personally like to use benchmarks of actual high-end programs, i.e. video games. The Crysis 3 benchmark shows the FX 8350 isn't too far behind the best Intel has. I use Crysis 3 as I believe it is the first of an entire generation and will be the kind of game we have in the future. In fact, when playing the vast majority of games side by side you can't tell which system is the AMD and which is the Intel (FX 8350 vs i7), even with Intel's favorite game, Skyrim. I have seen those tests done before, and both systems run the games at the same levels; any advantage to one or the other is negated by the fact that the human eye is imperfect and can't process the difference in FPS. That is something that Intel fans like to totally ignore, though. Unless you are programming on your computer or doing some other extreme workload (one that really requires the extra 20-30% performance per clock), there really is no reason to pick the vastly more expensive Intel over AMD for the common household computer that is used to game on. That is where different experts say we have reached the "good enough" threshold. Anything more than "good enough" is a waste if you aren't using it for a profession that requires it.
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810
In the words of Ace Ventura: REEEEAAAEEAAALLLYYY!! A dual core with a mid-range GPU can play Crysis 3 on Ultra settings in the same league as an FX 8350 with a high-end GPU? Man, you are a little troll haha!! Let's define "fine": if it can't be played on the highest settings possible then it's not "fine".

As far as Skyrim goes, you are seriously trolling. I only have a Phenom II 965 BE and Skyrim is one of my favorite games. I have many mods I wrote myself as well as about 50 mods from Skyrim Nexus, all running on Ultra settings. I have no problems at all, and using FRAPS the lowest FPS I've seen is 58 (when loading an area of the civil war using the civil war mod, which loads like 50-100 NPCs in a large battle at random points). My regular computer monitor is a 22" LED at 60 Hz, so I have a 60 FPS limit on my system. I'll have to hook it back up to my 55" 1080p LCD again and crank it up to see what max FPS I'm getting there. For a four-year-old "inferior" AMD processor (with a high-end AMD GPU) that's not bad!! I know the FX 8350 is vastly superior to my Phenom II 965 BE, so yeah, even with Skyrim (if you have a high-end GPU) you're not going to "see" the difference unless you're running FRAPS and looking at the numbers.
 


Uhm... actually it was Russia that made the Germans retreat. The US went in after Russia had it cooked, but was fast enough to get there before the Germans started speaking Russian. Hence "The Wall". After WWII Russia took half of Germany, and the Allied forces let Germany rebuild itself on the other half.

Anyway, can I haz ignore button? 8)

Cheers!
 

jdwii

Splendid



The more you go on, the more you sound irrational. 1. AMD APUs are amazing on laptops in the $350-650 market. My friend at college bought an Ivy Bridge i5 laptop and it games OK, but it skips way more than my laptop does, and for the same price he could've gotten an A10 Trinity at the time; he regrets it every time he plays a game (lots of college kids play games on laptops and need one for $600 or less). Also, I'd like you to know that it's still not that big of a difference in price per year: it would take 10 years to make up a $30 difference between an i5 and an FX-8350, and most of the time desktops and laptops idle. So for the last time, 90% of people couldn't care less about power consumption on a desktop as long as it's not too bad (140+ watt CPUs).
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


Please keep in mind, even though these consoles have "all this power," which is roughly like an underclocked i5-3570K @ 2.8 GHz and a GTX 650 Ti, Battlefield 3 is supposed to run @ 60 FPS on PS3. Keep in mind:

- This game uses a new engine, and even though it may be 'optimized' it would still need serious power, especially considering Frostbite 3's completely new destruction mechanism. That being said, for an i5 @ 2.8 and a GTX 650 Ti, the game would need some nerfing in order to run that fast. It's similar to how BF3 on PC looks like the PS3 version if you put the settings on low but keep HBAO on and run at 720p, while running much, much faster. Also consider resolution: Battlefield 4 on consoles is still going to run at 720p. For a game like that to run at 60 FPS, 1080p, with amazing quality on the PS4's hardware is just a fantasy.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
I keep checking availability on this Kabini board, but it doesn't look like they're shipping yet.

KBN-I/5200



http://www.ecs.com.tw/ECSWebSite/Product/Product_Detail.aspx?DetailID=1497&CategoryID=1&DetailName=Feature&MenuID=106&LanID=0

 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


So it was Russia that supplied all the materials of war the UK needed to fight the Battle of Britain? And it was Russia that launched the D-Day invasion and liberated France? Interesting history books over there!! Haha!!

Anyway, I was just using France's over-hyped "benchmark" as an example of how far off they can be.
 






Gah, think! To the secret, improvised ignore button!

 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


Huh, lol, I can never catch a break with you. They are science experiments in the limelight; treat them like they should be treated: keep as many variables the same as possible. Even with your "percentages", that's still the problem at hand. And I forgot to mention, why not use AMD and Nvidia drivers on, say, a 7970 GHz and a GTX 680 / 770? Then it would be even more fair. In reality, each test is scored differently, so your percentages would have to be weighted according to the value of each test. So the easiest way to do that is to count, per test, who won. I believe there are 18 tests, and the Intel side won about 12 of them. Okay, now let's do this:

18 = 100%

12 / 18 = ?

The answer is 0.6666666666666667. Round that to 0.67, then move the decimal: 67%. I wasn't far off with my estimated percentage of 70%.

Now the 3970X won about 7 of Intel's 12, so now 12 = 100%

7 / 12 = ?

Answer: 0.5833333333333333; rounded to 0.58 = 58%.

For the total: 7 / 18 = 0.3888888888888889; 0.39 = 39%

AMD: 6/18 = 0.3333333333333333; 0.33 = 33%

Now, in conclusion:

Total:
Intel- 67%
AMD- 33%

Split Down:
i7- 39%
i3- 28%
FX- 33%

So the i3 lost by a little bit, but look. at. that. It seems that the 3970X won.

If you want to make a big deal about it and try to make AMD the top dog against anything, especially with that wimpy energy sucking 8350, then find some false information. Please stop posting nonsense.
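For what it's worth, the share arithmetic is trivial to reproduce. Here's a minimal sketch using the win counts quoted above (18 tests: 7 wins for the 3970X, 5 for the i3, 6 for the FX); the counts themselves come from the post, not from re-running anything.

```python
# Minimal sketch of the win-share arithmetic, using the win counts quoted in
# the post (18 tests: 7 wins for the 3970X, 5 for the i3, 6 for the FX).
wins = {"i7-3970X": 7, "i3": 5, "FX": 6}
total_tests = sum(wins.values())            # 18

for chip, won in wins.items():
    print(f"{chip}: {won}/{total_tests} = {won / total_tests:.0%}")

intel_wins = wins["i7-3970X"] + wins["i3"]  # 12
print(f"Intel overall: {intel_wins}/{total_tests} = {intel_wins / total_tests:.0%}")
print(f"3970X share of Intel wins: {wins['i7-3970X'] / intel_wins:.0%}")
```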
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


So talking about an Intel CPU (COMPUTERS) is off topic, but a UK war isn't? Give me a break. Seriously.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
I am sorry to inform you that there is no news about SR besides the last link I gave about the ES Kaveri benchmark.

But I would like to reconsider SR performance. I assumed that SR would be about 30% faster than PD. In that case my estimate was that a 4C SR would be about as fast as a 4C IB.

I have found that AMD claims that a SR core will be 2x faster than a JG (Jaguar) core. Previously in this same thread I gave some benchmarks with a 4x1 JG being about as fast as a 2x2 SB. Assuming everything scales up fine:

8x1 JG ~ 4x2 SB

Taking AMD's claim 1 SR = 2 JG, we obtain

4x1 SR ~ 4x2 SB

I.e. a 4C SR will be about as fast as an i7 SB (CPU only).

The previous estimation using another route was

4x1 SR ~ 4 IB

I.e. a 4C SR will be about as fast as an i5 IB (CPU only).

I think both estimations agree rather well. A 4C Steamroller CPU would be close in performance to an i7-2600k or to an i5-3570k (approx.)

What do you think?
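Just to make the chain of assumptions explicit, the same estimate can be written as a few lines of arithmetic. Everything in the sketch below is an assumption carried over from the post (the 2x-Jaguar claim and the 4x1 JG ~ 2x2 SB benchmark equivalence), not a measurement.

```python
# Sketch of the scaling argument above, in arbitrary "relative performance"
# units. Every input is one of the post's assumptions, not a measurement.
jaguar_core = 1.0                  # take one Jaguar (JG) core as the unit
sr_core = 2.0 * jaguar_core        # AMD's claim: 1 SR core ~ 2 JG cores

# Benchmark assumption from earlier in the thread: 4x1 JG ~ 2x2 SB,
# so scaling linearly, 8x1 JG ~ 4x2 SB.
sb_4c8t = 8.0 * jaguar_core

sr_4c = 4 * sr_core                # a 4-core Steamroller CPU
print(f"4C SR vs 4C/8T SB (CPU only): {sr_4c / sb_4c8t:.2f}x")  # 1.00x under these assumptions
```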
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810
"So talking about an Intel CPU (COMPUTERS) is off topic, but a UK war isn't? Give me a break. Seriously."

I was simply making an analogy about how off "benchmark" scores can be in the real world, outside of a lab and synthetic tests. I actually didn't mean to stir or rile anyone up. I find it funny, though, how little credit the US is given for its role in WWII, but that is not the topic, so unlike the Intel fan club here I will drop it.

Now, getting back to Steamroller: I would love to see a 30% improvement over PD, but realistically what does everyone else think it will be? Seeing as how AMD and Intel both over-estimate the improvements they will make, I'm thinking it will probably be a 20-25% improvement over PD. Using that as a figure, it probably won't edge out the i7-4770K in performance, but it should come somewhat close. When considering price/performance it will look even better.
 


Speaking as a moderator, the thing you don't see is that it would be about 200 pages full of garbage if we didn't step in and delete several pages at a time. Keeping something on topic is like herding cats. Or in my line of work, like trying to get Baby Boomer doctors to touch type in the electronic medical records (hunt and peck with two index fingers is considered "excellent" for them) or the docs' patients to quit bugging them for narcotics. Try as you might, it's NEVER EVER EVER going to happen. You just replace the stock $2 Dell/HP membrane keyboard for Ol' Doc with a 25 year old IBM Model M that his jackhammering with the right 2nd digit won't faze and let the Haswell/Hasbeen/Hasfail/Faildozer/etc. talk slide until you can get around to pruning it out.



I used to have both of those systems, or close enough. I had a cast-off 1993 Mac LC2 with the highly crippled 16-bit bus on a 32 bit 33 MHz 68LC030 and a "smack it to fix the short" rainbow Apple logo CRT monitor. I played SimCity 1.4 and 2000 and the original Civilization on that sucker and enjoyed it more than any modern game. It was a real lucky day when you managed to hit the right boot key codes to disable enough stuff to wind up with the magical 2714K of free RAM to run SC2K. And then you better save every five minutes lest you freeze up. I had an IBM 486 too, but it was a 50 MHz 80486SX2 rather than the much faster 66 MHz 80486DX2. It ran OS/2 Warp 2.0 too so I had the extreme fun of running my Windows 3.x stuff through an early 1990s emulator. I learned really quick not to right click as it would freeze the system rock solid.

That old stuff sure was unreliable crap but it still seemed more fun than current gear. A 64-bit, quad CPU setup with tens of millions of KB of RAM like I have now would have cost a nice house "back in the day" and been as finicky as a Ferrari to keep running. My current setup is more like a two-ton truck. It's guaranteed to run even if you run it three quarts low on oil and have two flat tires.



Hey now, I have a 6234. It isn't hugely fast in single-threaded Windows benchmarks due to its 3.0 GHz all-core Turbo speed, but it will certainly demolish any dual-core laptop CPU in anything remotely multi-threaded, especially on an OS that isn't a 64-bit version of a 32-bit shell for a 16-bit OS originally written for an 8-bit processor on a 4-bit bus by a two-bit company that can't stand one bit of competition. You don't buy a G34 system to run single-threaded stuff. You buy an FX-9590 if you want to do that.



Nah, don't shut it down, upgrade from 120-volt "dinkyplug" to proper 240 volt circuits. You get twice the wattage on the same wires and also get access to much higher amperage draws as well. 120 volts pretty well tops out at 2.4 kW (20 amps) while you can run hundreds of amps on a 240 volt branch. I regularly steal the 30 amp/240 volt dryer outlet to run the 3 hp cabinet saw in my small excuse for a basement woodshop. I simply have a homemade 12-gauge extension cord with a NEMA 14-30 plug on one end and a 6-20 receptacle on the other end. It works great and my 3 hp saw can cut anything I throw at it, unlike a dinkyplug 1.5 hp unit which bogs down on anything thicker than 4/4 with a proper full-kerf blade. You can do the same with computers especially since modern PSUs can run on either 120 V or 240V. However, if you really want to play with the big boys, they are currently using 4160 volt medium voltage setups to their racks like one of the places I work. There's nothing like seeing "DANGER 4160 VOLTS" on an incoming conduit to get your attention in an indoor electrical room.
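The headroom claim is just P = V × I. Here's a tiny sketch with the circuit ratings mentioned above; keep in mind real installs also derate continuous loads.

```python
# Tiny sketch: available wattage on a branch circuit is just volts times amps.
# The ratings are the ones mentioned above (a 20 A / 120 V outlet vs a
# 30 A / 240 V dryer circuit); real installs also derate continuous loads.
def circuit_watts(volts, amps):
    return volts * amps

print(f"120 V / 20 A: {circuit_watts(120, 20):,} W")   # 2,400 W
print(f"240 V / 30 A: {circuit_watts(240, 30):,} W")   # 7,200 W
```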



Okay I am a truck guy so I will go way off topic. Your airbagged and chipped F-350 isn't an F-550 even if you put on stacks and a smoke switch to roll coal at passing Priuses. (Which is very fun by the way.) The F-350 has a frame rated for 14k GVWR max (most are only 9550-11000 lbs) with a Dana 60 monobeam up front and a Sterling 10.5 out back with a maximum of 4.11 gears, connected to 17" rims with LT tires. The F-550 has a frame rated for 19.5k GVWR, a Dana Super 60 front and a Dana S110 rear with 4.88 gears connected to 19.5" rims wearing medium duty truck tires. It's like a single processor desktop versus my quad CPU server in server tasks- it just doesn't stand up. However your F-350 will be much friendlier with the DOT fellows as its lower GVWR will allow you to pull a higher GVWR trailer and fly under the magical 26k GCWR CDL limit. Unless of course you have a Class A CDL, then the mere mortals rules do not apply.
 

Phew, a mod came, we set up a secret thread as a last resort. Loving your rants.
 