AMD CPU speculation... and expert conjecture


blackkstar

Honorable
Sep 30, 2012


Nvidia has been leading the "GPU price creep" for a while.

It was the entire purpose of Titan: set an astronomical price on a single-GPU card, and then release the GTX 780 as an "LE" version of Titan at a price much higher than what high-end cards would normally cost.

It worked wonders, too. No one said, "Hey, the GTX 780 is $150 more than what the GTX 580 cost!" They said, "Oh, what a great deal: for $650 I get a Titan that costs $999!"

I suspect this is what AMD is doing with the FX 9590, and we will see AMD release desktop parts to fill in the $200 to $499 range, which will look like absolutely fantastic values compared to the FX 9590.

It is a marketing strategy that works fantastically well. Intel does it too: the 3930K would look pretty expensive, but when you compare it to the $999+ 3970X, which offers just a slightly higher clock speed and more cache, the 3930K suddenly looks like a great value.

It is evil, and it works so well on people that it isn't even funny.
 

juanrga

Distinguished
BANNED
Mar 19, 2013


I have actually discussed efficiency. Improved efficiency means you can obtain the same performance while consuming less power, or obtain more performance while consuming the same power.
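To put numbers on that definition, here is a trivial sketch (my own illustration, with made-up performance and wattage figures): both routes raise the same perf-per-watt ratio.

Code:
#include <cstdio>

// Toy numbers (purely illustrative): efficiency = performance / power.
// "Same performance at less power" and "more performance at the same
// power" both improve the same ratio.
int main() {
    double base = 100.0 / 125.0;  // baseline: 100 perf units at 125W = 0.80 perf/W
    double a    = 100.0 / 100.0;  // same performance, less power    = 1.00 perf/W
    double b    = 125.0 / 125.0;  // more performance, same power    = 1.00 perf/W
    printf("baseline %.2f, less-power %.2f, more-perf %.2f perf/W\n", base, a, b);
}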

 

lithium6

Honorable
Feb 5, 2013


The problem is that you bought an overkill PSU because you got it cheap. The Seasonic G360 is over 91% efficient at the wattage you're aiming for: http://www.silentpcreview.com/article1297-page3.html

By getting a big PSU cheap, you will waste about 4% more electricity; that's more than 8W!
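A rough sanity check on that figure (back-of-the-envelope, with assumed numbers: a ~200W DC load and roughly 87% efficiency for the oversized unit at that load, versus the G360's 91% from the review):

Code:
#include <cstdio>

// Wall draw = DC load / efficiency, so a ~4-point efficiency gap
// at a ~200W load costs roughly 10W at the wall. The load and the
// 87% figure are assumptions, not measurements.
int main() {
    double load_w      = 200.0;  // assumed DC load
    double eff_g360    = 0.91;   // Seasonic G360 near this load (per the review)
    double eff_oversz  = 0.87;   // assumed: a big PSU loafing at a low % of its rating

    double wall_g360   = load_w / eff_g360;    // ~219.8W at the wall
    double wall_oversz = load_w / eff_oversz;  // ~229.9W at the wall

    printf("extra wall draw: %.1f W\n", wall_oversz - wall_g360);  // ~10.1W
}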

Back to Steamroller: I seriously thought AMD was about to unveil an Athlon III FX for FM2+ when they hyped the 10-year anniversary of FX. All we got was a picture with two logos side by side. Thank you.
 




Windows does a LOT of stuff under the hood that 99% of people don't care about. But since MSFT aims for mass-market appeal, they include those features by default anyway. As a result, a barebones Windows install requires a lot more to run than a barebones Linux install.

As for OGL, remember Valve is basically using a D3D-to-OGL shim to get support up and running. As a result, I'd suspect a lot of games are going to lose some graphical features, simply because some things don't port over to OGL well this way.
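For anyone wondering what such a shim boils down to: it intercepts D3D-style calls and re-issues them as GL calls, and anything without a clean GL equivalent is where features get dropped or emulated. A toy sketch (the D3D-style enum is a simplified stand-in I made up, not Valve's actual layer; the GL_* values mirror the real OpenGL constants so this compiles standalone):

Code:
#include <cstdio>

// Toy D3D-to-OpenGL translation sketch (illustrative only; a real
// layer handles vastly more state than draw topology).
enum D3DPrim { D3D_TRIANGLELIST, D3D_TRIANGLESTRIP, D3D_LINELIST };
enum : unsigned { GL_LINES = 0x0001, GL_TRIANGLES = 0x0004, GL_TRIANGLE_STRIP = 0x0005 };

// Map a D3D-style topology to its GL draw mode.
unsigned toGLMode(D3DPrim p) {
    switch (p) {
        case D3D_TRIANGLELIST:  return GL_TRIANGLES;
        case D3D_TRIANGLESTRIP: return GL_TRIANGLE_STRIP;
        case D3D_LINELIST:      return GL_LINES;
    }
    return GL_TRIANGLES;
}

int main() {
    // In a real shim this would end in glDrawArrays(toGLMode(p), first, count).
    printf("D3D_TRIANGLESTRIP -> GL mode 0x%04X\n", toGLMode(D3D_TRIANGLESTRIP));
}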

BTW, OGL is a HORRID graphical API these days. Doesn't even reflect how the HW actually works anymore. The entire API needs to be replaced as far as I'm concerned. That's part of the reason no one targets it anymore.
 

blackkstar

Honorable
Sep 30, 2012


Valve already did; it's a feature of SteamOS, and it doesn't require proprietary hardware. It's also free as in beer and free as in freedom. Once again, Nvidia is going to end up bringing a proprietary, locked-down platform to compete against something free and open.

Look how well that worked out for PhysX.
 


Bullet: Free and open, almost never used
PhysX: Free and closed, moderately used
Havok: Licensed and closed, heavily used

So...what point were you trying to make again?
 

hcl123

Honorable
Mar 18, 2013


How can you prove anything when there isn't yet *any* ARM chip competing with the likes of the i7 and FX desktops?

**Clock** counts for a lot, and we don't know how the ARM uarch scales with clock.

**DRAM speed and bandwidth** also count, and we don't know how they affect the ARM uarch.

**Large, sophisticated cache systems**: we don't know how they will impact the ARM uarch (v8) either.

That ARM has potentially (**already proven**) the best perf/watt around is unquestionable, and that's without the sophisticated power management Intel and AMD have for desktop and beyond.

An ARM at >3GHz with Cyclos RCM (resonant clock mesh, as an example), turbo, digital power and temperature monitoring, extensive clock and power gating, a sophisticated power microcontroller with pertinent independent voltage regulators, on FD-SOI with a dynamic back-biasing voltage and frequency scaling (BB-DVFS) scheme, is what is needed to compare with the i7 on all metrics... a terrible beating is what I anticipate lol

x86 could counter with extensive speculation, hardware transactional memory... speculative multithreading (SpMT) (raising ILP with on-the-fly speculative threads, since the simpler forms of ILP are exhausted for x86), high degrees of memory parallelism and/or *data-flow*, resorting to extensive vector crunching... but this also means the current iterations of x86 are more than obsolete, as bad as the ARM high-performance SKUs that don't exist yet.


Yes... considering the ISA (instruction set architecture), where x86 can hold out is in sophisticated crunching (SpMT, HTM, etc.) for vector-like instructions, meaning x86 will be more about "128-bit or, better, 256-bit instructions" than anything else... on 64-bit integer, x86 has already lost... so please don't talk IPC, whose traditional definition is exactly about 32/64-bit integer...

 

8350rocks

Distinguished


Bullet is used as much as or more than PhysX.

Though, I will grant you Havok is used more than both of them...how that happens is honestly mind-boggling. I guess it's the brand recognition some developers look for...Havok offers less than either of the others...
 

Cazalan

Distinguished
Sep 4, 2011


Reading it was kinda sad. 10 years to go from 3.2GHz to 4.7GHz.

How many thought we'd be at 10+ GHz by now?
 

noob2222

Distinguished
Nov 19, 2007


Intel pushes Havok hard. I'm wondering if they included their "GenuineIntel" checks in the programming. It seems that since they took over, AMD CPUs suffer in titles that use the Havok engine.
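For reference, the "GenuineIntel" check being alleged is just the CPUID vendor string; a vendor-based dispatcher only has to read it once to pick a code path. A minimal sketch of the mechanism (GCC/Clang intrinsics; this only reads the string, and is in no way Havok's actual code):

Code:
#include <cpuid.h>
#include <cstdio>
#include <cstring>

// Read the CPUID vendor string (leaf 0). A vendor-based dispatcher
// would branch on this instead of on actual feature bits.
int main() {
    unsigned eax, ebx, ecx, edx;
    char vendor[13] = {0};
    if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
        // The 12-byte string is packed into EBX, EDX, ECX in that order.
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
    }
    printf("CPUID vendor: %s\n", vendor);  // "GenuineIntel" or "AuthenticAMD"
    if (strcmp(vendor, "GenuineIntel") != 0) {
        // A vendor-gated dispatcher would take the slow path here,
        // even if the CPU supports the same SSE/AVX features.
    }
    return 0;
}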
 

8350rocks

Distinguished


Honestly, that would not *at all* surprise me at this point...considering everything else they've done.

Why anyone would pay money to use an inferior physics engine, when a superior engine is available for free, still blows my mind.
 

8350rocks

Distinguished


+1

I have been trying to tell them ARM doesn't scale upward nearly as well as they think...it just wasn't sinking in, I guess. It scales downward ridiculously well; to scale it upward, however, you end up going to larger and larger clusters of ARM CPUs to get the performance. At that point the perf-per-watt advantage vanishes, and the hardware has to take on more and more of the added complexity that x86 has been integrating for 10 years now.

ARM is on the precipice of 64-bit processing, but it's nowhere near the complexity and capability of the x86-64 ISA at this point. That doesn't mean ARM isn't viable, by any stretch, for its intended use; it just doesn't make sense from a desktop OS perspective, or even from a high-performance server angle.

ARM will not defeat x86-64 anytime soon. AMD will get into that market to try to grab a toehold in ARM microservers because of the margins. Other than that, ARM will remain THE gold standard for mobile: tablets, phones, etc. It will not replace HEDT anytime soon either, as that segment of x86 desktops is clearly on the rise...by a fair margin, I might add.
 
hcl123 said:
How can you prove anything when there isn't yet *any* ARM chip competing with the likes of the i7 and FX desktops?

Historically, RISC does not scale as well per clock as x86, and RISC architectures (notably PPC and MIPS) have had issues getting clocks above 3GHz. Using those two arches as a baseline, you can guesstimate ARM's performance.

8350rocks said:
Bullet is used as much as or more than PhysX. Though, I will grant you Havok is used more than both of them...how that happens is honestly mind-boggling.

No, these days PhysX (including CPU-based PhysX) is used far, far more than Bullet in games. And Havok is used the most because it has the best toolchain of the bunch and is licensed by just about every game engine out there.
 

8350rocks

Distinguished


While the list of AAA titles might be shorter...games running Bullet physics are more common than you think. A lot of them are indie games using the open-source Blender Game Engine (which has Bullet physics integrated into it). So that doesn't detract from my point that Bullet is used in just as many games.

Additionally, Havok's tools are probably the easiest to use, I will give you that. Though, frankly, I think that speaks to the laziness of developers more than anything. Bullet doesn't need to be licensed to be used in a game engine.
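And to the licensing point: pulling Bullet into a project really is just linking the zlib-licensed library and stepping a world. A minimal sketch (standard Bullet C++ API; a dropped sphere, nothing engine-specific):

Code:
#include <btBulletDynamicsCommon.h>
#include <cstdio>

// Minimal Bullet setup: collision config, dispatcher, broadphase,
// solver, world. Drop a 1kg sphere from y=10 and step one second.
int main() {
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    btSphereShape sphere(0.5f);              // radius 0.5m
    btScalar mass = 1.0f;
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(mass, inertia);
    btDefaultMotionState state(btTransform(btQuaternion::getIdentity(),
                                           btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo info(mass, &state, &sphere, inertia);
    btRigidBody body(info);
    world.addRigidBody(&body);

    for (int i = 0; i < 60; ++i)             // 1s of simulation at 60Hz
        world.stepSimulation(1.0f / 60.0f);

    btVector3 pos = body.getCenterOfMassPosition();
    printf("sphere after 1s: y = %.2f\n", pos.y());  // roughly 5m, fallen ~4.9m

    world.removeRigidBody(&body);            // detach before teardown
    return 0;
}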
 
8350rocks said:
Additionally, Havok's tools are probably the easiest to use, I will give you that. Though, frankly, I think that speaks to the laziness of developers more than anything.

We're willing to spend more to shave months of overtime off our schedules, thank you very much. It's not like we, the people who make software, have any say over the timeline of the product release schedule...

Hence: Havok over PhysX and Bullet, DirectX over OpenGL, MSVC over GCC, and so on. We like making things REALLY simple: That's why we wrote software to write software for us for a chosen target platform (compilers).
 

8350rocks

Distinguished


I am not disputing ease of use being a factor; however, there is something to be said for being able to make the engine do specifically what you want it to as well.

DX over OGL, I totally empathize with...believe me, I have been complaining about OGL for some time. When we discussed the D3D "wrapper" being used to convert D3D calls to OGL, I was somewhat surprised someone had taken the time to do it, but it would make life much easier on that end, as the DX tools are clearly better set up.

As far as compilers go...GCC isn't really all that bad to use. MSVC is moderately easier, but it's not like running a compiler is rocket science anyway.

I think some of that can be lumped into "creature of habit" tendencies, and some of that is genuinely true of the quality of the tool interfaces and functionality.
 

The same applied somewhat to the 7970. inb4 GTX 880 Ultra. We can't really stop the "ERMEHGAWD ITS LIAK SHO MUCH VALUE FOR MERNEH COMPAHURRED TO THE TITAN AND HAS LIAK 13 PERCENT INCREASE OVER DEH 680 FOAR LIAK TWO HUNDRED BAWX MOAR" logic. Even the somewhat-expensive-back-then but sexy-looking GTX 480 curbstomps today's GTX 780 in value for money for their respective eras; don't even get me started on GTX 470 vs GTX 770. If only we could go back to 2010... oh 5870, how much I loved you...

Anyway, SteamOS will likely be nothing major for the PC gaming community, aside from perhaps a few Source games... There will still be a huge community of Windows Vista/7 gamers out there.

 

8350rocks

Distinguished


Yes, but we chip away at Winblows one piece at a time...one small crack expands over time until it becomes a larger problem.

Then, eventually, WE RULE THE WORLD!!! *cough* err...I mean...yeah. <.<...>.>...nobody heard that last bit right?
 

TIS UBUNTU TIAM!
 