AMD CPU speculation... and expert conjecture


colinp

Honorable
Man, I hope that 8370E isn't just a figment of WCCFTech's imagination. That R9 285 ITX card looks nice too; I'd be very happy if one of them came out with a negative-pressure (backward-venting, but not reference) cooler. Those would be my rig upgrades for the foreseeable future.
 

Yes, it will. But AMD has no other CPU option: the FX platform has no mini-ITX motherboards, and even if it did, only the more power-hungry FX-6300 and higher CPUs would be viable. Mantle and DX12 will help alleviate the CPU bottleneck in gaming.

Considering Intel, you'd have to spend over $180 to get a decent quad-core i5, or $120-160 for a Haswell Refresh Core i3 plus a Z97 chipset for overclocking. The Athlon X4 860K will likely cost under $100-110, offer two Steamroller modules, and be overclockable, with cheaper motherboard options available. Overall it will be cheap enough that, against the Intel alternatives, the Athlon should be a good option. The only Intel candidate under $100 is the Pentium G3258, but it only has two cores, so by going Intel you'd be giving up the non-gaming application performance advantage at that price range.

@juan: your "updated CPU/GPU/APU roadmap" looks kinda fake. Maybe it's real, I dunno, but it looks fake to me. Where did you get it?

 


In the ultra-low-power field? RISC and MIPS CPUs are king, because even ARM is too power hungry. Heck, there's a reason the Z80, 68k, and i386 are still used a lot in the embedded market: they're cheap, low power, and get the job done.
 

juanrga

Distinguished
Some more info from SIGGRAPH coverage of OpenGL-Next:


■ OpenGL-Next will break compatibility with existing OpenGL implementations.
■ The new API seeks to unify OpenGL and OpenGL ES.
■ There will be explicit control over GPU and CPU workloads, with the application/game expressing to the driver what it wants (see the sketch at the end of this post).
■ The new API will be high performance and predictable.
■ The new API will be multi-threading and multi-core friendly, with great reductions in CPU overhead.
■ There will be a common shading-language intermediate representation (IR) for greater reliability and portability.
■ The new API will be architecture-neutral, with full support for both tile-based and direct rendering.
■ There will be enhanced conformance testing.
■ This new initiative is "NOT" going to be a "multi-year, design-by-committee process."

Besides AMD, the organizations participating in the new OpenGL-Next initiative are Pixar, Qualcomm, Samsung, NVIDIA, Epic Games, Unity, Oculus VR, Apple, ARM, VALVE, Hi Corp, Intel, Imagination, Blizzard, Sony, Broadcom, MediaTek, Google, EA, RTT, TransGaming, Mobica, and Vivante.
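
Side note on the "explicit control" bullet above: to make the idea concrete, here is a minimal C++ sketch of the record-then-submit model such an API implies. Every type and function name below is invented for illustration; nothing of the actual OpenGL-Next API has been published yet.

```cpp
// Hypothetical sketch of "explicit control": the application records its own
// GPU work and hands it over at one well-defined point, instead of the driver
// guessing behind an implicit state machine. All names here are invented.
#include <cstdio>
#include <vector>

struct Command { const char* name; };            // one recorded GPU command

struct CommandBuffer {                            // application-owned work list
    std::vector<Command> commands;
    void record(const char* op) { commands.push_back({op}); }
};

struct Queue {                                    // explicit submission point
    void submit(const CommandBuffer& cb) {
        // The driver executes exactly what was recorded, in order, which is
        // what makes the cost of a frame predictable for the application.
        for (const Command& c : cb.commands)
            std::printf("executing: %s\n", c.name);
    }
};

int main() {
    CommandBuffer cb;                 // recording can happen on any CPU thread,
    cb.record("bind-pipeline");       // which is where the multi-core
    cb.record("draw-mesh");           // friendliness would come from
    Queue{}.submit(cb);               // one explicit handoff to the GPU
    return 0;
}
```

The contrast is with current OpenGL, where the driver has to infer batching and synchronization from a stream of state changes; moving those decisions into the application is where the promised CPU-overhead reduction would come from.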
 

juanrga

Distinguished


As mentioned above, their comparison to a Cortex-A5 was invalid because this core has to support lots of extras that were avoided in the Rocket design. A more accurate comparison would be a Cortex-M3 plus MMU and cache.

AMD will be aiming at the "ultra low-power client" segment (check the slide above) with Cortex-A57 cores. Those new cores have improved efficiency.
 

jdwii

Splendid
http://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient/3

"Companies that try to claim RISC still has enormous benefits over x86 at higher performance levels are explicitly ignoring the fact that RISC and CISC are terms that describe design strategies and that those strategies were formed in response to technological limitations of the day."
Juan lol
"The RISC vs. CISC argument should’ve passed into history a long time ago. It may still have some relevance in the microcontroller realm, but has nothing useful to contribute to the modern era. An x86 chip can be more power efficient than an ARM processor, or vice versa, but it’ll be the result of other factors — not whether it’s x86 or ARM."
http://media.giphy.com/media/vWDrezW0rMjmM/giphy.gif
http://a.disquscdn.com/get?url=http%3A%2F%2Freactiongifs.me%2Fwp-content%2Fuploads%2F2013%2F08%2Fbreaking-bad-i-won.gif&key=a45RKmSSvGnlvpfkFvALEA&w=800&h=224
 

jdwii

Splendid


Making it silly to pick one over an FX eight-core, or even a six-core.
 

jdwii

Splendid


Man, I agree. I can't wait to read the FX-8370 reviews, and they'd better not pick those old games; they should only use games based on newer engines. Like I said, I benchmarked the crap out of my CPU and compared it to the Haswell i5-4460, and a person would be silly to pick an i3 over this CPU. In all my next-gen games the FX-8350 beat that CPU or came out even; in older games it loses, but it's still at the good-enough level. It lost in chess, though, which made me a bit mad.
 
TSMC 16nm wafers coming in Q1 2015
http://www.fudzilla.com/home/item/35588-tsmc-16nm-wafers-coming-in-q1-2015
16nm, because 14 is considered unlucky.

AMD Announces Heterogeneous C++ AMP Language for Developers
http://www.techpowerup.com/204546/amd-announces-heterogeneous-c-amp-language-for-developers.html
Graphics Add-In Board Market Down in Q2, NVIDIA Holds Market Share Lead
http://www.techpowerup.com/204528/graphics-add-in-board-market-down-in-q2-nvidia-holds-market-share-lead.html
Also took the biggest hit quarter-over-quarter, due to not having new GPUs to sell.

AMD’s Faraway Islands is an interesting story
http://semiaccurate.com/2014/08/26/amds-faraway-islands-interesting-story/
 

8350rocks

Distinguished


LOL @ Faraway Islands...

Ha! Pirate Islands isn't even here yet and they're seriously trying to make something up to replace it? I doubt we see Pirate Islands very soon; they're just now rolling out more Volcanic Islands stuff to trickle down the ladder...
 

SemiAccurate sounds very butthurt that a simple trap like this works on them, because of how it undermines their business model. It's actually quite funny. Anyone with any knowledge would know that Pirate Islands' successor would not be in that time frame, and the name "Faraway Islands" doesn't even make any sense. Tech news has always been dodgy with all the rumors; just avoid sites like WCCFTech and everything is pretty good. Anything that gets published first on WCCFTech is probably wrong.
 

jdwii

Splendid
Wow, The Sims 4 has heavy requirements. I thought they were making it easier to run?
http://www.pcgamer.com/2014/08/26/the-sims-4-recommended-system-requirements-revealed-hope-you-have-a-core-i5/
 

UnrelatedTopic

Honorable


I think they're referring to laptop i5s, because it says "or Athlon X4".
 

blackkstar

Honorable


I am baffled that some of you are so surprised to see games starting to scale to 4 cores. All of the recommended CPUs are quad cores.

"we can't go to two cores! It's impossible to sync data!"
"we can't go to four cores! There's not enough tasks to put on 4 cores!"
"we can't go to eight cores! Everything that has made parallel is parallel!"

It's going to keep happening until someone comes up with a better way to get more performance than throwing more cores at the problem. And software developers will kick and scream the entire way.

EDIT: also, if EA is saying they're making the game more accessible to people while targeting quad core, they probably know something that we don't. Either that or they're just idiots. But if there's one thing EA knows how to do, it's make money.
 

Ags1

Honorable


Nice to see a 9590 result on my website! Are you changing your handle to 9590rocks? Anyway, I want to show off, so here is the 9590 stacked up against my awesome new Lenovo A10-5750 laptop:

http://www.headline-benchmark.com/results/d2e23042-10b6-47ed-94c3-a0fee6e31b15/c9538cdf-0f1f-4748-bf31-2779920abd0a

I gave up waiting for the steamroller laptops. They're selling off (dumping) the Richland models cheap, so maybe I should have waited another month. But then I'd have to pay launch prices for maybe 10% more CPU performance and flatlining graphics.
 

jdwii

Splendid


I'm quite happy it's using more cores, but older PCs are usually the market for these games; most users probably play on laptops. The Sims 3 was so unplayable that after a few days it just ruined the experience for me, so I hope this one uses all my hardware well instead of 20% of it.
 

jdwii

Splendid
I was going to upgrade my Llano to a Steamroller laptop, but I think I can hold off until AMD's next design, or AMD's last module design for laptops. Also, I tried running that benchmark; I have 8 and downloaded the latest Java, but it didn't work.
 

Cazalan

Distinguished


That was the point of the fake tags: to see how quickly the "news" would be "leaked".
 

juanrga

Distinguished


The whole story is about more than a simple leak; Charlie/SemiAccurate accuses them of false attribution as well.
 


CPU scaling does not imply increased performance, though. And requiring a quad core for a very easy-to-process title indicates the software is VERY inefficient. If we've gotten to the point where The Sims is computationally more crushing than just about everything else out there, we have significant problems to take care of.

Really, are we at the point where we're excusing poor coding and decreased performance because a game "scales to more cores"?
 

sapperastro

Honorable
I'm guessing The Sims 4 will head to the new consoles? If so, that could be the answer for the game's multithreading. Perhaps it needs to be set up specifically to operate on X number of cores, with different operations going to each of the 4 cores used by the game.

 


Which isn't how threading works, since you don't ever lock specific threads to specific cores; that's just asking for trouble.

I can see how you can make The Sims scale; you do have a lot of AIs you need to cycle through, and the majority are fully independent of the rest (see the sketch at the end of this post). It SHOULD scale well. That being said, the TOTAL WORKLOAD should still be trivial enough that a dual core can handle it; and according to the specs, it can't. THAT'S the issue I have with it.

That's the issue here: what is the new Sims doing under the hood that makes it so Duos will have issues with it? Because if graphics are responsible, then EA really doesn't get their target market here. If it's something else, then EA's design is flat-out WRONG, because there is nothing under the hood that should be significantly more stressing than the previous title was.

Seriously, a C2D should be way more than enough for titles like this; it's not that hard to handle CPU-side. [And yes, I've written AI simulation software before. For simple behaviors like you see in The Sims, it's not that hard to do.]
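
To illustrate the scaling point: here is a minimal C++11 sketch of the kind of parallelism I mean, where independent agent updates are split across worker threads. Note the threads are left for the OS scheduler to place; nothing is pinned to a core. The Agent type and its update logic are made up for the example, not anything from the actual game.

```cpp
// Sketch of Sims-style AI scaling: every agent updates independently, so a
// frame's worth of agents can be partitioned across worker threads with no
// locks. Threads are NOT pinned to cores; the OS scheduler places them.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Agent { float hunger = 0.0f; };            // stand-in for real AI state

void update(Agent& a) { a.hunger += 0.1f; }       // stand-in for real AI logic

void updateAll(std::vector<Agent>& agents, unsigned workers) {
    std::vector<std::thread> pool;
    const std::size_t chunk = (agents.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(begin + chunk, agents.size());
        pool.emplace_back([&agents, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                update(agents[i]);                // disjoint ranges: no sharing
        });
    }
    for (std::thread& t : pool) t.join();         // frame barrier
}

int main() {
    std::vector<Agent> agents(1000);
    unsigned hw = std::thread::hardware_concurrency();
    updateAll(agents, hw ? hw : 2);               // fall back if count unknown
    return 0;
}
```

And that's the kicker: even a few thousand agents like this is a trivial total workload for a dual core, which is exactly why the quad-core requirement looks off.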
 