AMD CPU speculation... and expert conjecture


sapperastro

Honorable
Jan 28, 2014
191
0
10,710
Another possibility is that the specs are just flat-out deceptive? I remember back in the day when certain games would ask for, say, a 1200MHz CPU minimum, and I could still get them running perfectly fine on an old 800MHz AMD (yes, I know those extra clocks are nothing in today's hardware, but back in the late 90s/early 2000s they were a big deal). Perhaps it is all hand-holding from the CPU makers to try and push sales?

It doesn't really make sense to me either, hence why I am throwing possibilities out there. I thought that, normally, sloppy/outdated coding didn't take advantage of extra cores? Hence why you used to see dual cores being less competent than single cores in games, duals better than quads, and so on.
 


The old E8600 vs Q9550 debate. And the key here was the E8600 was what, 800MHz faster with about the same IPC? Duos were simply clocked faster than quads back then, which more than offset the two extra, mostly unused cores. Now that duos and quads of the same arch can be clocked more or less the same, you don't see cases where duos are faster than quads, but you still see plenty of cases where they are 'as fast' as them.

Also remember that scaling does not automatically increase performance. Example: if you have a single-threaded task that takes 10% of the CPU time of a single core, and you re-code it to scale, you aren't going to see a performance boost. Why? Because the CPU isn't the bottleneck. That's why, even in some titles that scale well, Intel quads, and even duos, still win most of the time.
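To put a rough number on that (an illustrative calculation, not from any benchmark): Amdahl's law gives speedup = 1 / ((1 - P) + P/N), where P is the fraction of the work that actually got threaded and N is the core count. If the code you parallelized is only P = 10% of the frame time, then even with N = 8 cores the best case is 1 / (0.90 + 0.10/8) ≈ 1.10x, i.e. about 10% faster, and that is before paying any threading overhead.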
 

jed

Distinguished
May 21, 2004
314
0
18,780


This has been debated a thousand times over; the FX9590 is no better than an FX8350 running at the same clock speed, as seen here

http://www.headline-benchmark.com/results/d2e23042-10b6-47ed-94c3-a0fee6e31b15/9a411141-526d-4452-8e78-1c9891c2efa0

Both the FX 8350 and the 9590 perform close to the i7 2600K, as seen here.

http://www.headline-benchmark.com/results/c309c1f4-6f00-4def-8127-b8d679fe5989/9a411141-526d-4452-8e78-1c9891c2efa0

and here

http://www.headline-benchmark.com/results/c309c1f4-6f00-4def-8127-b8d679fe5989/d2e23042-10b6-47ed-94c3-a0fee6e31b15

8350rocks, you could have installed a custom water-cooled loop on your FX8350 and the result would be the same as the FX9590, period.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


I don't think I would ever call an EA-developed game properly coded. They are notorious for publishing games that shouldn't make it out of beta status and then never finishing them.

But my point is that the claim that "more cores is impossible" keeps getting blown out of the water. I am not here saying that it's a good thing that Sims 4 wants 4 cores; I'm saying that it happened at a time when people are still saying games won't scale to that many cores very well.

Scaling to all these cores is not an easy task, and I absolutely agree with you that there's a ton of problems with doing it. But it simply has to happen if games want to continue to get more advanced. Yeah, it might use 4 cores and it might not be all that much faster than if it used two cores to their full potential, but scaling to more cores is something that's being worked on at least.

Of course, I could be wrong and it might not even scale to 4 cores at all, and the required specs might just be written by someone who doesn't know what they are talking about. We will have to see when it launches. But a poorly performing engine that uses 4 threads has the potential for higher performance than one just sitting on a single core and praying that single-threaded x86 performance magically gets far better than Haswell's.
 

logainofhades

Titan
Moderator


Kinda like how they are recommending an i5 for Sims 4? :lol:
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
AMD's Richard Huddy responds to Mantle criticism:

Q1) Mantle only works with AMD hardware; as a result, what incentives are there for more game developers to adopt Mantle when Nvidia and Intel have a larger share of the PC graphics market? Evidence of this has been presented by Jon Peddie Research and Steam's hardware survey several times.

Right now Mantle only works with AMD hardware, yes, that’s true. But AMD has created what could become the foundation of a new Open Standard. That means that AMD is considering publishing an open SDK later this year, and at that time it would be up to NVIDIA and Intel (and anyone else who wants to consider this path) to decide whether they want to adopt it. If they do so, then they should be able to show performance wins like we have done – and that’s good for all PC gamers.

We already have somewhere in the region of 70 registered developers actively working with Mantle (7 with publicly announced or released projects), so it's pretty clear that Mantle is very attractive to the PC development community. That number of 70 is up very significantly from forty in May. And remember that Microsoft announced DirectX® 12 in March this year – so it's clear that developers see good reason to move to Mantle. Indeed our momentum with Mantle is only increasing – and the simple reason is that it helps solve developers' problems. Developers want to unlock the potential of the hardware – and Mantle lets them do exactly that.

The publicly announced titles for 2014 include: Battlefield 4, Thief, Plants vs. Zombies: Garden Warfare, Dragon Age: Inquisition, Civilization: Beyond Earth, and Battlefield Hardline. Crytek has also announced that they are including support in CryEngine, and Oxide Games said they're doing the same with their Nitrous Engine.

Q2) Mantle can act as a stepping stone from DirectX 11 to DirectX 12, as AMD's whitepaper explains, but what incentives are there for game developers to use Mantle as a stepping stone when they can just skip it altogether and go straight to DirectX 12?

I guess the incentives for jumping to Mantle come in several guises…

(1) It’s a handy stepping stone to DirectX 12.

(2) It’s possible to address the many millions of gamers using AMD hardware right now, rather than waiting for a new version of DirectX which is not scheduled to ship until the end of 2015.

(3) Any extra features in AMD hardware now, or in the future, will be accessible through Mantle now or in a future version.

Q3) “Mantle will lose out to DirectX 12 simply because Intel, Nvidia and other game developers have more reason to trust Microsoft than AMD”. What is your response to that?

Well, I guess you must be underestimating how much trust AMD has right now. The large number of active developers is a clear indication that games developers see Mantle as a great solution to some of their important problems. You might want to talk to some of the publicly disclosed advocates of Mantle. They can explain their position themselves, and it’s clear that there is a great deal of passionate support for Mantle.

We can see exactly that in Oxide Games' blog about the next generation of APIs, where Dan Baker answers the question (http://www.oxidegames.com/2014/05/21/next-generation-graphics-apis/): "Does D3D12 mitigate the need for Mantle? Not at all. Though it is difficult to determine without more complete information on D3D12, our expectation is that Mantle will have the edge over D3D12 in terms of CPU performance. This is because the GCN architecture by AMD is the most general of all the current GPUs, and therefore the driver needs to do a bit less."

We have also seen similar from Kevin Floyer-Lea from Rebellion in his interview with dsogaming.com – “The bottom line is that we’ll support and use whichever APIs give our players the best performance. D3D12 is definitely a step in the right direction in that regard. As it currently stands there is no reason to see why D3D12 and Mantle can’t co-exist – especially if it turns out that D3D12 is limited to newer versions of Windows. If nothing else Mantle is establishing the importance of low-level, minimal APIs, for which we’re very thankful.”

Q4) “With Mantle AMD undercut Microsoft, AMD just wanted to be the first to produce a low-overhead API”. How far would you agree with this?

I think I'd put this very differently. AMD didn't "undercut" Microsoft; instead, AMD led the way in bringing low-level APIs into the 21st century.

Q5) “Mantle cannot succeed overall without getting traction in the console market”. Is this true?

We have already validated the initial success and future outlook for Mantle with support from Crytek, Thief, Battlefield 4, and the roughly 70 active developers now supporting Mantle. I'll give you two more numbers to demonstrate the success of Mantle.

Number one – we have, as I mentioned, somewhere in the region of seventy active registered developers. I think that's clear proof of traction if you want it.

Number two – we expect to have more games published in Mantle’s first year than there were games published using DirectX 11 in its first year. That’s amazing. As you mention AMD is not the only player in the PC graphics market – but we are clearly having an impact that’s simply astonishing.

Q6) “The creation of Mantle was a selfish move designed to reduce the importance of the CPU in gaming with the ultimate goal of making AMD CPUs more competitive with Intel CPUs.” Does this have any substance?

No, this has no substance.

Having Mantle is a public-spirited move that allows games developers to fully expose the potential of any hardware which runs Mantle. When we publish the full SDK, that will mean Mantle allows Intel and NVIDIA to fully expose any untapped potential in their hardware too. Mantle is all about solving developers' problems. I find it hard to see how giving games players a better experience on our hardware can be seen as selfish. It's the developers and the players who benefit.

Reducing the importance of the CPU in gaming is a direction that must be considered. AMD's idea was so good that we're seeing others take a similar path, as they've realised that doubling down on the GPU as the sole arbiter of gaming performance is a great idea for gamers.

Before Mantle it was often the case that DirectX 11 or OpenGL based games would have artificial bottlenecks in them, which meant that the full potential of the platform was under-utilized. Mantle does a great job of removing those bottlenecks and allowing the games developer to deliver everything the platform is capable of. It's a win for developers and it's a win for games players. What's not to like?
 

jdwii

Splendid


Gamer is right about this, and I tested it on my Phenom and FX CPUs. For example, Dead Island Riptide uses 4 cores well, but if you lock it to 3 cores it performs the same with V-sync off. BF4 also uses all 8 cores well, but if I lock it to 5 it still performs the same. The one area where doing this helps, gamer, is AMD, since its cores get maxed out faster; but at the same time, that is why an Intel quad core still comes even with, or beats, an 8-core FX in those cases.
Edit: Not to mention a laptop core will get maxed out even more easily, meaning this game scaling well to 4 cores (might just be 3, like Sims 3) is a good thing.
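For anyone who wants to reproduce that test: a minimal sketch of the core-locking jdwii describes, assuming Windows and that you read the game's PID out of Task Manager (this does programmatically what Task Manager's "Set affinity" dialog does):

    // pin.cpp - restrict a running process to its first N logical cores.
    // Sketch only; minimal error handling.
    #include <windows.h>
    #include <cstdio>
    #include <cstdlib>

    int main(int argc, char** argv) {
        if (argc < 3) { std::printf("usage: pin <pid> <ncores>\n"); return 1; }
        DWORD pid = static_cast<DWORD>(std::atoi(argv[1]));
        int ncores = std::atoi(argv[2]);

        // Open the target with just enough rights to change its affinity.
        HANDLE h = OpenProcess(PROCESS_SET_INFORMATION, FALSE, pid);
        if (!h) { std::printf("OpenProcess failed: %lu\n", GetLastError()); return 1; }

        // Mask with the low ncores bits set: 3 cores -> 0b111, 5 -> 0b11111.
        DWORD_PTR mask = (static_cast<DWORD_PTR>(1) << ncores) - 1;
        if (!SetProcessAffinityMask(h, mask))
            std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());

        CloseHandle(h);
        return 0;
    }

With V-sync off, if the frame rate doesn't move as you shrink the mask, the extra cores weren't buying anything.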
 

jdwii

Splendid
The man who was talking to that guy kinda sounds like an idiot to me. But either way, I still see no point for Mantle to exist once DirectX 12 comes out; maybe to keep pushing Microsoft, but OpenGL Next should do that. We will see how many devs choose it after DirectX 12. I can clearly see it's a big hit; I mean, if GTA 5 is going to have it, that is probably the most amazing thing I ever heard about Mantle, since Rockstar and NVIDIA have always been major partners. But history isn't in AMD's favor; many APIs just don't survive, at least in terms of popularity. You have to admit, Juan, AMD really should have put Mantle on Linux too; that would have helped it a lot.
 


AMD's cores max out faster as they are weaker. That coincidentally increases the chance that one heavy thread can bottleneck that one core, which could decrease overall performance. AMD is much more sensitive to ANY workload where a single thread does a lot of work.

I'm also glad someone finally decided to do a test where they disable cores one by one and check if there's any detectable decrease in performance. BF4 scales to at least 8 cores, but as you said, dropping to 5 has no negative impact. That extra scaling isn't getting you any performance, yet carries additional overhead that has a global performance cost.
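That kind of test is easy to mock up outside a game too. A sketch (the spin workload and iteration count are made-up illustrative choices): hand a fixed amount of CPU-bound work to an increasing number of threads and time it. The time drops until the thread count hits the real core count, then flatlines; threads past that point are pure overhead.

    // scale.cpp - time a fixed workload split across 1..2x hardware threads.
    #include <algorithm>
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Busy work with a serial dependency chain inside each thread.
    static void spin(volatile double* out, long iters) {
        double x = 1.0;
        for (long i = 0; i < iters; ++i)
            x = x * 1.0000001 + 0.0000001;
        *out = x;  // keep the loop from being optimized away
    }

    int main() {
        const long total = 400000000;  // fixed total amount of work
        const unsigned hw = std::max(1u, std::thread::hardware_concurrency());
        for (unsigned n = 1; n <= hw * 2; ++n) {
            std::vector<std::thread> pool;
            std::vector<double> sink(n);
            auto t0 = std::chrono::steady_clock::now();
            for (unsigned i = 0; i < n; ++i)
                pool.emplace_back(spin, &sink[i], total / long(n));
            for (auto& t : pool) t.join();
            auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                          std::chrono::steady_clock::now() - t0).count();
            std::printf("%u thread(s): %lld ms\n", n, static_cast<long long>(ms));
        }
    }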
 

8350rocks

Distinguished


Ahh, but what would I have done with the FREE FX9590 that AMD sent me? Let it rot? I think not...

All in all, I built a case that would cost you $1600+ to reproduce for a few hundred simoleons...

I still have my FX8350 system sitting around, completely intact...honestly thinking about converting it entirely to Ubuntu 14.04 and using it as a dedicated workstation...
 

Ags1

Honorable
Apr 26, 2012
255
0
10,790


Free??? Some people have all the luck!

 

jed

Distinguished
May 21, 2004
314
0
18,780


Well, a free FX9590 is a good CPU to have no matter what you're running, but for anyone running an FX8350 to buy one is a waste of money.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The latest consoles devote six cores to games. We will see lots of new games optimized for six cores; BF4 is just one of them. Huddy has mentioned that six to eight cores is the target for game developers.
 

jdwii

Splendid


Juan, not that I'm disagreeing with you, but I don't really care what anyone from AMD, Intel, or NVIDIA says.
 


Again, consoles are a special case, due to having exactly one set of HW to code against. You can do all sorts of low-level optimizations that you CAN NOT DO ON A GENERAL PURPOSE PC. This is highlighted by previous generations of games that didn't scale to that many cores on a PC.

Look, I've worked on systems where you actually code against the HW. For example, an internal switch takes x ms to flip and settle, and a NOOP takes y ms, so I have to put in z NOOP instructions after I tell the switch to flip. Consoles are a lot like that, though at a much higher level, and you can still do those low-level code optimizations. Example: you can guarantee that data will be in specific locations in RAM, therefore it will take a constant amount of time to load that data into a CPU register, so you can do something else for exactly that amount of time before going back and finishing your calculation. By comparison, a PC will sit around doing nothing waiting for that data to get into a CPU register. That's the type of stuff you can get away with on any piece of specialized HW, and why a PS4 can play BF4 and an E4000-series C2D with a 7800 GTX can't.
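A toy version of that idea (everything here is hypothetical: the device, the register address, and the cycle counts are invented for illustration, and the code only makes sense on that imaginary fixed hardware, not on a PC):

    #include <cstdint>

    // Hypothetical memory-mapped control register for the "switch".
    volatile uint32_t* const SWITCH_CTRL =
        reinterpret_cast<volatile uint32_t*>(0x40000000);

    constexpr int SETTLE_CYCLES = 120;  // assumed: switch settle time, in CPU cycles
    constexpr int NOP_CYCLES    = 1;    // assumed: one NOP retires per cycle

    void flip_switch() {
        *SWITCH_CTRL = 1;  // tell the switch to flip
        // Pad for exactly the settle time. No status polling needed, because
        // on fixed hardware the latency is a known constant.
        for (int i = 0; i < SETTLE_CYCLES / NOP_CYCLES; ++i)
            asm volatile("nop");  // GCC/Clang inline assembly
        // The switch is guaranteed settled here; carry on with real work.
    }

That style of cycle counting is exactly what a general-purpose PC, with its varying CPUs, caches, and schedulers, can never promise.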
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
The thread title is "AMD CPU speculation... and expert conjecture". It is a bad title, in the first place because APUs are more important to AMD than CPUs, and second because we also want to discuss GPUs, SSDs, and SoCs.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Poor threading in the previous generation of PC games was a direct consequence of the hardware used in the old consoles, as game developers know.

My point is that 8 relatively weak cores @ sub-2GHz force developers to go wide. Once the engine/game is well threaded for the six cores available to games, porting to PC means that PCs with six/eight cores see those cores used. I gave a profile earlier for BF4 showing all the cores of an FX-6000 series chip loaded near 90%. And there are more profiles showing this.
 


Which was a highlight of DX11's Multithreaded Rendering, when it actually works. But as game devs know, DX11's multithreaded rendering model is very difficult to work with. That's one area that's getting fixed in DX12.

Also, I find it somewhat odd that you first state that the old generation of consoles was widely threaded (true), but then state that the same design is forcing devs to do the same on PCs, despite the fact they didn't do that last generation. Once again, you see something, then misunderstand what it means.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Are you implying AAA games for PS4 and Xbone have significant amounts of hand-written assembly?

EDIT: On an unrelated note, I just snagged four 6-core Opteron 8431s for $35 total. I'm really hoping we get a unified ARM/HEDT/server platform; it's going to make for some absolutely amazing multicore systems once new Opterons come out and people sell the old ones on eBay. If I had a 4-socket motherboard with the BIOS and power delivery of a HEDT board, I'd giggle like a schoolgirl all day long.

Still can't believe the price. It should be competitive with an Intel 8-core in multi-threaded work, and it cost only about 3.5% as much.
 
AMD talks a little bit more about Seattle
Hot Chips 26: The real deal in ARM A57-core SoCs
http://semiaccurate.com/2014/08/28/amd-talks-little-bit-seattle/
Same old stuff that was already known. AMD is very, very tight-lipped about performance.
A preliminary uncore for Project SkyBridge is here.
AMD chose 8 relatively weak cores instead of more powerful x86 cores because of the type of workloads this SoC will handle.
The last question was one AMD answered quite well: essentially, why would you need eight relatively weak cores on a die instead of big x86 ones? It all comes down to workloads, specifically non-cacheable workloads like, oh, search and social media. If a core only needs a small code block that operates on a large data set, a low-IPC core can be just as efficient as a bigger high-IPC core. Why? The work is dominated by cache misses and other types of stalls; loading from main memory is key. For this, a simple, cheap, and power-efficient core on an SoC with lots of RAM is just the ticket.
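The cache-miss argument is easy to demonstrate on any machine. A sketch (array size and hop count are arbitrary illustrative choices): a dependent pointer chase over a working set far larger than any cache leaves the core, wide or narrow, mostly waiting on DRAM, so per-core IPC barely matters.

    // chase.cpp - dependent loads over a 256 MB table; latency-bound on DRAM.
    #include <algorithm>
    #include <chrono>
    #include <cstdint>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    int main() {
        const size_t N = size_t(1) << 26;  // 64M entries * 4 B = 256 MB
        std::vector<uint32_t> next(N);
        std::iota(next.begin(), next.end(), 0u);
        std::shuffle(next.begin(), next.end(), std::mt19937{42});  // random permutation

        auto t0 = std::chrono::steady_clock::now();
        uint32_t i = 0;
        for (long hop = 0; hop < 10000000; ++hop)
            i = next[i];  // each load depends on the previous one: no ILP to hide misses
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        // Print i so the chase can't be optimized out.
        std::printf("%lld ms, end index %u\n", static_cast<long long>(ms), i);
    }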
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780

So how are K12 and "Zen" going to be positioned?

K12 for wide and undemanding tasks, x86 for highly demanding tasks that are the opposite of K12's, and then use as much of each chip (the uncore?) as possible in the other one?
 

jdwii

Splendid


You know what we've been seeing the last 100 pages: quotes and benchmarks from outside sources that aren't straight from the biased mouth. Also, I agree with Juan, the title should be changed.
 