AMD CPU speculation... and expert conjecture

Page 74 - Tom's Hardware community
Status
Not open for further replies.

jdwii

Splendid



Bigger improvement for iGPUs, even for Intel. As for gaming desktops with discrete video, it won't be more than a 5% improvement on average, if even that.
 


And that flat 20% is why I'm calling BS on them. Those CPUs each behave differently under multi-threaded loads. Skyrim can't use more than 2~4 separate processor units, with a very large emphasis placed on two of them. Windows Live Movie Maker (why the f*ck did they use this when HandBrake or another tool would have been better) uses 2~8 depending on the height of the movie being recorded, though you can manipulate it to use more or less. The i5 has four processing units, the 8350 has eight; they will not behave the same under that kind of mixed load.

The i5 would have 25~40% of its processing capacity underutilized by Skyrim and available for WLMM. The 8350 would have 50~65% of its processing capacity underutilized by Skyrim and free for WLMM. Using WLMM should have had a small effect on the 8350's performance, and a significantly larger effect on the i5, as it has fewer underutilized resources.

I need to know exactly what buttons they were pushing, looks too much like confirmation bias.

That's why Tech Syndicate got the results they did with XSplit, the 8350 simply had more available resources to handle the additional work even though the i5 had better single core performance.
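The spare-capacity argument above can be sketched as a quick back-of-envelope model. The core counts are real; the per-game load figure is an illustrative guess, not a measurement:

```python
# Rough model of how much CPU capacity is left for an encoder while a
# game runs. Numbers are illustrative assumptions, not benchmarks.

def spare_capacity(total_cores, cores_used_by_game, game_load_per_core=1.0):
    """Fraction of a CPU's total capacity left over for a second workload."""
    used = cores_used_by_game * game_load_per_core
    return 1 - used / total_cores

# Assume Skyrim effectively saturates ~2.5 cores:
i5_spare = spare_capacity(total_cores=4, cores_used_by_game=2.5)
fx_spare = spare_capacity(total_cores=8, cores_used_by_game=2.5)

print(f"i5 spare capacity:     {i5_spare:.0%}")   # roughly 38% free
print(f"FX-8350 spare capacity: {fx_spare:.0%}")  # roughly 69% free
```

Under those assumptions the 8350 has nearly twice the idle capacity to absorb the encoding workload, which is the crux of the argument.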
 


Interesting how this managed to get leaked; it seems people cannot keep quiet. Makes you wonder what the point of an NDA is.

They did experiment with a hybrid chip design, as I mentioned a page or two ago, and managed to achieve up to 60Gb/s bandwidth. They have hybrid controllers which allow the GPU component to switch between exclusive and shared memory space depending on the load. The CPU/IMC performance is greatly improved as well.

Obviously, because it's AMD it will be met with scepticism. I get that; I just wonder how much is envy.
 
None of this is actually very new; most of it has already passed numerous prototype stages. GDDR5 is capable of squeezing 40-50% extra life out of existing iGPUs, and needless to say, Kaveri will be a stock improvement of that much anyway. These new boards will be backward compatible with existing FM2 processors.

From what I heard, HD 8600-generation graphics cards are exclusive to Kaveri APUs, around 40% stronger than the HD 6600 parts they replace, and will support Dual Graphics mode with GDDR5 support, along with improved Dual Graphics integration and drivers.
 

$hawn

Distinguished
Oct 28, 2009
854
1
19,060


Well, I don't think I understood this.

I did a seminar on APUs last year, and if what I remember is true, I recall reading a brief note somewhere that in 'future' APUs (i.e., possibly Trinity or Kaveri), the CPU and GPU would be able to directly read each other's data from a single common memory space.

What I assume you are trying to say, on the other hand, is that the CPU and GPU will have a common memory address space, but one consisting of memory regions that are still exclusive to the CPU and GPU.
This is already being done, even on normal systems. If you have a 32-bit system with 512MB of integrated graphics and 4GB of system RAM, only 3.5GB of address space will be available (exclusively) to the CPU, while the remaining 512MB will be available (exclusively) to the GPU. Although the total 4GB actually belongs to the same RAM chips, neither is able to access and modify the other's data directly.

If AMD needs to reap the benefits of HSA, for example to allow the GPU to do calculations traditionally done by the FPU, then it would be very important for both the CPU and GPU to be able to read/write to a common memory.
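As a toy illustration of that 32-bit carve-out (same sizes as in the post above; the real layout is more complicated, this just shows the arithmetic):

```python
# Toy illustration of the 32-bit address-space split described above.
# The GPU aperture is subtracted from the total addressable range,
# leaving less space visible to the CPU.

GiB = 1024 ** 3
MiB = 1024 ** 2

ADDRESS_SPACE = 4 * GiB      # total 32-bit physical address space
GPU_APERTURE  = 512 * MiB    # carved out exclusively for the integrated GPU

cpu_visible = ADDRESS_SPACE - GPU_APERTURE
print(cpu_visible / GiB)     # 3.5 -> only 3.5 GiB left for the CPU
```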
 

amdfangirl

Expert
Ambassador


This move isn't so unexpected.

AMD did the same with DDR3 Sideport for IGPs.

Just back then the world + dog yawned at IGPs.

Silly world.

:p

 

$hawn

Distinguished
Oct 28, 2009
854
1
19,060


Well, they've had two memory controllers before with the Phenom II series: one for DDR2 and one for DDR3.

Making it mutually exclusive would be a bad idea: DDR3 would leave the GPU starved for bandwidth, while GDDR5 would saddle the CPU with high memory latencies. And what about the motherboards? Would there be two separate mobo versions, one supporting DDR3 and the other GDDR5?

IMHO, it's probably a mix of both on one motherboard; how exactly is the question.

I also wonder if this is the reason why AMD suddenly came out with the idea of entering the RAM market... are they planning to launch the world's first GDDR5 sticks for consumers? Like a small GDDR5 RAM slot for use by the GPU, which can be easily upgraded in the future? :)
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810



There's nothing new about a hybrid memory controller. AMD's graphics chips have been doing it for a while. You can get an HD 7750 with 4GB DDR3 or 1GB GDDR5 for the same price. Or save $30 and get a 1GB DDR3.
 

BeastLeeX

Distinguished
Dec 13, 2011
431
0
18,810



I know that that is the Sandy Bridge i3, and even though this is a GPU-bound game, the Hyper-Threading on the chip does not help at all. It even gets put to shame by the old Athlon II X4 620, which is clocked 500MHz lower. I want to see how AMD's old triple-core Rana does in this game.
 
Kaveri APU specs http://www.brightsideofnews.com/news/2013/3/6/analysis-amd-kaveri-apu-and-steamroller-core-architectural-enhancements-unveiled.aspx

1) ASRock has listed an FM3 board. The news is that although Kaveri may have a new socket, the socket itself hasn't changed, and all FM2 and FM3 parts are compatible with each other; FM2 parts just won't have all the features Kaveri has.

2) Has AMD found a way to reduce the latency effect of GDDR5 on x86 performance? If so, Kaveri may be GDDR5-only like the PS4; this is not an impossibility. That said, it looks like boards will come in DDR3 and, later, DDR4 variants.

3) How will GDDR5 be integrated? The first leaks say AMD is dropping conventional DIMMs and using integrated SoCs and other tested card/PCB designs, using less surface area than conventional boards. https://securecdn.disqus.com/uploads/mediaembed/images/447/920/original.jpg

 
I kind of got lost in your 3rd point, sarinaide... Are they going to put GDDR5 chips on package with the APU? Are they going to put them on another SoC-type of thing and slap it onto MoBos?

Cheers!
 


Remember, though, the 20% hit when using the second core of a BD module, so a module is only about 180% the performance of one of its own cores. Also factor in the IPC difference [say 20% on average, which isn't unreasonable given the benchmarks we have], and you get ((100 * .8) + (80 * .8)) = 144%, or a PD module being about 44% faster than an IB core. So theoretically speaking, a 4-module PD chip would be about ~144% the performance of a 4-core IB chip.

But remember, that is true ONLY when comparing the two at 100% load. The more time the processor spends idle, the lower its absolute numbers. For example, if an IB core were running at 100% and a PD module at 50%, the PD module would only deliver about 72% of the performance of the IB core, so core-loading aspects come into play as well.
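That module math can be written out explicitly. The two 0.8 factors (IPC deficit and second-core penalty) are the poster's rough assumptions, not measured values:

```python
# Sketch of the throughput model above. Both factors are the rough
# assumptions from the post (20% IPC deficit, 20% second-core hit),
# not measurements.

IPC_RATIO   = 0.8   # PD core IPC relative to an IB core
CMT_PENALTY = 0.8   # second core in a module runs at 80% effectiveness

def pd_module_throughput(load=1.0):
    """Throughput of one PD module, in units of one fully loaded IB core."""
    first  = 100 * IPC_RATIO               # 80
    second = first * CMT_PENALTY           # 64
    return (first + second) * load / 100   # 1.44 at full load

print(pd_module_throughput())      # 1.44 -> ~44% faster than one IB core
print(pd_module_throughput(0.5))   # 0.72 -> 72% of a fully loaded IB core
```

The half-load case is the point of the paragraph above: the module's theoretical advantage evaporates as soon as it sits partly idle.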

You could be seeing cache-related timing issues too; more cores in use increases the chance of the data being in the "wrong" L2 cache, necessitating a copy. Other low-level blocking issues could also come into play. Lower-order effects, sure, but effects nonetheless.

So yeah, I want core loading numbers, or preferably a GPUView chart showing me which threads are running on which cores, as well as the individual timing aspects of both applications.
 


[Image: TR proz.jpg]


A high level FPS chart seems to indicate good scaling to 4 cores.

[Image: TR proz intel.jpg]


About as expected. Reasonable loading across 4 cores; very little benefit from HTT. Some improvement with more cores, but nothing that should impact performance numbers [confirmed via the above image].

[Image: TR proz amd.jpg]


Two interesting things I noticed here:

1: Cores 3/4 of the 8350. Higher load here than on the 6300... might just be observation bias.

2: The 4300 with lower core usage than the 2600k. Based on that, I would guess TR is clockspeed sensitive, and usage numbers would decrease with OCs [and the reverse]. Either that, or HTT is having a REALLY adverse effect.

[Also, note the lack of the 3770k in these benchmarks. Just saying...]

But here's a point of note: despite the obviously better scaling of the 8350, note how that did NOT translate to better performance numbers. Why? Because none of the 4-core chips listed here are worked to capacity, and thus none are CPU bottlenecked in any way. Granted, the 8350 would be better off as far as multitasking is concerned, but for TR by itself, 8350 = 6300 = 955BE = 2600k, and so on.

So what's "weird" about the results? There's nothing here that wouldn't be expected of CPU cores not being loaded at 100%.

EDIT

I actually plan to look at the thread scheduling for TR via GPU View this weekend. So I plan to dig into whether HTT is causing that performance issue I noted above, as well as core loading aspects for the 2600k at stock speeds (specifically, how many threads are doing meaningful work, what cores they typically get dispatched to, etc). So if you have any specific questions on the numbers, now would be a very good time to speak up.
 
What's funny about this is that the dual cores are falling behind in another newly released game.
It appears that where this may once have been the exception on average, it is now becoming the rule: any dual core, where it's challenged, is worthless for gaming.
 
It would have been interesting to see how a Phenom X3 does, but since there are no 3-core CPUs for sale it doesn't matter. And why does no one benchmark the 3770 and the 3570? I really want to know how they compare in Crysis 3. Does anyone disagree that AMD is now always the way to go if you can't afford an i5 in a gaming PC?
 