AMD CPU speculation... and expert conjecture

see, what you call unneeded and redundant, i call relevant, important and verifiable. you call "myths" about latency, yet provide nothing verifiable to bust the myth. however, you did deem the following "needed", maybe that's why you posted it:

i'd like you to clarify what this means. i hope others verify what you state.

seems like you never read what other posters write. oh well...

in my speculation i am using gddr5 for the igpu while keeping ddr3 as main system memory. maybe yours is different. :LOL:
and the rest:
all stem from the initial misreading.


i wasn't even using DDR3 2133 MT/s. i was using ddr3 1600 (pc3 12800) as the baseline since it's the most widely available, one of the cheapest, and delivers around 10GB/s per channel according to the sandra benches i ran.
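as a quick sanity check on that figure (a sketch; the efficiency range is my rule-of-thumb assumption, not from the sandra run):

```python
# theoretical peak for one 64-bit ddr3-1600 channel
transfers_per_sec = 1600e6      # ddr3-1600 = 1600 MT/s
bytes_per_transfer = 8          # 64-bit channel = 8 bytes
peak_gb_s = transfers_per_sec * bytes_per_transfer / 1e9
print(f"peak: {peak_gb_s:.1f} GB/s per channel")  # 12.8 GB/s
# streaming benches typically hit ~75-85% of peak, i.e. ~9.6-10.9 GB/s,
# which lines up with the ~10 GB/s sandra figure above
```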
i see you're still mistyping MT/s as MHz despite cazalan's correction. i admit that i frequently make the same mistake. from what i read yesterday, i can tell the difference now.
 

juanrga

Distinguished
BANNED


As AMD mentioned during the last conference, the TAM for x86 is decreasing while it is increasing for ARM. For this reason AMD goes "ambidextrous": by offering OEMs pin-compatible x86/ARM solutions it gets to play in both markets, as well as benefit if one increases at the expense of the other. AMD also gave the following slide with market predictions:

[Image: AMD slide with x86/ARM market predictions]



Bulldozer's problem wasn't that it was designed for servers, but that the Bulldozer architecture was a complete failure and didn't work for servers! As a consequence, AMD's server market share declined to its current 4% or so.



I agree completely with this analysis. Well done!
 

juanrga

Distinguished
BANNED


So not only do you see the word "ARM" when I write about x86, but you also fail to see "ARM" when you write it explicitly in your own answer? Now I understand perfectly why AMD says one thing and you read the contrary.

AMD: K12 is a new high-performance ARM-based core.
8350rocks: No. K12 != ARM.

AMD: The TAM for x86 is decreasing, while it's increasing for ARM.
8350rocks: No. The TAM for x86 is increasing and it is gaining ground against ARM.

AMD: ARM and x86 cores will be treated as first-class citizens. ARM will win over x86 in the long run.
8350rocks: No, AMD considers ARM to be a niche market.

AMD: The Bulldozer family was a failure.
8350rocks: No. The Bulldozer family is not terrible. It does some things well.

AMD: We will abandon CMT and return to a classic SMT design.
8350rocks: No. AMD's new cores will be based on a redesigned CMT architecture (note: CMT is a version of SMT).

AMD: We promise a 10 TFLOPS APU by 2020.
8350rocks: No. AMD cannot produce a 10 TFLOPS APU right now.

AMD: All our 28nm products will be 28nm bulk.
8350rocks: Kaveri is delayed because it is made on 28nm FD-SOI, trust me.

AMD: We are migrating to 20nm bulk and then FinFET on bulk.
8350rocks: AMD is returning to FD-SOI for 20nm and then FinFET on SOI, trust me.

...
 

juanrga

Distinguished
BANNED


Nope. They are benchmarks that I have, which include HPC workloads and which, of course, use the same OS.



If only a tenth of the problems reported here are true, I understand why Keller, Koduri, Papermaster... left AMD for Apple.

And if what the new AMD CEO Rory Read is achieving is only a tenth of what he plans to do, then I understand why Keller, Koduri, Papermaster... returned to AMD.



Not only was the chief architect of the Bulldozer family fired, but it also cost the CEO his job, it cost most of the management team their jobs, it cost the vice president of engineering his job...

AMD's Feldman claims that AMD has a new team and "We are crystal clear that that sort of [Bulldozer] failure is unacceptable". Time will tell.
 

jdwii

Splendid


Yeah, since those benchmarks were invalid and not representative on average, I find it troubling that you seem convinced otherwise. Clearly you are not a reasonable man.
 

juanrga

Distinguished
BANNED


Sure, because performing three unit conversions instead of starting with the correct unit, or dividing a number by two and then multiplying it by two to get back the original number, falls under "relevant, important and verifiable".

I provided you latencies (in ns) for both GDDR5 and DDR3. I also provided two links proving my point that it is a myth, and another two links from forums discussing the origin of the myth. Of course, you didn't read any of that, and if you follow the same typical pattern you will ask me to give you the links or the numbers again. :sarcastic:



It is not my speculation but what AMD planned to do, as reported by many tech sites. :sarcastic:

I see that I didn't understand your speculation. Ok. I understand it now. Well... In the first place, it is odd and goes against the goal of integrating components. In the second place, it breaks fundamental aspects of Kaveri such as hUMA and HSA. In the third place, as reported in AMD's original docs, the DDR3 controller and the GDDR5 controller are incompatible; this is why AMD planned to support either GDDR5 or DDR3, but not both at once.



The idea of using quad channel but only with 1600MHz modules is even weirder!

Cheapest?

G.Skill RipjawsZ DDR3 1600MHz 4x4GB CL7: 151 €
G.Skill RipjawsZ DDR3 2133MHz 4x4GB CL9: 151 €

The slower RAM costs the same as the faster RAM. :lol:

Caza's inability to understand the difference between the I/O bus frequency and the data frequency (sometimes called the "effective frequency") resulting from the double-data-rate architecture is not going to change the standard way of referring to DDR3 memory speed.

G.Skill has a FAQ that explains these typical misunderstandings about memory. I'll copy some entries:

Q:
What is the difference between “DDR3-1600” and “PC3-12800”?

A:
There are two naming conventions for DDR memory, so there are two names for the same thing. When starting with “DDR3-“, it will list the memory frequency. When starting with “PC3-“, it will list the memory bandwidth.

DDR3-1333 = PC3-10666 or PC3-10600
DDR3-1600 = PC3-12800
DDR3-1866 = PC3-14900
DDR3-2000 = PC3-16000
DDR3-2133 = PC3-17000
DDR3-2400 = PC3-19200

To convert between the two, just divide or multiply by 8.
For example, 1600*8=12800 or 12800/8=1600.


Q:
Why does CPU-Z (memory tab) show only half the frequency speed of my memory kit?

A:
CPU-Z reports the DRAM’s operating frequency, but DDR (DOUBLE Data Rate) memory can carry two bits of information per cycle, so the effective frequency is double the operating frequency.

DDR memory is typically listed by their effective frequency. So if your memory kit is rated for 1600MHz, it will show as 800MHz in CPU-Z. (800*2=1600)

http://www.gskill.com/en/faq/Memory
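Both conventions, plus the CPU-Z halving, reduce to trivial arithmetic. A minimal sketch (the function name is mine; note that vendors round the PC3 number, e.g. 2133*8 = 17064 is marketed as PC3-17000):

```python
# Derive the common DDR3 labels from the transfer rate in MT/s.
def ddr3_labels(mts: int) -> dict:
    return {
        "ddr3_name": f"DDR3-{mts}",       # named after the transfer rate
        "pc3_name": f"PC3-{mts * 8}",     # x8: one 64-bit transfer moves 8 bytes
        "cpuz_clock_mhz": mts // 2,       # CPU-Z shows the I/O clock (DDR halves it)
        "peak_gb_per_s": mts * 8 / 1000,  # theoretical peak per channel
    }

print(ddr3_labels(1600))  # DDR3-1600 / PC3-12800 / 800 MHz / 12.8 GB/s
print(ddr3_labels(2133))  # DDR3-2133 / PC3-17064 (sold as PC3-17000) / 1066 MHz
```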

I am sorry, but I will continue referring to 1600MHz memory as... 1600MHz memory and to 2133MHz memory as... 2133MHz memory.
 

juanrga

Distinguished
BANNED


Considering that the claims were made about a concrete set of well-known benchmarks, that they were published, and that the final measurements coincided with the claims to within a few percent of error, the lack of reasonableness must lie elsewhere, especially when you are bringing back this old discussion once again.
 

8350rocks

Distinguished

Juan, post one single shred of evidence that anything you are saying is true.

When you cannot...take your ball and go home, because you are not contributing useful information to the x86 AMD conversation topic. Take your ARM nonsense, and go do whatever it is you do in your spare time besides troll this forum.

I am going to have to go over to the S|A forums and see what they did to get rid of you.
 

jdwii

Splendid


Well if we can't agree on the facts then what's the point?
 

Cazalan

Distinguished


I already corrected you and you make the same mistake again.
DDR3-2133 does not run at 2.133 GHz. It runs at 1.066 GHz.
It is called DDR3-2133 exactly because it achieves 2.133 GT/s.

Module assemblers like GSkill try to dumb that down for consumers but it is misleading.

Here you can see an actual part that a G.Skill DIMM would use, such as this Micron DDR3 part:
http://www.micron.com/products/dram/ddr3-sdram/ddr3-1-ghz

"Features: 2133 MT/s data transfer rate"

 

Cazalan

Distinguished


I realize you're just a layman here, but some of us are engineers, or studying to be engineers, and have had to actually implement DDR2/DDR3 memory controllers. If I feed the part a 2133 MHz clock, things can go boom.

 

truegenius

Distinguished
BANNED


as per my expectations:
you quote me and ask me questions instead of answering.
you don't know; you're just living in an imaginary world.
and instead of giving an answer, you just say that you already gave the answer to me, and to roy too.

This proves again that you don't have any idea about the hardware that was being discussed. You are shooting random numbers from your...
this was my question:
what would you like to say to those who use features like quad sli or crossfire?

and in 2020, if someone tries to play a game on a 4-monitor setup, each at 8k resolution, then how would these apus be able to store the game data needed for processing by the igpu? would amd use the crystals of some broken crystal ball?
i don't even need to know the hardware, because i am asking you to give the details of the hardware that will handle these things at that time,
and you don't have any answer, and you are too stubborn to admit it.
let me remake this question for the current time; then it would be like this:
"what hardware do i need to play and record gta 4 at 1080p on a 4-monitor setup, budget is 2k"
the answer would be something like this: http://www.tomshardware.com/answers/id-1966325/high-end-multi-monitor-set.html
did you see the list of configs there?
that is how you should reply to me:
"the future apu will have bada b bada bu bada bum, which eliminates the use of sli/crossfire"
oh wait, you can't give any answer, because you don't know any answer and don't know how to give an answer (i just saw your count of best answers), so the only thing you can do is trap me in my own question by asking me for the config of that apu

Are you admitting that your 8K comment was FUD and that you don't have any idea of the hardware that you are commenting on?
2011: galaxy s2 released with a WVGA display
2012: galaxy s3 with an HD display
2013: s4 with an FHD display
2014: g3 with a WQHD display
did you notice the increase in display resolution?
so tell me, do you think 8k on pc won't be possible by 2020, and that is why you are saying my question was fud?
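for scale, a rough estimate of what just the display buffers in that 4x 8k scenario would need (my own illustrative assumptions: 32-bit color, triple buffering; textures and game data would add far more on top):

```python
# size of one 8K front buffer, then all buffers for 4 monitors
width, height, bytes_per_px = 7680, 4320, 4   # 8K UHD at 32-bit color
monitors, buffers = 4, 3                      # triple buffering assumed
frame_mb = width * height * bytes_per_px / 2**20
total_gb = frame_mb * monitors * buffers / 1024
print(f"{frame_mb:.0f} MB per frame, ~{total_gb:.1f} GB for buffers alone")
# ~127 MB per frame, ~1.5 GB just for display buffers
```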

This is all in your imagination. I never said to kill ram or HDD.

so tell me, what does it mean?
different memory for the cpu and igpu, which is not for hsa,
or killing ram?
for an apu which is aimed at hsa, the first option was not an option anyway, so what i am left with is "killing ram".
or is it doing something like the xbox one, to give a gaming experience of 720p @ 30fps?

The 5x myth was debunked before. The upgradability issue was also replied to before...

i didn't see any post where paladin admitted that dgpus will not be able to provide 4-5x more performance than an apu.
indeed, i always saw something like this:

so where did you debunk it, juan? imaginary land!



where are your first, second, third, or 4th, 5th, 6th, 7th, 8th times? (quote and link, so that i can check them)
i got it:
all of these are located in imaginary land.
gotcha.
i will say it again (instead of just saying that i already said it 20 times): reduce the dose

A genius doesn't make claims about hardware and later ask what the hardware configuration is. :sarcastic:
so would you tell me the config of your apu, and the year too?
or did you already do that 20 times in some imaginary land?
 

juanrga

Distinguished
BANNED


And now you decide that a thread about future AMD products is "the x86 AMD conversation topic". One half of AMD's CPU/APU products cannot be discussed because you don't like them? :lol:



This is funny, because we are currently having this discussion in the K12 thread, and several posters there agreed with me that they want an 8-core ARM APU. Why don't you go there and explain to us your "K12 != ARM" and "ARM cannot scale up" fantasies?
 

juanrga

Distinguished
BANNED


And you continue repeating the same confusion despite being corrected and given a G.Skill FAQ link that explains the same thing to you.

You continue confusing the clock frequency of the I/O bus with the frequency of data transmission.

That "2133 MT/s data transfer rate" corresponds to an effective frequency of 2133 MHz.

Micron presented the new DDR4 modules recently, and they mention that the (effective) frequency starts at 2400 MHz.

Here you have a table from Hynix mentioning a frequency of 2133 MHz for DDR3-2133 memory, a frequency of 1600 MHz for DDR3-1600 memory, and so on.

What is really interesting in this useless discussion is that I am not forcing you to use frequencies; you can continue using data transfers. But you want to force me and the rest of the memory industry to use data transfers. WOW!

[Image: Hynix datasheet table listing DDR3-2133 at 2133 MHz, DDR3-1600 at 1600 MHz, etc.]
 

juanrga

Distinguished
BANNED


I suppose that you are asking about the bold part. Well, it means neither your "different memory for cpu and igpu that is not for hsa" nor your "killing ram". Those two are only in your imagination. This shows again that you continue pretending to criticize hardware that you don't understand. :sarcastic:



:rofl:



And that post from him was adequately replied to. Of course, you missed the answer; how unsurprising! :sarcastic:
 

juanrga

Distinguished
BANNED


Let me guess:

Vendor A (the "Graphics Mafia") == Nvidia
Vendor B (the "idiots with software") == AMD
Vendor C ("They don't really want to do graphics") == Intel
 

jdwii

Splendid

Like I said, some people live in it; others live in reality.
 

truegenius

Distinguished
BANNED

did you see that i had written something else there?
"so tell me what does it mean"
it is in english, so i don't think it was that hard for you to understand; you just ignored it.


yeah, current gen consoles are making pc gamers laugh at them


adequately replied to, or ignored?



let me remind you of my reply again:
i didn't see any post where paladin admitted that dgpus will not be able to provide 4-5x more performance than an apu.
so where did you debunk it, juan? imaginary land!
give me an answer, juan.
show me where he agreed with you that future gaming dgpus will be well behind apus,
or let me hear from paladin himself whether he agrees with you that future gaming dgpus will be behind apus' igpus.

and let me oversimplify all the discussion going on.
here, i accept that i don't know the abcd of computers (and admit that playing the flute to a buffalo (you) is a waste).
so take my question as a fresh question (and assume that i missed all the previous 263 pages of discussion, so don't talk crap like "i said it in my earlier posts").
now, will you enlighten me by giving me the specs of your apu, plus its year and cost? (keep in mind that the 390x is supposed to have 4224 gcn cores, a 2x core count in 3 years, so you can imagine the performance of a single-die dgpu by 2020, let alone dual-gpu cards and multi-card configs, which is what your apu is going to take on; and it is only 2015, so your apu should be well ahead of the 390x)
and how will it cope with future high-resolution gaming? what resolution are you expecting this apu to handle for lag-free gaming?
 
AMD's new strategy looks sound to me.

Being able to mix and match x86, GCN and ARM logic on processors is fantastically flexible.

Their big cores do need some work, although I think most of the problems with the earlier cores have been a matter of *when* they were released rather than the products themselves being fundamentally that bad. Phenom I was late (and there was a bug, admittedly). Phenom II sorted out most of the problems, was actually a pretty good product, and was competitive against Core 2 Quad. Phenom II X6 was a server part re-purposed as an answer to the first-gen Core processors; however, that was when Bulldozer was intended to be released (and actually Bulldozer doesn't look so bad against the first-gen Core i7). Trouble was, Bulldozer got delayed until well into the life cycle of Sandy Bridge.

One thing I have noted: AMD's high-core-count CPUs have been aging pretty well. I'm running a Phenom II X6 which I got years ago, and it's still fine now. I think part of this 'the pc market is doomed' idea comes from the fact that PC sales have dropped, but a large part of that is that PCs last so long now.

My parents have only just replaced their main system, which was rocking an Athlon 64 X2 5000 from the year it was released (2005 I think). For web browsing and light-duty stuff that was more than enough power for them. Unless you're into the competitive gaming scene, there is little reason to replace a PC other than hardware failure (in my parents' case, the ancient IDE hdd was pretty much dead).
 
i've been reading more on memory; it took a lot of time to understand a few things. most of this is new to me.

the above is a straw man argument. ironically, your straw man weakens your own claims, i.e. the bw numbers you posted. if you mock the underpinnings of the end results that you posted yourself, you're only ridiculing yourself. it would have been valid if the end bw numbers didn't match up or if the calculations were wrong. the fact that you failed to provide details of your own claims and went on trying to ridicule the mechanisms shows your lack of understanding and research. this is as far as you go on this matter. i don't want to be a part of your fallacy anymore.

i read them. it's not that they all lack credibility; some do, because random forum users arguing in a thread don't make for a credible source. still, the data sheets and latency calculations showed the absolute latencies being similar, or close. if i take that as correct, then it pretty much invalidates your claim of ddr3 in pcs being "slow" and gddr5 being "beyond" ddr3. but if those are just gddr5 proponents trying to force their point home using mathematics instead of measured findings, that's another story. why? because then other factors like cost, memory access protocols, power use, complexity etc. come into play.
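the calculation in question is simple enough to sketch here (the cl/clock pairs below are illustrative values i picked, not numbers from the linked datasheets):

```python
# absolute (first-access) latency in ns from CAS cycles and command clock
def cas_latency_ns(cl_cycles: float, cmd_clock_mhz: float) -> float:
    return cl_cycles / cmd_clock_mhz * 1000

print(cas_latency_ns(9, 800))    # ddr3-1600 cl9        -> 11.25 ns
print(cas_latency_ns(11, 1066))  # ddr3-2133 cl11       -> ~10.3 ns
print(cas_latency_ns(15, 1250))  # 5 GT/s gddr5 at cl15 -> 12.0 ns
# higher cycle counts at higher clocks land in the same ~10-13 ns band,
# which is the "similar absolute latency" point above
```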
i do ask you to provide more information though. for example, an explanation of the following:

i'd like you to clarify what this means. hopefully, others will verify what you state.

i wasn't concerned with amd's plans or other tech sites. i was trying to imagine kaveri as a mainstream consumer product aimed at casual pc gaming.

it complies with the integration; it actually takes advantage of it. using onboard gddr5 ram as vram eliminates the need for a cape verde-class discrete gfx card and enables a lot of gaming performance in an intel nuc-like enclosure. ecs has released such a motherboard very recently. a bit of tweaking of that concept could easily make it real. another advantage would be the ability to use an onboard discrete gpu with dedicated vram, reducing the vertical size of the whole pc.

i don't know much about huma, but i explicitly ignored hsa because, at present, no games support hsa. since the apus are in mainstream consumer parts, those pcs are highly likely to be replaced by something fully hsa-compliant and running widely available hsa-enabled software in the future. right now, the hsa aspects of kaveri only benefit software developers and enthusiasts who want to play around with hsa; there's no appeal to the casual crowd.

if true, this is an important factor, since amd was in a position to choose either but not both. i speculated that amd chose both (at the expense of die area). imo the gain in igpu bandwidth would have been worth the extra die space (for an aggregate 128-bit bus).
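a rough peak-bandwidth comparison of that aggregate 128-bit gddr5 pool against plain dual-channel ddr3 (the 5 GT/s gddr5 speed is my assumption for a cape verde-class part):

```python
# theoretical peak bandwidth of a memory bus
def peak_gb_per_s(bus_bits: int, transfer_rate_gts: float) -> float:
    return bus_bits / 8 * transfer_rate_gts  # bytes per transfer x rate

print(peak_gb_per_s(128, 5.0))  # 128-bit gddr5 @ 5 GT/s -> 80.0 GB/s
print(peak_gb_per_s(128, 1.6))  # dual-channel ddr3-1600 -> 25.6 GB/s
```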


pricing changes on a daily basis. ddr3 prices have been on the rise since memory manufacturers shifted to mobile, which explains why the 1600 kit is so expensive. i'll still address this one: i saw the price you posted and immediately suspected why you didn't use a u.s. shopping site. it turns out that the ddr3 2133 cl9 kit is 1.6v and is a fair bit more expensive. that proves that pricing is not really a good excuse in this case of an imaginary apu. if i really had to pick a quad-channel kit, i'd pick one with 1.5v ddr3 2133 4x 8GB, or 16GB when they become available. if i had to go cheap, i'd pick a random 4x 2GB ddr3 1600 kit for around $90.

aww. you're trying, in futility, to play a game of semantics using naming conventions and actual measurement units. poor try.

that's your personal choice and i respect that.
 


Pretty much. AMD/ATI is well known for HORRENDOUS OGL driver support over the years.

Vendor D would be Qualcomm, whose mobile OGL drivers don't even come close to meeting the OGL spec. You have no idea how many times I've seen open source projects have to work around OGL problems in their driver stack.
 