Origin PC Eon11-S: Great Gaming Performance From A Tiny Notebook?



People mistake DDR3 and GDDR3 because of their similar performance and similar naming convention, but they're not the same.

GDDR3 is a technology that is based on DDR2.
DDR3 is also based on DDR2, but it's a separate branch from GDDR3; the two are not the same.
GDDR5 is to DDR3 as GDDR3 is to DDR2: basically, it's a derivative of DDR3. It incorporates many of the changes that distinguish GDDR3 from plain DDR3, with some additions such as an extra clock that lets it mimic a quad data rate.

So, we have all of these similar names, and people often mix them up. There is no context in which it is correct to call one of them by another's name: DDR3 is never GDDR3, and vice versa. Some graphics cards have GDDR3 memory (common on many older cards) and some have DDR3 (common on low-end newer cards), but again, these are different memory interface technologies.
For example, the Radeon 4850 has GDDR3 memory, while many newer low-end VLIW5-based cards use plain DDR3, such as the 5670/6670, 5570/6570, 5550, 6450, and 5450.
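If it helps to see the "quad data rate" point as numbers, here's a rough sketch (my own illustrative figures, not official specs) of how the effective transfer rate relates to the memory clock for each of these technologies:

[code]
# Rough sketch: effective transfer rate per pin vs. memory clock.
# DDR3 and GDDR3 move two transfers per clock (double data rate);
# GDDR5's extra clock lets it move four transfers per command clock.
TRANSFERS_PER_CLOCK = {"DDR3": 2, "GDDR3": 2, "GDDR5": 4}

def effective_rate_mtps(memory_clock_mhz, tech):
    """Effective transfer rate in MT/s for a given memory clock."""
    return memory_clock_mhz * TRANSFERS_PER_CLOCK[tech]

# Example: the same 1000 MHz memory clock on each technology
for tech in ("DDR3", "GDDR3", "GDDR5"):
    print(tech, effective_rate_mtps(1000, tech), "MT/s")  # 2000, 2000, 4000
[/code]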
 

army_ant7
Hm... Wait... Are you saying that "DDR3" (not "GDDR3") on graphics cards (which you said is based on DDR2) is different from the "DDR3" we commonly see used with system RAM? :)
 


No, that's not what I'm saying. DDR3 is still DDR3 on a graphics card and on a system memory module AFAIK. I'm saying that it's not the same as GDDR3, although some people incorrectly use the terms as if they were synonymous just like some people incorrectly call GDDR5 memory DDR5.
 

army_ant7
Ah... Okay. So they use plain "DDR3" RAM on graphics cards as well... That's something nice to know. Thanks! I wonder... Does that make those kinds of cards (DDR3) have about the same bandwidth as AMD's APUs (if we hypothetically disregard the CPU's usage of the system RAM)? :)

I just got confused by this.
Sorry about that. The thing is, when I hear "based on," I think of GDDR5's relationship with DDR3. But I see that you meant DDR3 is based on DDR2 in the sense that it's built up on it, i.e. an upgrade to a newer generation. Hehe...
 


Not necessarily the exact same bandwidth (depends on the frequency), but fairly similar, yes.
 

army_ant7
Oh yeah! We (or maybe just I) forgot about that! That should make a big difference. Though I wonder if the memory controllers on graphics cards tend to scale better than dual-channel system RAM, as well as triple- and quad-channel.
 


Well, yes, there is that, but most of the graphics cards that compare with the Trinity APUs are 128-bit cards.

128-bit cards have comparable bandwidth to a dual-channel DDR3 system at about the same frequency, and 64-bit cards have comparable bandwidth to a single-channel DDR3 system at about the same frequency.
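For anyone who wants to check that arithmetic, peak bandwidth is just the transfer rate times the total bus width. A quick sketch (assuming DDR3-1600-class memory on both sides, purely as an example):

[code]
# Peak bandwidth in GB/s = data rate (MT/s) * bus width (bits) / 8 / 1000
def peak_bandwidth_gbs(data_rate_mtps, bus_width_bits):
    return data_rate_mtps * bus_width_bits / 8 / 1000.0

# 128-bit graphics card vs. dual-channel (2 x 64-bit) system memory
print(peak_bandwidth_gbs(1600, 128))      # 25.6 GB/s
print(peak_bandwidth_gbs(1600, 64) * 2)   # 25.6 GB/s

# 64-bit graphics card vs. single-channel system memory
print(peak_bandwidth_gbs(1600, 64))       # 12.8 GB/s
[/code]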
 


CPU performance scales poorly going from single to dual to triple or quad channel memory. I'm pretty sure that IGP performance doesn't scale so poorly with it.
 

army_ant7
Oh, we thought of the same thing, blaz (at least the dual-channel part). :lol: I also thought of something I didn't mention: AMD should get to making triple- or quad-channel memory controllers for their APUs. Hehe... Though I wonder if 1866 (or 1600) MT/s in dual-channel is already enough, since graphics performance already shows quite the diminishing returns from there, at least for this generation of APU GPUs.
 

army_ant7
I was referring to IGP performance of course. :)
 


Triple/quad-channel DDR3 controllers most certainly would still help. Just look at the Radeon 6670 DDR3 versus its GDDR5 counterpart: the GDDR5 version wins, often by large margins. The A10s have an IGP that performs fairly similarly to the Radeon 6670's GPU, so I'd expect it to perform more like the GDDR5 variant of the 6670 if the A10s had double their current channel count with decent memory.
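To put rough numbers on that (using typical retail configurations that I'm assuming here, so treat them as ballpark figures rather than exact specs):

[code]
# Same peak-bandwidth formula as before: MT/s * bus width (bits) / 8 / 1000
def peak_bandwidth_gbs(data_rate_mtps, bus_width_bits):
    return data_rate_mtps * bus_width_bits / 8 / 1000.0

print("6670 DDR3  (128-bit, ~1600 MT/s): %.1f GB/s" % peak_bandwidth_gbs(1600, 128))  # ~25.6
print("6670 GDDR5 (128-bit, ~4000 MT/s): %.1f GB/s" % peak_bandwidth_gbs(4000, 128))  # ~64.0
print("A10, dual-channel DDR3-1866:      %.1f GB/s" % peak_bandwidth_gbs(1866, 128))  # ~29.9, shared with the CPU
# Doubling the A10's channel count would roughly double that last figure.
[/code]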
 

army_ant7
Hm... That does make sense. I think I was thinking of the HD 6670 (because you and others tend to compare it a lot to the highest end APU IGP), but lost the thought at some point when I was making that post.
Here's a question: even though memory clock rate and bus width both affect bandwidth, do they individually affect performance in different ways? For example, would a 2133 MT/s 64-bit memory system perform exactly the same as a 1066 MT/s 128-bit memory system? Because I find those diminishing returns with the APU IGPs at a relatively low bandwidth peculiar.
 


Theoretically, those memory systems should perform identically, but in practice, that might not be the case.
 
I honestly haven't done a test to see if width and frequency affect real-world memory bandwidth equally. Maybe I'll try that on my home computer with my main system memory sometime (use one channel at DDR3-1600 and two channels at DDR3-800; I'll try to equalize latency to reduce variables).

It's possible that the higher frequency method is superior because you'll usually have lower latency that way, but IDK for sure and it may be more situation-dependent. For example, let's compare normal DDR3-1066 timings to normal DDR3-2133 timings:

DDR3-1066 1.5V is usually between 7-7-7-18 and 7-7-7-21.
DDR3-2133 1.5V is usually between 9-11-10-30/10-10-10-30 and 11-11-11-31.

From that, we can see that a common dual-channel DDR3-1066 1.5V memory kit will probably have the same peak bandwidth as a single-channel DDR3-2133 1.5V kit, but in real time the DDR3-2133 kit is probably going to have significantly lower latency.

I don't know if the same observed *rule* applies to memory on video cards, but it's food for thought.
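To see the latency difference in actual time rather than cycles, you can convert CAS latency with the I/O clock, which runs at half the transfer rate. A quick sketch using the timings quoted above:

[code]
# CAS latency in nanoseconds = CL cycles * 2000 / data rate (MT/s),
# since the I/O clock period is 2000 / data_rate_mtps nanoseconds.
def cas_latency_ns(cl_cycles, data_rate_mtps):
    return cl_cycles * 2000.0 / data_rate_mtps

print("DDR3-1066 CL7:  %.1f ns" % cas_latency_ns(7, 1066))    # ~13.1 ns
print("DDR3-2133 CL11: %.1f ns" % cas_latency_ns(11, 2133))   # ~10.3 ns

# Dual-channel DDR3-1066 and single-channel DDR3-2133 have roughly the same
# peak bandwidth (2 x 64 bits x 1066 MT/s vs. 1 x 64 bits x 2133 MT/s), but
# the higher-clocked kit gets there with noticeably lower absolute latency.
[/code]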

Which diminishing returns phenomenon for the APUs are you referring to?
 
Guest
[citation][nom]Estix[/nom]It's based off a Clevo barebones, so it's the same as the Sager NP6110, which starts at $899: http://www.sagernotebook.com/index [...] ame=NP6110 I just wish they'd offer a screen better than 1366x768 (at least 1440x900 or such)[/citation]
I was about to say the exact same thing; Sager is better priced at every level than the competition.
 
[citation][nom]burmese_dude[/nom]DDR5 version: http://www.amazon.com/Clevo-W110ER [...] uctDetails
Straight from the Clevo spec:
Graphics: NVIDIA GeForce® GTX 650M with 2GB of GDDR5 graphics memory[/citation]

That's GDDR5 (a technology based on DDR3), not DDR5. There's no such thing as DDR5 yet, at least not as a marketing product. We don't even have DDR4 yet. Current product references to DDR5 are always mistaken and are actually referring to GDDR5.
 

Crashman
[citation][nom]gunbust3r[/nom]1366x768 makes it an automatic no[/citation]
But you didn't say the screen was too small. Nearly anyone who thinks that a screen this small needs more pixels is simply wrong... due to font sizes and so forth, most people who need more pixels also need a bigger screen.
 
Guest
A 650M is insanely weak. How would it compare to one of the faster Trinity APUs? Yeah, I know the CPU would be slower, but the graphics aspect should be better with the built-in 7660G graphics. I haven't seen too much on this, but it would be very interesting to see how they compare. It may still be a win for the massively pricier Intel/Nvidia combo, but the vastly cheaper and less power-hungry APU could be a very nice setup for a laptop.
 

army_ant7
I'm guessing you aren't a "Retina display" kind of person then? Hehe...

No offense intended, and in fact I myself am unsure if I would notice any difference between a regular-DPI screen and a "Retina Display." And even if I do notice a difference, I would wonder if the price premium (over normal-DPI devices) would be worth it, as well as the added performance requirement (which might affect both idle and active power consumption). One place I could see "Retina Displays" showing an advantage is with less aliasing (as Apple advertised) and probably less of a need for AA.
 