Does High-Speed DDR3 Help AMD's FX? Four 8 GB Kits, Reviewed

Status
Not open for further replies.
[citation][nom]Crashman[/nom]You're showing me a JEDEC DDR3-1600 CAS9? Here I thought we were talking about performance RAM.Since we're talking about two different things, there's a good chance that we're not actually in disagreement.As for the 1.28V thing, well, I actually left the JEDEC updates for low-voltage RAM out of this discussion of non-low-voltage RAM.[/citation]
Find posts of mine at TH where I recommend faster-than-DDR3-1600 CAS 9 RAM; it'll be a long, l-o-n-g search. For the AMD FX line, 9 times out of 10 I'll recommend DDR3-1600 CAS 9, and in a few very rare cases where the OP 'might' get a tad more performance (based on their use case, and after I've lectured them on price per performance) then maybe DDR3-1866, i.e. native to their CPU.

To clarify, I had (2) separate ideas and statements:
1. Possible reasons the (2) kits failed
2. AMD 'optimized' RAM (like the picture above)

Real world: run a RAM Disk and there you'll find some gains from the increased memory frequency, relative to the CPU in question and its rated 'Max Memory Bandwidth' per spec.
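(A quick way to sanity-check that kind of claim yourself is a crude copy-throughput sketch like the one below. The buffer size and iteration count are arbitrary choices of mine, and a Python loop measures the runtime as much as the RAM, so treat the number as relative, not absolute.)

```python
# Rough memory-copy throughput sketch -- NOT a real RAM-disk benchmark,
# just a way to see relative differences between memory configurations.
import time

def copy_bandwidth_gbs(size_mb=256, iterations=8):
    """Copy a large buffer repeatedly and report approximate GB/s."""
    src = bytearray(size_mb * 1024 * 1024)
    start = time.perf_counter()
    for _ in range(iterations):
        dst = bytes(src)  # one full read+write pass over the buffer
    elapsed = time.perf_counter() - start
    # Count bytes read plus bytes written per iteration.
    return (2 * len(src) * iterations) / elapsed / 1e9

if __name__ == "__main__":
    print(f"approx. copy bandwidth: {copy_bandwidth_gbs():.1f} GB/s")
```

Run it twice with the same kit at different DRAM frequencies and compare the two numbers; the absolute figure means little on its own.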

My observation, again, concerns the (2) sets that failed and the probable reason for failure (i.e. compatibility): mainly, their frequency-to-CAS combination isn't ideal for the AMD CPU you tested -- and that's it for this wild goose chase of an argument.

And yep I like both DDR3L & DDR3U IF your CPU can run them! :)

In the 'Big Picture' I really don't care. AMD has always had a mediocre, sensitive IMC, and I NEVER recommend DDR3-2400 to anyone -- I don't even care for those kits -- too much of a PITA and too many BSODs for my appetite, whether on Intel (SB, SB-E, or IB) or, for sure, AMD FX. Yeah, I can get them to work, but since I use a RAM Disk I don't want to risk outputting, say, a couple of hours of SQL 'garbage' or some other failure. For gaming, there are multitudes of benchmarks already out there.
 

msgun98

Distinguished
Apr 26, 2010
67
0
18,630
0
[citation][nom]noob2222[/nom]take a small enough sample to make sure the results don't prove a thing. why not use some programs that do respond to memory speed?[/citation]

Throwing a benchmark with DDR3-1066 at an i7 K-series processor is like buying a 680 and powering it with a Diablotek 350 W PSU.
Seeing that most people with an i7 K-series processor are most likely already looking at DDR3-1600, there really is NO significant difference. Unless you think 1.16 FPS is worth the cost of buying DDR3-2133 RAM.
 

ashinms

Honorable
Feb 19, 2012
155
0
10,680
0
Good to know. I just bought an FX-8150 and am running DDR3-1600. It's definitely a word of consolation that I'm not losing out by not getting higher-end RAM.
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
3
I would have liked to see this memory performance comparison run on an APU instead of using a discrete GPU. In particular, a recent Intel vs AMD APU article hinted that the AMD APU could do better with faster RAM. Here's some much faster RAM - let's see what it does for APU performance, both for games, and again for apps.
 

Rockdpm

Splendid
I am sure we could argue about which is better, Intel's memory controller and its performance boost vs. AMD's. But can we just get along and have a positive discussion?
 

eddieroolz

Splendid
Moderator
I don't think the performance increase justifies the cost increase, to be honest. Unless we have a fairly comprehensive list of applications that benefit from higher RAM speeds, I think it's better to stick with around DDR3-1866.
 

SinisterSalad

Distinguished
Mar 5, 2008
457
0
18,810
14
60% more cost for a 6% performance gain? On one application tested? I'll pass. I don't care if it's only $25 more; that's 5 or 6 more beers I can get. Priorities, man. Priorities.
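(The back-of-envelope math behind that shrug can be made explicit. The prices and FPS figures below are made up to mirror the "~60% more cost, ~6% more performance" point; they are not numbers from the review.)

```python
# Illustrative cost-per-extra-frame calculation; all figures here are
# hypothetical, chosen only to match the rough ratios discussed above.
def marginal_cost_per_fps(base_price, base_fps, fast_price, fast_fps):
    """Dollars paid for each extra frame per second gained."""
    return (fast_price - base_price) / (fast_fps - base_fps)

base_price, base_fps = 45.0, 60.0   # hypothetical DDR3-1600 kit
fast_price, fast_fps = 72.0, 63.6   # ~60% pricier, ~6% faster

extra = marginal_cost_per_fps(base_price, base_fps, fast_price, fast_fps)
print(f"${extra:.2f} per extra FPS")  # -> $7.50 per extra FPS
```

Seen that way, each additional frame per second from the faster kit costs several times what the baseline frames do.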
 

mousseng

Honorable
Apr 13, 2012
672
0
11,060
48
Wouldn't it have been better, since this was about improving Bulldozer CPUs, to have included some game benchmarks that were CPU-dependent as well? Because I thought Metro and Dirt3 (despite being a 'physics-based' game) were relatively lenient on the CPU.

It's definitely good to have those two though, to show whether a change would occur in something less affected by CPU performance. Overall I thought it was a good article.
 

redeye

Distinguished
Apr 29, 2005
225
0
18,710
9
Sad thing is... it does not matter... thus I will keep my OC'd 1100T @ 4 GHz, 3200 MHz HT, DDR3-1600 8-8-8-24. (BTW, when I benched the FX-8150 in my system it was slower -- though it used 10 watts less power at idle -- and I could not OC it to 4.9-5.0 GHz, which would have made the 8150 faster.) So I wish for the day I can replace my system with an Intel 3770...

 

taltamir

Distinguished
May 9, 2008
18
0
18,510
0
1. Your charts are useless, since you are comparing RAM of identical performance from different manufacturers (which, lo and behold, gets identical performance in testing!) rather than comparing RAM at different performance levels (i.e., each chart should contain DDR3-1600, 1866, 2133, and 2400 compared for a single test).

2. Does High-Speed DDR3 Help AMD's FX? No, nothing can help the abortion that is BD
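(The chart layout taltamir is asking for is just a pivot of the data: group every memory speed under each test instead of every test under each speed. A sketch, with placeholder numbers rather than measurements from the article:)

```python
# Pivot benchmark results so each chart/test groups all memory speeds.
# The scores below are placeholders, not figures from the review.
results = {
    "DDR3-1600": {"x264 pass 2 (fps)": 20.1, "7-Zip (s)": 131},
    "DDR3-1866": {"x264 pass 2 (fps)": 20.6, "7-Zip (s)": 128},
    "DDR3-2133": {"x264 pass 2 (fps)": 20.9, "7-Zip (s)": 126},
}

by_test = {}
for speed, tests in results.items():
    for test, score in tests.items():
        by_test.setdefault(test, {})[speed] = score

for test, scores in by_test.items():
    print(test, scores)  # one chart's worth of data per test
```

Each entry of `by_test` then holds exactly the data one per-test chart would plot.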
 

iLLz

Distinguished
Nov 14, 2003
102
0
18,680
0
Interesting to say the least. I guess for AMD it would just be better to get the best priced RAM from a reputable manufacturer.
 

silverblue

Distinguished
Jul 22, 2009
1,198
4
19,285
0
[citation][nom]Crashman[/nom]Please, at least try to be honest. You already knew the Madshrimps article was explicitly dishonest because it used CAS 8 timings at all speeds (even 2133). Real-world memory supports tighter timings at lower frequencies, and requires looser timings at higher speeds. Madshrimps gimped the low-speed tests and boosted the high speed configuration intentionally.[/citation]

I found a real gem in their article about going from a 2.2 GHz NB up to 2.6 and then 3.2 GHz, and the approx. 2.5% performance boost an entire 1 GHz of NB provides:

"Nice scaling again with higher NorthBridge speeds."
Sure, it might make more of a difference in actual games, but is the noticeably higher power consumption worth 5%? You'd do far better just increasing the multiplier and going from stock to 4GHz.
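(Putting rough numbers on the two overclocking routes: the 2.5%-per-GHz NB figure comes from the discussion above; the 3.6 GHz stock clock is the FX-8150's base, and near-linear scaling with core clock is an optimistic assumption of mine.)

```python
# Compare the two tuning routes discussed above. The NB figure is the
# ~2.5%-per-GHz estimate from the thread; near-linear core scaling is
# an optimistic assumption, so treat the second number as a ceiling.
nb_gain = 2.5 * (3.2 - 2.2)        # % gain from +1.0 GHz northbridge
core_gain = (4.0 / 3.6 - 1) * 100  # % gain from 3.6 -> 4.0 GHz core

print(f"NB route:   ~{nb_gain:.1f}% gain")
print(f"core route: ~{core_gain:.1f}% gain")
```

Even under that optimistic assumption for the core route, the multiplier bump buys several times what a full gigahertz of northbridge does.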

In addition, I should add that the article goes all NVIDIA with some of its bar graphs, which can indeed emphasise differences far more than it should.

Bottom line - Bulldozer is the first incarnation of a very different architecture, and the one surefire way of improving performance with it is to increase clock speed. Future incarnations should be more efficient but for now, this is what we have. Assuming you take the limit off the CPU, it won't perform badly, but drop details or add in a second GPU and it just cannot match even Intel's mid-range models. AMD knew it wouldn't meet expectations... hence the focus on "unlocked" and the liquid coolers. Piledriver won't close the gap to Intel massively, but you can be sure that it'll be easily quicker and more frugal than Bulldozer.

And whatever happened to that B3 stepping, anyway?
 

dalethepcman

Distinguished
Jul 1, 2010
1,636
0
19,860
27
I would gladly pay $25 for a 6% speed increase on my computer, but then again I wouldn't buy a Bulldozer unless I was running Linux (Piledriver for Win8, maybe?) and would instead invest that $25 into an Intel chip.

Pretty much every benchmark points out that an i3 at 3.1 GHz is fast enough for basically anything you can throw at it for 100% of consumer-grade computers. All in all, the AMD chips, while they aren't the king of performance, do just fine for 95% of the computing world, and I see them as less of a failure than NetBurst. (1.4 GHz P4 slower than an 800 MHz P3, anyone?)

I am enjoying this time of fast CPUs for $200, and I will dread the day AMD stops competing against Intel on the high end, as we will quickly return to the >$1000 EE CPU days.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
32
If AMD doesn't improve the IPC of their CPUs, they'll be out of business. I really like and respect AMD, but right now I'll be buying Ivy Bridge. I hope AMD has a good CPU by 2015 or I'll be forced to buy Intel again. The same applies to GPUs: if their GPUs continue to get outperformed by Nvidia at the same price, I'll buy Nvidia. Now that I'm gaming at 1080p, I can't afford to buy lower performance for more money.
 

SuperVeloce

Distinguished
Aug 20, 2011
154
0
18,690
1
[citation][nom]dalethepcman[/nom]Pretty much every benchmark points out, that an I-3 at 3.1Ghz is fast enough for basically anything you can throw at it for 100% of consumer grade computers.[/citation]
lol, 100%? Not everyone is only surfing/playing games. And even gamers... not everyone plays badly written games that can only use one or two cores.

The P4 was slower because of its extra-long pipeline when compared at the same clock, not at half the clock speed. lol
 

saturn85

Honorable
Mar 26, 2012
10
0
10,510
0
I guess high-speed RAM will show a difference in Folding@home.
Hope to see a Folding@home benchmark with high-speed RAM.
 

robot_army

Distinguished
Jan 18, 2010
10
0
18,510
0
Would it be possible to see one of these kits working with an APU, to see how the CPU-GPU combo benefits from fast RAM, and 8GB vs. 4GB?
 

callmesmorz

Honorable
Feb 25, 2012
15
0
10,510
0
[citation][nom]noob2222[/nom]take a small enough sample to make sure the results don't prove a thing. why not use some programs that do respond to memory speed?[/citation]
This IS an article about Bulldozer.
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
0
[citation][nom]SuperVeloce[/nom]lol 100%? Not everyone is only surfing/playing games. And even gamers... not everyone plays badly written games, which can only use one or two cores.P4 was slower because of its extra-long pipelines compared at the same clock, not at half speed. lol[/citation]

Actually, you're wrong. The pipelines were long, but they were double-pumped, and at any rate, when they went from 20 to 31 stages (Prescott), performance stayed pretty much the same (due to other enhancements).

The vast majority of people who think they know spit this out as the reason, but the main reason was that the trace cache was only about 50% effective, and because only one instruction per cycle could be decoded into the trace cache, 50% of the time the processor was running as a scalar processor. When it was running out of the trace cache, performance was very good, but when it wasn't, which was quite often, performance was abysmal.
 