AMD Ryzen 5 1500X CPU Review

Page 3
Status
Not open for further replies.

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


[Image: 2jxiq1.png, cooler temperature comparison]


https://www.youtube.com/watch?v=VNjkDoYZYjU

The 212 EVO reduces temps by 20°C under full load vs. the Wraith Spire! It is worth buying a 212 EVO at $29, because you can't overclock much higher anyway, so don't waste your money on a more expensive cooler!

 

InvalidError

Titan
Moderator

Or just overclock 100MHz less, reducing your power dissipation by 10-20W, forgo the 212+/EVO to save $30, and leave the CPU at a cozy ~70°C.

 

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


@InvalidError, you could do that, but then it wouldn't compete with Intel processors in gaming at that price point. That's why I suggested the 1400 and the 212 EVO. Note that the graph shows the Wraith Spire, not the actual cooler that comes with the 1400; the Wraith Spire is a better cooler than the one bundled with the 1400.
 

InvalidError

Titan
Moderator

100MHz more or less makes hardly any difference in most game benchmarks, and the Spire is only marginally better than the Stealth: only ~5°C better in the comparisons I have seen on 1400s OC'd to 3.9GHz. That is as high as many (most?) Ryzen chips will go without going stupidly high on voltage and extra heat for next to no extra performance.
 

Retrogame

Distinguished
Aug 31, 2007
34
0
18,530
"Mediocre performance" impressions are exactly the thing to be cautious about. That's why follow-up coverage about how to realistically match hardware is useful. OK, sure, the most expensive CPU + most expensive GPU lets you run a game at 170 FPS on a cheap 1080p screen that doesn't even have FreeSync or G-Sync, so you can't see the difference. Versus a cheaper CPU + cheaper GPU = "only" 90 FPS in the same game, and you STILL can't see it, since you've got a 60 Hz panel.

Perhaps a middle ground. Initial release, fine, match the CPU with the most GPU power you can find. Then, follow up by testing a few common configurations.
 

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


Actually, from the tests I've seen, which are numerous, Ryzen scales extremely well with higher CPU clocks, as well as with memory clocks in programs and games that utilize them. The range I've seen in actual overclocking is 3.9-4.1GHz across the board. And you have valid points about Ryzen requiring much higher voltage to get those last 100, 200, or 300MHz. Also, of course, that would free you up to get Samsung B-die RAM and overclock it to 3480MHz or 3600MHz, which could net you more performance depending on your motherboard.

"Best performance with Samsung B-die is achieved at 145 MHz REFCLK with 2400 DRAM ratio (3480 MHz) at 11-10-10-10-22 timings. Alternatively 135 MHz REFCLK with 2666 DRAM ratio and 11-11-11-11-22 timings. This depends on the CPU/MB/DRAM capability."
http://www.legitreviews.com/ddr4-memory-scaling-amd-am4-platform-best-memory-kit-amd-ryzen-cpus_192259/6
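The arithmetic behind those numbers is simple: the DRAM ratio (2400, 2666, etc.) assumes the stock 100 MHz reference clock, so the effective data rate scales linearly as REFCLK is raised. A quick illustrative sketch (function name is my own, not from the article):

```python
def effective_dram_mhz(refclk_mhz, dram_ratio):
    """Effective DDR4 data rate when the AM4 reference clock is raised.

    The DRAM ratio (e.g. 2400, 2666) is defined at a stock 100 MHz
    REFCLK, so the real speed scales linearly with REFCLK.
    """
    return dram_ratio * refclk_mhz / 100.0

# The two Samsung B-die configurations quoted above:
print(effective_dram_mhz(145, 2400))  # 3480.0 MHz
print(effective_dram_mhz(135, 2666))  # ~3599 MHz, i.e. the "3600" setup
```

This is why a 145 MHz REFCLK with a nominally lower 2400 ratio actually ends up slower than 135 MHz with the 2666 ratio.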

You can see on this graph with the R7 having an overclocked CPU and RAM does work well in some games.

[Image: 2573qrr.png, R7 gaming benchmarks with overclocked CPU and RAM]
 
I'm curious where this leaves the R3 in terms of target market. Normally you'd figure on the budget/office machine segment, but the lack of integrated graphics is a huge sticking point in terms of added cost. Trying to undercut a G4560 with enough left over for a GPU doesn't seem feasible. Going after the i3 gaming crowd makes more sense, and should put a spotlight on cores vs. clocks and single-thread performance. Interesting to see what AMD chooses to do here.
 
@TMTOWTSAC - fairly certain the R3s are aimed at Intel's i3 range.
The issue here is that when Intel dropped the hyper-threaded Pentium at near half the price of the i3, it was either incredibly stupid or incredibly clever.
I'm sure they took a big share of AMD's lower-end APU sales while also massively cutting into their own i3 sales.

Very strange one at the really low budget end now; nothing touches the G4560 on a performance-to-price basis, and it's doubtful anything ever will.
 

InvalidError

Titan
Moderator

AMD's Raven Ridge APUs will hopefully get in that territory.
 

s4fun

Distinguished
Feb 3, 2006
191
0
18,680


Yep. It is basically Bulldozer, Excavator, pile of $hit driver all over again. AMD can try to distract us with that multi-threaded encoding stuff, but it doesn't cut the mustard. For AMD to be the obvious choice, they need to be priced at least 33% less than whatever they currently charge. If they want to convince people to take a chance on them, they need to reduce the risk to early adopters.
 

s4fun

Distinguished
Feb 3, 2006
191
0
18,680


Keep telling yourself that. Anyone with an objective perspective can see that you have had your money stolen from you. AMD is stealing from you.
 

neblogai

Distinguished


Do not forget that the R5 1400 has half the cache, which results in a real-life performance loss of ~5% in games (equal to an R5 1500X at -200MHz max clock). The 1400 + 212 EVO combination is good, but there are equally good combinations with the 1500X, and one combination is better than the others: +€25 gets you the R5 1600 with the Wraith Spire.
 

psiboy

Distinguished
Jun 8, 2007
180
1
18,695


You misunderstand my point. I'm saying DX12 is supposed to use more CPU cores if you have them, and this doesn't seem to be the case.
 

InvalidError

Titan
Moderator

What I was responding to was some bogus correlation between DX12 and AI. If a game developer used GPU-based AI, that would be using DirectCompute, not DX12. As for the rest, how much DX12 can possibly gain from extra threads or cores is largely limited by how efficiently threaded the software is in the first place. DX12 alone can't do a darned thing for software that isn't properly optimized for heavy threading.
 

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


This is true. Look at how well Ryzen performs in synthetic gaming benchmarks. It should be much more competitive vs. Intel than it actually is in reality.
 

psiboy

Distinguished
Jun 8, 2007
180
1
18,695


So what you're saying is DX12 is good, but games still need to be coded properly for it... yes?
 

InvalidError

Titan
Moderator

Any change in the software and hardware stack usually requires that programmers figure out how to avoid possible quirks.
 