Ryzen 5 2600X overclock to 5.0GHz

Status
Not open for further replies.
I would like to overclock to 5.0GHz but am unable to do this on my own. I have it at 4.2GHz right now.
I'd be happy with even 4.5GHz.

System specs:
Windows 10
MSI 470 Pro motherboard
R9 390 GPU
Ryzen 5 2600X
G.SKILL TridentZ RGB Series 16GB DDR4-3200
I also have a liquid cooler, a Corsair H100i.

I'm just not sure what to change.
 

THIS is correct.
Ryzen has a very, very steep voltage/frequency curve, meaning that beyond a specific point, which varies slightly due to manufacturing differences between each CPU, the required voltage increases significantly with each additional 100MHz, so 4.5GHz becomes much, much harder to reach than 4.4GHz. The COOLING requirement also goes up significantly as the voltage increases, but even with sufficient cooling the voltage itself would be a problem.
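
To make that curve concrete, here's a toy Python sketch with invented numbers (these are NOT measured values for any 2600X, just the shape of the problem; every chip's knee sits somewhere different):

```python
# Toy voltage/frequency curve; values are invented for illustration only.
vf_curve = {
    4.0: 1.20,  # GHz : hypothetical required Vcore
    4.1: 1.24,
    4.2: 1.29,
    4.3: 1.36,
    4.4: 1.45,
    4.5: 1.58,  # past the "knee", each 100MHz costs far more voltage
}

prev_v = None
for ghz, vcore in vf_curve.items():
    step = f"  (+{vcore - prev_v:.2f}V for +100MHz)" if prev_v else ""
    print(f"{ghz:.1f}GHz -> {vcore:.2f}V{step}")
    prev_v = vcore
```

Note how the last 100MHz step costs several times more voltage than the first one; that's the curve working against you, and it's also why 5.0GHz isn't on the table.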

Also, driving the voltage too high will DEGRADE your CPU over time. The video below is about Raven Ridge (2000-series APUs), but I'd assume the voltage guidance Steve at Gamers Nexus gives is applicable to your desktop CPU too... so as a quick note I suggest no more than 1.25V, but watch the entire video (ignore the iGPU references and substitute "CPU" for "SOC"):
https://www.gamersnexus.net/guides/3251-raven-ridge-soc-voltage-guidelines-how-to-kill-cpu-with-safe-voltage

*I can't overemphasize how important the video is to watch.

5.0GHz is fantasy land.

I also suggest settling 200MHz below whatever you crash at. If 4.5GHz causes crashing but 4.4GHz does not, then drop down to 4.3GHz to ensure better stability. CPUs degrade slightly over time, and the higher the voltage, the faster they degrade. Also, just because 4.4GHz passes testing now, there's no guarantee it's stable under all conditions, such as fluctuating PWM output.
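
As a trivially concrete version of that rule of thumb (numbers are just the example above):

```python
first_unstable_mhz = 4500          # lowest clock that crashed in testing
safety_margin_mhz = 200            # suggested headroom for long-term stability
daily_clock_mhz = first_unstable_mhz - safety_margin_mhz
print(f"{daily_clock_mhz}MHz")     # -> 4300MHz, i.e. 4.3GHz for daily use
```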

I assume this is for gaming. For most games, going above 4.2GHz won't make much difference with that CPU and the settings you'd run with an R9 390.

It would matter most for certain SHOOTERS running at 1920x1080, but with that GPU and those settings the CPU bottleneck most likely lands at a high FPS, such as 120FPS, so even a 10% higher overclock (if achievable) would only get you from 120FPS to 132FPS (to oversimplify).
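
The arithmetic behind that, as a quick sketch (assuming FPS scales linearly with core clock when fully CPU-bound, which is itself generous):

```python
cpu_bound_fps = 120       # example CPU-limited FPS at the current clock
overclock_gain = 0.10     # hypothetical 10% clock increase
print(cpu_bound_fps * (1 + overclock_gain))   # -> 132.0 FPS, best case
```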

So... my advice is to concentrate on STABILITY. Your gaming experience actually depends more on learning how to tweak the game settings, which in turn depends on your monitor's refresh rate, resolution, and whether you have Freesync.

*Quick example: in a few games like Assassin's Creed Unity, GTAV, and Max Payne 3, I force on Adaptive VSync (Dynamic VSync in RadeonPro is the same thing, and NVidia Control Panel supports it too).

On a 60Hz monitor, if you enable VSYNC you lock to 60FPS (if you can output 60FPS) and buffer frames to stay in sync with the monitor's refresh cycle. This is mainly to avoid SCREEN TEARING... unfortunately, with VSYNC ON you'll get added STUTTERING (which varies by game) whenever you can't output sufficient FPS (below 60FPS on a 60Hz monitor)... force on Dynamic/Adaptive VSYNC and VSYNC gets automatically disabled as needed.

This basically trades STUTTER for SCREEN TEARING, which is usually preferable. You would tweak the game settings so that drops below the target happen less than 10% of the time. In AC Unity (and Batman: Arkham Knight, I believe) I would swing/jump between buildings, which apparently increased the demand; the FPS would drop and then recover, but I'd get a sudden, big STUTTER which ruined the experience a bit. With Adaptive VSYNC I got a small amount of screen tear instead. Far better.

(Max Payne 3 would do this rare thing of dropping to HALF the refresh rate, so it would drop to 30FPS VSYNC'd, which is horrible as the lag would suddenly increase and the game became much more sluggish. Adaptive VSYNC would instead drop me into the 50s FPS range with VSYNC OFF and no obvious lag increase... in fact, slightly lower FPS, but the VSYNC buffering is gone.)
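
Here's a rough Python sketch of the per-frame decision Adaptive/Dynamic VSync makes (illustrative logic only, not any driver's actual implementation):

```python
def present_mode(instant_fps: float, refresh_hz: float) -> str:
    """Adaptive VSync, roughly: sync when you can keep up, tear when you can't."""
    if instant_fps >= refresh_hz:
        return "VSYNC ON"    # lock to refresh: no tearing, buffered frames
    return "VSYNC OFF"       # let frames through: some tearing, no big stutter

for fps in (75, 61, 58, 30):    # hypothetical instantaneous frame rates
    print(f"{fps}FPS on 60Hz -> {present_mode(fps, 60)}")
```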

Other quick tips related to VSYNC:
1) 144Hz+ monitors
a) VSYNC OFF, and optionally cap with a tool like NVInspector (or RadeonPro?)... a high Hz/FPS ratio produces less obvious screen tearing since some frames have no changed data on screen. A slower game like Tomb Raider at 50FPS with VSYNC OFF on a 144Hz screen works great.

b) Adaptive/Dynamic HALF Refresh will sync to 72FPS and auto-disable VSYNC. Preferable if screen tearing is annoying; again, you'd tweak settings so drops below 72FPS are infrequent.
 
Solution


Alright, thanks man. That helps a lot. I have it at 4.2GHz right now, so I guess I should be happy with that.
 
Just FYI:
You're probably in the ballpark of my i7-3770K @ 4.4GHz for most games (obviously not for highly threaded tasks). Possibly slightly better in gaming, but if so, not by much per my tests. I have a GTX 1080, have tested lots of games, and don't have much difficulty getting at least 60FPS on High/Ultra at 2560x1440 in most of them.

If you're curious, the ONLY way to estimate where the bottleneck occurs in gaming is to:
1) set up an OSD to display GPU USAGE, and
2) turn VSYNC OFF so the FPS isn't capped.

(You can't go by CPU usage, because a game will rarely use all the cores even when the CPU is the bottleneck. StarCraft 2 uses either two threads or two cores; if it's two THREADS, and if Windows treats all threads as equal (which they are not), then each "thread" of the CPU is 1/12th of your CPU's total performance. So if a game can only use TWO of them, that's only 1/6th of the CPU's capability, plus whatever Windows etc. uses... thus StarCraft 2, if the above is accurate, might only show around 20% usage even when the CPU is the bottleneck.)
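
Here's that arithmetic as a quick sanity check (assuming, as above, that Windows weights all 12 hardware threads equally, which it doesn't quite, plus a guessed few percent of background load):

```python
hw_threads = 12        # Ryzen 5 2600X: 6 cores / 12 threads
game_threads = 2       # e.g. StarCraft 2, per the assumption above
os_overhead = 0.03     # rough guess for Windows/background tasks

expected_usage = game_threads / hw_threads + os_overhead
print(f"~{expected_usage:.0%} total CPU usage while fully CPU-bound")  # ~20%
```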

To oversimplify (because the GPU USAGE reading isn't 100% accurate):
If the GPU shows 95% then there's very little CPU bottlenecking. At 100% there's none.

If it showed 80%, you have a big CPU bottleneck. With a 6-core CPU it's unlikely that additional cores/threads would help, so it comes down to the performance of each individual core, and thus OVERCLOCKING should have an almost proportional effect on performance (i.e. a 25% overclock should get you to roughly 100% GPU usage).
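
The proportionality works out like this (a sketch; real GPU-usage readings are noisy, as noted):

```python
gpu_usage = 0.80   # observed GPU usage with VSYNC OFF
# If the CPU is the limiter, the GPU idles 20% of the time. To feed it fully,
# the CPU must get 1/0.80 = 1.25x faster, i.e. roughly a 25% overclock.
needed_gain = 1 / gpu_usage - 1
print(f"~{needed_gain:.0%} faster CPU to approach 100% GPU usage")
```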

I don't monitor games constantly, but since changing game SETTINGS usually increases the FPS for me, it's almost certain I don't usually have a CPU bottleneck. Most settings hit the GPU harder, such as anti-aliasing, resolution, textures, etc. (In some rare cases, like Bethesda's Prey, your STORAGE drive may be the bottleneck... I moved Prey from an HDD to an SSD and the horrible, constant stutter just disappeared. It was literally stuttering constantly. WTF? It's texture loading, but doing that with unused VRAM available is very poor programming, especially in a modern game.)

Also, I'm not sure what MONITOR you have, but FREESYNC works really well on a quality Freesync monitor. Unfortunately the good high-refresh HDR models are expensive, and I don't know if they even have OVERDRIVE (the cheaper ones don't, AFAIK), which helps a bit with ghosting; still, a good 144Hz, 2560x1440, IPS monitor works really well...

Example: https://pcpartpicker.com/product/c298TW/asus-monitor-mg279q

The high refresh rate also makes DESKTOP usage feel a lot smoother, in addition to avoiding the need for VSYNC below 144FPS (if I had such a monitor, I'd just set a GLOBAL CAP at 140FPS to stay in Freesync mode all the time).
 