THIS is correct.
Ryzen has a very, very steep voltage/frequency curve, meaning that past a certain point (which varies slightly due to manufacturing differences between individual CPUs) the required voltage increases significantly with each additional 100MHz, so 4.5GHz becomes much, much harder than 4.4GHz. The COOLING requirement also goes up significantly as the voltage rises, but even with sufficient cooling the voltage itself would be a problem.
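*To make the shape of that curve concrete, here's a tiny sketch (the numbers are made up purely to illustrate the shape; the knee point and steepness values are assumptions, NOT measurements from any real chip):

```python
# Illustrative only: shows how required voltage can climb steeply
# near the top of the frequency range. Numbers are invented to
# demonstrate the shape of the curve, not measured from any CPU.
def required_voltage(freq_ghz, base_v=1.05, knee=4.0, steepness=0.9):
    """Flat-ish below the 'knee', rising sharply above it."""
    if freq_ghz <= knee:
        return base_v + 0.05 * (freq_ghz - 3.0)
    # Past the knee, each +100MHz costs progressively more voltage.
    return base_v + 0.05 * (knee - 3.0) + steepness * (freq_ghz - knee) ** 2

for f in [4.0, 4.1, 4.2, 4.3, 4.4, 4.5]:
    print(f"{f:.1f}GHz -> {required_voltage(f):.3f}V")
```

Notice how each extra 100MHz step costs more voltage than the one before it, which is exactly why 4.5GHz is so much harder than 4.4GHz.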
Also, driving the voltage too high will DEGRADE your CPU over time. The video below is about Raven Ridge (2000-series APUs), but I'd assume the voltage guidance Steve at Gamers Nexus gives applies to your desktop CPU as well... so as a quick note I suggest no more than 1.25V, but do watch the entire video (ignore the iGPU references and substitute "SOC" with "CPU"):
https://www.gamersnexus.net/guides/3251-raven-ridge-soc-voltage-guidelines-how-to-kill-cpu-with-safe-voltage
*I can't overemphasize how important it is to watch that video.
5.0GHz is fantasy land.
I also suggest settling 200MHz below whatever you crash at. If 4.5GHz causes crashing but 4.4GHz does not, then drop down to 4.3GHz to ensure better stability... CPUs degrade slightly over time, and the higher the voltage the faster they degrade. Also, just because 4.4GHz passes testing now, there's no guarantee it stays stable under various conditions such as fluctuating PWM output, etc.
I assume this is for gaming. For most games, going above 4.2GHz won't make much difference with that CPU and the settings you'd get with an R9-390.
It would matter most in certain SHOOTERS running at 1920x1080, but with that GPU and those settings the CPU bottleneck most likely sits at a high FPS, such as 120FPS, so even if you managed a 10% higher overclock, that extra 10%, if achievable, would only take you from 120FPS to 132FPS (to oversimplify).
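*Back-of-the-envelope version of that math (a simplified model that assumes FPS scales linearly with clock while fully CPU-bound, which is the best case; real games are rarely 100% CPU-bound):

```python
# Best-case FPS gain from a CPU overclock: only applies while the
# game is actually CPU-bound; if the GPU is the limit, extra CPU
# clock does nothing. Treat the result as an upper bound.
def fps_after_oc(current_fps, old_ghz, new_ghz, cpu_bound=True):
    if not cpu_bound:
        return current_fps  # GPU-limited: clocks don't help
    return current_fps * (new_ghz / old_ghz)

print(fps_after_oc(120, 4.0, 4.4))  # 132.0 -> the "10%" example
```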
So... my advice is to concentrate on STABILITY. Your gaming experience actually depends more on learning how to tweak the game settings, which in turn depends on your monitor's refresh rate, resolution and whether you have FreeSync.
*Quick example: in a few games like Assassin's Creed Unity, GTA V and Max Payne 3 I force on Adaptive VSync (Dynamic VSync in RadeonPro is the same thing). I know this worked with RadeonPro, and NVidia Control Panel supports it too.
On a 60Hz monitor, if you enable VSYNC you lock to 60FPS (assuming you can output 60FPS) and buffer frames to stay in sync with the monitor's refresh cycle. This is mainly to avoid SCREEN TEARING... unfortunately, if VSYNC is ON but you can't output sufficient FPS (below 60FPS on a 60Hz monitor), you get added STUTTERING, which varies by game. Force on Dynamic/Adaptive VSYNC and VSYNC gets auto-disabled as needed.
This basically trades STUTTER for a bit of SCREEN TEARING, which is usually preferable. You'd tweak the game settings so that drops below the target happen less than 10% of the time... in AC Unity (and Batman AK, I believe) swinging/jumping between buildings apparently increased the load, so the FPS would drop then recover, but I'd get a sudden, big STUTTER which ruined the experience a bit... with Adaptive VSYNC I got a small amount of screen tear instead. Far better. (Max Payne 3 would do this rare thing of dropping to HALF the refresh rate, so it would drop to 30FPS VSYNC'd, which is horrible: the lag suddenly increases and the game feels much more sluggish.)
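*If it helps, here's roughly the per-frame decision Adaptive/Dynamic VSync is making (a conceptual sketch only, NOT any driver's actual implementation; the 60Hz figure is just the example above):

```python
# Conceptual per-frame logic behind Adaptive/Dynamic VSync.
REFRESH_HZ = 60

def present_frame(frame_time_ms):
    achievable_fps = 1000.0 / frame_time_ms
    if achievable_fps >= REFRESH_HZ:
        # Fast enough: wait for the next refresh (no tearing,
        # output locked to 60FPS).
        return "vsync on -> locked to 60FPS, no tearing"
    # Too slow: present immediately rather than waiting out a
    # full refresh (some tearing, but no added stutter).
    return "vsync off -> slight tearing instead of stutter"

print(present_frame(12.0))  # ~83FPS possible -> synced to 60
print(present_frame(20.0))  # only 50FPS possible -> vsync dropped
```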
Other quick tips related to VSYNC:
1) 144Hz+ monitors
a) VSYNC OFF, optionally with an FPS cap via a tool like NVInspector (or RadeonPro?)... a high Hz/FPS ratio makes screen tearing less obvious since many refreshes show no changed data on screen. A slower game like Tomb Raider at 50FPS with VSYNC OFF on a 144Hz screen works great.
b) Adaptive/Dynamic HALF Refresh will sync to 72FPS and auto-disable VSYNC as needed. Preferable if screen tearing annoys you, so again you'd tweak settings so drops below 72FPS aren't frequent (see the sketch below).
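*Same idea as the earlier sketch, just with the target halved (again conceptual, assuming the 144Hz example above):

```python
# Adaptive HALF refresh on a 144Hz panel: sync to 72FPS when the
# game can hold it, otherwise drop vsync (conceptual, not driver code).
REFRESH_HZ = 144
TARGET_FPS = REFRESH_HZ // 2  # 72FPS

def present_frame(frame_time_ms):
    achievable_fps = 1000.0 / frame_time_ms
    if achievable_fps >= TARGET_FPS:
        return "synced to 72FPS (one frame every 2nd refresh)"
    return "vsync off -> slight tearing instead of stutter"

print(present_frame(13.0))  # ~77FPS possible -> locked to 72
print(present_frame(16.0))  # ~62FPS possible -> vsync disabled
```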