Why Raven Ridge is awesome

DeadnightWarrior

Just my 2 cents...

I was wondering if I'm getting excited for nothing, or if it's indeed a small miracle.
While I was playing games like Doom, Mass Effect Andromeda, The Witcher 3 and Fallout 4 on my Ryzen 5 2400G rig, at a certain point I realized I had completely forgotten I was running without a graphics card!

True, I'm using a low 720p resolution and I did have to invest in RAM, but seriously, do you remember any other iGPU capable of running games of that caliber, even at high details, without breaking a sweat? Yeah, Doom 2016 runs at high details!

Up until now, an APU could work fine for e-sports titles and little more, but here's a processor that can effectively wipe out the entire low-end video card category.

These are of course just a couple of my personal feelings about this; it's been years since I found myself amazed at a CPU's potential (and it's the very first time I'm actually using a PC with no graphics card as my main system!).

What do you think about this little wonder called Raven Ridge?
 


Tell us about the rest of your system. What motherboard are you running with? Which memory, how much and at what speed?

Have you overclocked any part of it (CPU/GPU/RAM)?

The Witcher 3 at high detail, even at 720p, does seem amazing on an iGPU. Have you tried 1080p gaming? How much do you have to compromise on detail settings?

Are you playing Doom with Vulkan? If so, how is it with OpenGL?
 


I have a Gigabyte AB350 Gaming, 16 GB of G.Skill TridentZ 3200 RAM (Hynix A-die, no problem @3200) and an M.2 NVMe drive.
I actually can't try 1080p because of my display, which is an old 5:4 1280x1024 model.
I tried a mild overclock on the CPU (no more than 200 MHz) and I didn't touch the GPU. The Wraith Stealth doesn't allow much headroom, to be honest.

Doom is definitely running on Vulkan; I haven't tried OpenGL yet (I've heard some awful performance reports about that).

As for the other games I played, I can say Mass Effect Andromeda runs fine at medium details. Some stuttering here and there, but you have to consider it's quite a badly optimized game: when I had 8 GB of RAM it would crash every five minutes or so, while now with 16 GB it seems stable.

Fallout 4, no problem whatsoever, medium/high details and smooth as silk.

The Witcher 3 is maybe the most difficult to run, but with things like HairWorks off, it's still a playable experience.
 
First, do note that you're playing at a lower resolution, which puts a lower load on the iGPU and a higher one on the CPU.

Vega 11 has similar performance to a GTX 650 Ti, which you can buy for $30-50 on eBay.
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-650-Ti-vs-AMD-RX-Vega-11-Ryzen-iGPU/2189vsm401440

You can also buy a used GTX 970 off eBay for $150, which crushes the Vega.
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-970-vs-AMD-RX-Vega-11-Ryzen-iGPU/2577vsm401440

As for why we don't have more awesome APUs: Intel isn't interested in them, while AMD has the foundation for producing them thanks to its graphics card branch. Also, APUs can cause problems precisely because the GPU is integrated into the CPU: overheating, BIOS corruption, OC problems, and so on. BTW, any decent GPU crushes integrated graphics at rendering, because that requires actual cores, even if it falls short in game performance.

Personally, I prefer paying less for a CPU without iGPU and more on a dedicated graphics card.

As for the other games I played, I can say Mass Effect Andromeda runs fine at medium details.
I ran Andromeda with my older Gigabyte GTX 750 on high without AA at 1600x900 with 50-60 fps. If you consider that unoptimized, I'd be speechless. I had 8 GB of DDR3 RAM then, and the game would use about 7.5 GB of it.
 


I'm very much aware that any decent graphics card would be drastically better, but that was not my point: a GTX 970 was and still might be considered a medium / high performance card; it's in a different league than any iGPU.

What's interesting to me is that we can have a decent 4-core / 8-thread CPU AND a decent low-end graphics "card" for a little more money than a similarly performing GT 1030 or RX 550 alone.

What came before the Ryzen APUs was more or less a choice between a strong CPU with terrible graphics (Intel) or a weak CPU with almost decent graphics (AMD). Now both parts are good, and for a reasonable price. THIS is what makes these chips relevant to me.

However, I checked a lot of reviews and benchmarks before buying: performance at 1080p looks nice enough and, from what I could see, the 2400G runs nearly on par with the current low-end GPUs. I never had one of those, but I can definitely say the Vega 11 side is leaps and bounds better than my old Radeon HD 7750.

As for Andromeda, I don't really know; I tried to play it on my old system at low / medium settings (Athlon X4 860K, 8 GB of 1866 DDR3, HD 7750, SSD and so on) and I couldn't even finish the tutorial. It was basically a slideshow crashing every few minutes. With Raven Ridge things got a little better, but I guess 8 GB of total memory was not enough, and that's maybe why with 16 GB I can finally play it without issues.
 


What's really compelling: a buyer of a budget starter system with a 2200G or 2400G will certainly find it fun for gaming at low settings. But once hooked, they'll be able to pop something like an RX 580 or even a GTX 1070 Ti into that empty PCIe x16 slot for a genuine upgrade, allowing high-to-ultra settings while not being limited by the CPU (at 1080p). And they don't have to feel like they were cheated in the process, because they don't have to remove and throw away one of those un-resellable 'white-box' cards bundled with those other budget systems.

That's a win-win-win scenario for customer satisfaction in my way of looking at it.

 
While the Raven Ridge APUs do have some disadvantages, they make great lower-end gaming machines. The biggest disadvantage is that dropping in anything more than a 1060 6GB will bottleneck the GPU... and it isn't the performance of the CPU that does it, it's the limit of half the PCI-E lanes. The on-chip Vega graphics take up 8 PCI-E lanes that would normally be routed to the GPU slot, so the design leaves only 8 lanes for what is normally a x16 slot. Putting in something like a 1070 or 1080 will result in significant performance loss compared to a card running on a Ryzen with the full 16 lanes available to the GPU.

Sadly, the Raven Ridge chips cannot be the basis of anything more than a mid-tier to upper-mid-tier gaming machine. They are great for getting your foot in the door or as a stepping stone to a higher-end system, but not ideal for someone looking to drop a 1080 Ti into the system in a year and have a high-end gaming rig.

All that said, I do love them. They are fantastic chips with good graphics onboard. They are great for budget builds, HTPCs, and low-to-mid-range gaming. The only drawback is that the lack of PCI-E lanes for graphics kinda sours the deal down the line. The CPU will long outlast the on-chip graphics, but you'll never get your money's worth out of anything above a mid-range graphics card of today.
 


I think it's pretty much been demonstrated that 8 PCIe lanes impose no meaningful impediment to GPU performance vs. 16.

One among many: https://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus
 


Actually, I've seen videos by Hardware Unboxed and (I think) TechYesCity that show a performance drop. They tested the 2200G against a Ryzen 3 1300X on various video cards and showed that you take a performance hit on anything over a 1060 if you use it with the APU. Can't for the life of me find it right now, but I know they made the point that the 1060 was the best choice with a minimal hit. Performance dropped on the APU by something like 12% on the 1070 or 1080.

It isn't just the 8 remaining lanes; the architecture is involved as well. I'm not discounting Gamers Nexus on this. Steve does good work, but he never tested this scenario.

Now, this was months ago and they may have improved whatever was causing this with updates, but I know for sure that the performance loss was reported as being there.
 


Are you kidding us or yourself?
The 16 GB of G.Skill TridentZ 3200 RAM (Hynix A-die, no problem @3200) alone costs around $200.
As you stated yourself, it would crash every 5 minutes with "only" 8 GB, and the iGPU would also be slower with single-channel and/or slower RAM.
Sure, it runs fine enough for 720p, but the cost is ridiculous.
 


Well, let's just say that if I had a beast like the 1080 Ti, I probably wouldn't go with an APU in the first place, but with something like an R7 or an i7 at the very least 😉
 


To contrast my point, the Ryzen 5 2600 would be a great CPU to drop a 1080 Ti next to and have a high-end gaming machine, and the 2600 isn't much more expensive (like $10 more, last I saw); you just need a graphics card, which drives up the initial price a bit.
 


Not really. True, this is a terrible time to buy fast RAM because of the inflated costs. However, I could have gone with something lower (2800, for example) and had only a tiny drop in performance. G.Skill's own Ripjaws V @3000 would generally work great and costs 25% less than my TridentZ.

As for the crashes, those were with one specific game (Andromeda). No issues with any other game I tried.

Just to say, here's a series of benchmarks done with the 2200G (slower than the 2400G on both the CPU and GPU side): https://www.youtube.com/playlist?list=PL5-PBA-wl9UtAe6lq8bGvUn_389xYnaJI

And here's a test of the same 2200G paired with 8 GB of 2400 MHz RAM in the Battlefield V beta: https://youtu.be/ba41of7B00o

I'd say 1080p performance is impressive for a $99 chip.
 


Just checked my local Amazon and the 2600 is actually 200€ against 160€ for a 2400G. Obviously the 2600 is on another level, and I'd say it needs at least a 1060 / 560 to show its potential.

To be honest, my initial plan was to throw in a 1050 Ti or an RX 560 or anything in that league down the line, but while I'm stuck with my current display there's really no need to add a graphics card. The iGPU suits me fine, and still it's good to know it will support a decent GPU if and when I need one.
 
Just checked my local Amazon and the 2600 is actually 200€ against 160€ for a 2400G. Obviously the 2600 is on another level, and I'd say it needs at least a 1060 / 560 to show its potential.
You should compare them like this:
1050 Ti/560
1060 3GB/570
1060 6GB/580
Not really. True, this is a terrible time to buy fast RAM because of the inflated costs. However, I could have gone with something lower (2800, for example) and had only a tiny drop in performance. G.Skill's own Ripjaws V @3000 would generally work great and costs 25% less than my TridentZ.
Another reason I recommend people embrace overclocking.
My first RAM kit: [memory benchmark screenshot]

My second set: [memory benchmark screenshot]

These benchmarks are old, though. I hadn't touched the tertiary timings or IO-L values back then, and I hadn't raised the I/O voltage. It should reach 80% by now.

BTW, I term it sadomasochism overclocking. Inflicting BSODs like a sadist and enjoying them like a masochist.
To be honest, my initial plan was to throw in a 1050 Ti or an RX 560 or anything in that league down the line, but while I'm stuck with my current display there's really no need to add a graphics card. The iGPU suits me fine, and still it's good to know it will support a decent GPU if and when I need one.
The problem is, I have dabbled in rendering/encoding material and 4K resolution. Once you get experienced with them, you start getting sensitive to artifacts and low-quality graphics. I would work overtime to prevent my OCD from giving me an eyesore.
 


Oh well! I might not go *that* far, but my first RAM purchase was a pair of 8 MB SIMM sticks for my almighty Pentium 133! 😀
 
Overclocking isn't free; you're paying more in electricity bills, and those accumulate over time.

Rough estimate: a computer consuming 400 W and working 8 hours a day would cost roughly $150 in electricity per year in the US of A. Overclocking adds another 60-100 W, which is roughly $30-40 per year, about 10% of the cost of a top-dollar gaming CPU.
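For anyone who wants to sanity-check those figures, here's a minimal sketch (in Python; the ~$0.128/kWh rate is my assumption, back-calculated from the $150/year figure above, not anyone's actual bill):

```python
# Rough yearly electricity cost: watts -> dollars per year,
# assuming 8 hours a day, 365 days, and an assumed ~$0.128 per kWh.
def yearly_cost(watts, hours_per_day=8, rate_per_kwh=0.128):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

print(round(yearly_cost(400)))                           # ~150: the $150/year baseline
print(round(yearly_cost(60)), round(yearly_cost(100)))   # ~22 and ~37: the extra 60-100 W from an OC
```

So the extra 60-100 W works out to roughly $22-37 a year at that rate, in the same ballpark as the $30-40 quoted above.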


 
We add 100 W for RAM, HDD, SSD, motherboard, efficiency losses and other rounding. (Based on your numbers, that's 12.8337 cents per kWh.)

4770K @ 3.5 GHz + 1060 6GB ref stock + 2x8GB RAM stock = 84 + 120 + 100 = 304 W = $114

4770K @ 4.6 GHz + 1060 6GB ref OC + RAM OC = 150 + 140 + 110 = 400 W = $150

30% faster CPU + 10% faster GPU + 30% faster RAM = 30% ($36) more power usage. It's worth the cost for me.
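The same back-of-the-envelope check applied to those two builds (again with the ~12.83 cents/kWh implied earlier; the function is just an illustration):

```python
# Stock vs overclocked build from the figures above: 8 h/day, 365 days, ~$0.1283/kWh assumed.
def yearly_cost(watts, hours_per_day=8, rate_per_kwh=0.1283):
    return watts / 1000 * hours_per_day * 365 * rate_per_kwh

stock, oc = yearly_cost(304), yearly_cost(400)
print(round(stock), round(oc), round(oc - stock))  # ~114, ~150, ~36 -> roughly 30% more
```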

Also, any good full-time job can easily earn you $30 in a day, or in an hour if you're doing specialized stuff. I earned $500 a month tutoring juniors back in university.
--------------------------------------------------------
BTW, I meant RAM OC, because Ryzen benefits from faster RAM and we were joking about RAM size too. It uses about 5-10 W more; that's $2-4 rounded in a year. 😉
 
It's Starbucks kind of money...

If you consider amortization & depreciation, the total cost of ownership of a home gaming computer is extremely low. The parts don't break if taken care of properly, and many of them can be sold after a 3-5 year timespan for roughly 60-70% of the original price. Even better if you buy used goods.

 
