Radeon R5 230 or R7 240 versus Intel graphics

PC-Cobbler
Jun 12, 2015
I have some Sandy Bridge systems whose graphics I'm thinking of upgrading. These will not be gaming systems, though the user might play a game now and then. I'd appreciate comparisons between Sandy Bridge integrated graphics (HD 2000 or HD 3000) and AMD's Radeon R5 230 / R7 240. Please include technical details: 128-bit vs. 64-bit memory bus; video card memory; WEI rating; OS (these are running Windows 7); any third-party benchmarks. Thanks in advance.
 
While I can't provide any technical data on any of the aforementioned GPUs, I can tell you that both the R5 230 and the R7 240 will perform very poorly in games. Neither of them performs much better than Intel's iGPU. A GTX 750 or GTX 750 Ti would be a much better option.
 



Thanks for the reply, but a 750 would cost substantially more money, not to mention that it would be a double-slot card, something these PCs cannot accommodate.
 
The thing is, that class of video card was meant for people who didn't have integrated graphics and couldn't justify spending the money on a gaming card. If you already have integrated graphics, there's no point in buying a card like that. In the Radeon 200 series, gaming performance doesn't start until the R7 250X.

If your integrated graphics doesn't do what you need it to do then you need to be more specific about your needs. I've had this conversation in real life:
"I need an upgrade of my graphics"
"You have integrated graphics, so then you need a gaming card?"
"No, I'm not really going to be gaming"
"Then just use the integrated graphics."
"But I need an upgrade."
"For what, games?"
"No games"
Repeat until insanity sets in. Spelling out your needs avoids this.
 
That was a good answer. I did do the necessary research, but I didn't state it before.

The PCs have only one free slot, a PCIe 2.0 x16. The slot next to it is occupied by a USB 3.0 card that cannot be removed, so I'm constrained to cards that occupy a single slot. That eliminates the GT 740 and beyond. Not to mention that serious gaming cards (PCIe 3.0) often have trouble in PCIe 2.0 slots.

The users would like to play *some* games, but they were informed of the two constraints: cost and the slot restriction. They understand that they will not be playing games at high settings.

I know that some NVIDIA cards would meet the requirements, as a number of single-slot GT 730 cards are still available. But I am unfamiliar with AMD's Radeon line and wanted to see what the comparable cards were. Radeon HD 6450 and R5 230 cards are still available in single-slot versions, but I suspect their performance is not acceptable. One or two single-slot Radeon R7 240s are offered, but all R7 250s and beyond are dual-slot.

Sandy Bridge HD 3000 graphics give a WEI of 6.2/6.2. I cannot remember the exact WEI for the Radeon HD 6450, but I think it was in the low 6s for gaming and the 4s for desktop. An old GT 430 gives a WEI of 6.7/6.7, which would be a significant improvement over either Intel HD 2000 or 3000. It would be useful to have the WEIs for the R5 230 and R7 240, hence why I asked.
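As an aside, on Windows 7 the stored WEI sub-scores can be read without opening Control Panel. Here is a minimal sketch that shells out to the stock wmic tool and queries the Win32_WinSAT class; it assumes a formal WinSAT assessment has already been run on the machine (otherwise the scores come back as zero):

```python
# Read the stored Windows Experience Index sub-scores on Windows 7 by
# querying the Win32_WinSAT WMI class through the built-in wmic tool.
# Assumes a formal assessment has already been run ("winsat formal");
# otherwise the scores come back as 0.
import subprocess

def wei_scores():
    output = subprocess.check_output(
        ["wmic", "path", "Win32_WinSAT", "get",
         "CPUScore,D3DScore,DiskScore,GraphicsScore,MemoryScore,WinSPRLevel",
         "/format:list"],
        universal_newlines=True,
    )
    scores = {}
    for line in output.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            scores[key.strip()] = float(value)
    return scores

if __name__ == "__main__":
    for name, score in sorted(wei_scores().items()):
        print(f"{name}: {score}")
```

D3DScore is the "gaming graphics" number and GraphicsScore is the "desktop graphics" number, i.e. the two figures quoted above.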

I think I'll just compare graphics clocks, 64-bit vs 128-bit, and graphics memory.

UPDATE: By searching on different keywords, I found a Tom's thread that included opinions that the R7 240's WEI is near 6.9 with DDR3 and 7.1 with GDDR5. And game-debate.com stated: "[Intel Graphics 3000 offers] performance similar to a dedicated GeForce GT 610 ... Radeon R5 230 is a direct re-brand of Radeon HD 6450 v3. Gaming benchmarks put its performance around [that] of a GeForce GT 420." So the R5 230 is about the same performance as Intel HD 3000, and the R7 240 is 0.7-0.9 better on WEI.
 



I agree that WEI is not as good a metric as some others; knowing the fps from a specific benchmarking utility would be better. However, you'll notice that no one volunteered any fps data. And WEI is often mentioned in customer reviews, so it can be found via searches.
 
Here's a single-slot R7 350 at Newegg. You don't have to shop at Newegg, of course; this is just to show you what's available. This card will do better than the lower-end 2xx AMD cards. It has a 128-bit memory bus, so it would be an even better choice than the Nvidia 64-bit GDDR5 cards. Fifty dollars is about as cheap as a gaming card gets.
 
I suggest throwing the spec sheet out the window, along with most synthetic benchmarks. Unless you know in depth what you are looking at, comparing specs is more misinformation than anything else, since specs can't be compared across different architectures. "128-bit is better than 64-bit" is correlation, not causation: you can see a 64-bit GDDR5 GT 730 beat a 128-bit GDDR5 R7 240 by quite a bit. Designers put in a bus just wide enough for what the GPU needs, and different memory-management algorithms affect how that bandwidth is used. For giggles, a 256-bit 9800 GT loses to the 64-bit GT 730. Clocks can't be compared either, because different cores do different amounts of work per cycle (same for CPUs): 384 Kepler cores actually lose to 192 Fermi cores (an older architecture, at that), so you can't go by core count or by which GPU is newer. I think you get the point. Microsoft removing WEI in 8.1 and later was one of the best things they did; I never saw a professional review quote WEI scores.
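To put rough numbers on the bus-width point, here is a quick back-of-the-envelope sketch of theoretical peak memory bandwidth; the effective transfer rates are typical retail figures and vary between specific boards, so treat them as assumptions:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x (effective
# transfer rate). The transfer rates below are typical retail figures
# (assumptions), not guaranteed for any particular board.

def bandwidth_gb_s(bus_width_bits, effective_rate_gtps):
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return (bus_width_bits / 8) * effective_rate_gtps

cards = {
    "GT 730 (64-bit GDDR5)":  (64,  5.0),   # ~5.0 GT/s effective
    "R7 240 (128-bit DDR3)":  (128, 1.8),   # ~1.8 GT/s effective
    "R7 240 (128-bit GDDR5)": (128, 4.6),   # ~4.6 GT/s effective
}

for name, (bus_bits, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus_bits, rate):.1f} GB/s")

# GT 730 (64-bit GDDR5): 40.0 GB/s
# R7 240 (128-bit DDR3): 28.8 GB/s
# R7 240 (128-bit GDDR5): 73.6 GB/s
```

Note that the 128-bit GDDR5 R7 240 has more paper bandwidth than the 64-bit GDDR5 GT 730 yet still loses in games, which is exactly why bus width and bandwidth alone don't settle the comparison.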

http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html Tom's has this easy-to-read hierarchy chart based on average performance across a number of tested games. Very convenient. It's missing some rebranded cards like the GT 610, but that one is on there as the GT 520. TechPowerUp has nice graphs and compares different resolutions too, but you don't see low-end card reviews there very often.

That R7 350 has a single-slot bracket but a dual-slot cooler, which will not work. Many single-slot GPUs have stopped being produced, so a GT 730 is the best I could find right now.
 
Solution



Thanks a bunch for the chart reference. That is a great resource for the reasons you mentioned and exactly what I hoped to find. Only a magazine would ever have a large number of video cards to test. It is missing NVIDIA's GT 420, but that's probably about the same as the GT 520, which is included.

Originally I wanted to solve a specific problem, that of updating some Sandy Bridge PCs, but this thread ended up being educational. Thanks to everyone who tried to help. If anyone wants to add more comments, I will be happy to read them, but I'm marking this thread as answered.
 



Yes, I saw that. That's actually the card that first got me thinking about removing the fan and housing and using the case fan to cool it. I'm still on the fence about it because the Rx 300 series is only 1.5 years old, while the Rx 200 series is more than 3 years old -- actually, since those are rebadged HD 7000-series parts, they're 2011-12 vintage, the same as Sandy Bridge -- meaning the former has a much greater chance of being incompatible with Intel 6-series boards. Every generation adds that much more risk.
 



The literature is confusing. I read through what AMD has and then perused the relevant Wikipedia articles, especially "AMD Radeon Rx 300 series," but somehow I missed that the Rx 300 series uses the Graphics Core Next microarchitecture, which launched in 2011. So the R7 300 series is simply a continuation of the R5 2xx and R7 2xx and should be okay. As I said at the very beginning, this is my first foray into AMD products. That R7 350 is cheap enough to take a risk on. Thanks for the pointers.
 
GCN has different generations with varying degrees of performance improvement. The most recent generation is quite a bit better, similar to a new architecture. This goes back to not getting too bogged down in specifics: if you don't know what you're looking at, you'll trick yourself with misinformation. The Rx 300s are almost all rebranded Rx 200s, though.