$200 GPU face-off: Nvidia RTX 3050, AMD RX 6600, and Intel Arc A750 duke it out at the bottom of the barrel

I can't see buying any of these to play "real" games on. I would beg, borrow, or steal to get up to a 9060 XT 8GB or 5060 before considering something this low, or even stick to a console.
 
Yeah, the low-end GPU market is in horrible shape.
Adding AV1 encode/decode to an AM4 X3D system is very costly, unless you can find a cheap Arc A300-series card.
AMD really screwed the pooch with their RX 6400/6500 cards.
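If you do go the Arc route for AV1, here's a minimal sketch of what driving the hardware encoder looks like from Python via ffmpeg's QSV path (the file names and bitrate are placeholder assumptions, and av1_qsv needs an ffmpeg build with QSV support plus recent Intel media drivers):

# Minimal sketch: hardware AV1 encode on an Intel Arc card through
# ffmpeg's QSV encoder. Paths and bitrate are illustrative placeholders.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "qsv",     # decode on the GPU where possible
    "-i", "input.mp4",     # placeholder input file
    "-c:v", "av1_qsv",     # Arc's hardware AV1 encoder
    "-b:v", "6M",          # illustrative target bitrate
    "-c:a", "copy",        # pass the audio through untouched
    "output.mkv",          # placeholder output file
]
subprocess.run(cmd, check=True)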
 
I can't see buying any of these to play "real" games on. I would beg, borrow, or steal to get up to a 9060 XT 8GB or 5060 before considering something this low, or even stick to a console.
Be like me. My PC is far more for work, so it wasn't worth upgrading the PSU. But with the new Tomb Raider remake announced, I upgraded from an RX 6500 XT to an RX 6650 XT.
 
I didn't see where this review specified what hardware the cards were benchmarked with, which I feel is a serious oversight.

It was mentioned that the Intel card runs worse on older CPUs, and given that people looking at old gen GPUs are likely to also be using old CPUs, that could be a real issue.
 
This is a case where you should spend a little more to gain a lot more. Case in point: the RTX 5070 (which I just bought today) can be found for $550 at Best Buy and provides over twice the performance of the A750 while supporting DLSS 4 and other technologies that the A750, and even the RTX 3050, do not. It also has the power for years to come, all for $250 more, a very smart way to spend a little extra money and gain a lot of utility.

[Chart: average fps at 1920×1080]



Yes it sucks that the $300 market for new GPUs is pretty much a no-go zone, but we're also at the time when the $500 market has never given you more power.
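To put rough numbers on that, here's a quick back-of-the-envelope sketch. The $550 figure and the "$250 more" gap are from this post; the fps values are made-up placeholders, not benchmark results:

# Back-of-the-envelope price/performance. The A750 price is implied by
# the post's "all for $250 more"; the fps numbers are illustrative
# placeholders, not benchmarks.
cards = {
    "Arc A750": {"price": 300, "avg_fps": 60},   # hypothetical 1080p average
    "RTX 5070": {"price": 550, "avg_fps": 125},  # "over twice" the A750
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per average fps")

With these placeholder numbers the 5070 even comes out ahead per frame, which is the "spend a little more, gain a lot more" point.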
 
I didn't see where this review specified what hardware the cards were benchmarked with, which I feel is a serious oversight.

It was mentioned that the Intel card runs worse on older CPUs, and given that people looking at old gen GPUs are likely to also be using old CPUs, that could be a real issue.

We're currently on a temporary Core i9-13900K/Z790 testbed, which won't be the permanent foundation of our GPU testing going forward. To your point, though, it won't reflect the potential performance loss one might see pairing an Arc GPU with an older system.
 
I can't see buying any of these to play "real" games on. I would beg, borrow, or steal to get up to a 9060 XT 8GB or 5060 before considering something this low, or even stick to a console.
What would qualify as a "real" game?

If you're playing esports titles, you generally don't need a high end card to play at lower settings that most people would be using regardless of GPU specs. If you're playing supposed "triple A" titles, you would probably be lowering settings for a specific fps target you're comfortable with.

There are so many games between the least demanding and the most demanding titles that nothing is one-size-fits-all until you are buying a GPU at least as powerful as the RTX 5070 Ti. There is also a massive catalogue of older games that people can play on these cards with great performance. Not every new game is going to run like garbage either.

People get so bent out of shape over GPU performance in reviews, when half the point of PC gaming is the range of options, both in hardware on the market and graphics settings in games. Tweaking a game's settings to suit your PC is part of PC gaming culture.
 
Sold my Radeon RX 6600 XT 8GB for $50 AUD... Just wanted to get rid of it ASAP, as it was terrible for Oblivion Remastered.
The era of 8GB cards is over.

It was a great card when I bought it in 2021, four years ago during the crypto boom (using money from the sale of my RX 580 8GB as a cheap upgrade).

But its performance in newer games is a struggle.

But for budget systems? You are better off with an RTX 3050 if you need a low-profile, single-slot GPU.
Otherwise you can often find the RX 7600 8GB in the same price ballpark if you shop around for a sale... which would be my pick if you can't afford a 10GB/12GB/16GB GPU...
But there is another option.

The Radeon RX 6700 can be had fairly cheap second-hand, and it's a much better GPU.
 
With retail prices being what they are, it feels like sub-$300 is a place to shop used. 8GB of VRAM is the minimum to consider these days, since more games are listing that as the minimum instead of 6GB (I thought the 6GB barrier to entry would stick around longer).

For older systems I don't really think Intel is a viable choice, but anything 10th Gen/Zen 3 or newer ought to be fine. Used RDNA 2 is likely the best choice these days with the 6600-6700 cards.

If MSRPs were real, the B570/B580 would be the best budget choice, period.

If someone is looking for a more budget-oriented card that's going to last, I think the 9060 XT 16GB is worth saving for. FSR4 is actually usable at 1080p if needed, is pretty good at 1440p and above, and the card won't run into VRAM issues for the foreseeable future.
 
What would qualify as a "real" game?

If you're playing esports titles, you generally don't need a high end card to play at lower settings that most people would be using regardless of GPU specs. If you're playing supposed "triple A" titles, you would probably be lowering settings for a specific fps target you're comfortable with.

There are so many games between the least demanding and the most demanding titles that nothing is one-size-fits-all until you are buying a GPU at least as powerful as the RTX 5070 Ti. There is also a massive catalogue of older games that people can play on these cards with great performance. Not every new game is going to run like garbage either.

People get so bent out of shape over GPU performance in reviews, when half the point of PC gaming is the range of options, both in hardware on the market and graphics settings in games. Tweaking a game's settings to suit your PC is part of PC gaming culture.
Just to add to this, at the end of 2022 I bought an RX 6600 to use in a build for my mother, who found a copy of Dragon Age: Origins in a discount bin and thought it looked neat. She had been playing that, Dragon Age 2, and Dragon Age: Inquisition on her aging prebuilt "Home and Office" PC and its unremarkable 1080p60 monitor. She was absolutely floored by how well the RX 6600 ran those games, picked up a couple of other fantasy RPGs and bounced off them, then fell into Baldur's Gate 3 big time, which she thinks looks AMAZING.

The father of one of the friends I game with bought himself a 6600 as well in 2023(?) because he was into D&D WAY back in the day and wanted to play BG3, then got his brother one so they could play together.

Online tech discussion seems to either discount all this as "not real gaming", despite the fact that AMD sold all of us those video cards for the purpose of playing games, or insist that we all need at least an RTX 4070 Ti (and preferably more) and that all these systems are horribly under-specced and on the verge of obsolescence, which feels out of touch with the expectations of the people actually buying these cards. A game doesn't need to run maxed out with RT on at 1440p or 4K, and turning down some settings is fine as long as the game is fun. Why try to gatekeep PC gaming if people are having fun?

The RX 6600 still being $200 in 2025 is a bit disappointing, and you could do better on the used market... but a new card has known provenance, regular availability, and ships quickly from familiar retailers. Enthusiasts on a budget may enjoy the challenge of trying to min-max every dollar, but not everyone who wants to play games wants to do that.
 
The guy above has a point. My first few graphics cards were, relative to their time, much less capable than these. And before that I had a few PCs that ran off motherboard graphics.
The "crappy" cards in this comparison will play any game, at low settings at worst. They all do the job. Ten years ago you could not say the same about low-end cards; 5 fps on lowest settings was a thing.
 
I can't see buying any of these to play "real" games on. I would beg, borrow, or steal to get up to a 9060 XT 8GB or 5060 before considering something this low, or even stick to a console.
An RTX 3050 or RX 6600 with 8GB is enough for plenty of 1080p gaming. Paying $220 for it is a slap in the face, though, with the 9060 XT 8GB or 16GB yielding better price/performance at MSRP if you shell out the extra cash. "New" GPUs that are out of production don't make much sense anymore; the used market is where to penny-pinch.
 
"The RX 6600 is also the single most popular Radeon graphics card in the Steam hardware survey, so we’re surprised this issue is still around."

I see this a lot, and there's got to be a better way to word it. Calling a graphics card "popular" on Steam's list makes it sound like a card that was specifically sought out, when more than likely it was simply the best option available to the buyer.
 
all for $250 more, a very smart way to spend a little extra money
It's refreshing to read a review about more modestly priced GPUs after the RTX 5090 and RTX 5080.

I have friends in Third World countries who don't earn $250 per annum, which is why I gift them second-hand laptops. They get paid in kind or barter farm produce, so $250 cash is an absolute fortune.
 
But for budget systems? You are better off with an RTX 3050 if you need a low-profile, single-slot GPU.
As per the comparison, why would you prefer a slower card? I'd recommend the RX 6600 if buying new, any time of day. Low profile? Sure, pick the 3050. But for everything else, pick the Radeon. Budget gamers need raster performance per dollar first and foremost, and the GeForce just doesn't have it (the Arc does, but its roughly 200W draw may be too much for budget PSUs).
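On the PSU point, a rough headroom check makes it concrete. This is a sketch with illustrative wattages (225W is Intel's listed TBP for the A750 and 132W is AMD's for the RX 6600; the CPU, overhead, derating, and PSU figures are assumptions for a generic budget build):

# Rough PSU headroom check. 225W is Intel's listed A750 TBP, 132W is
# AMD's RX 6600 figure; the rest are assumptions for a budget build.

def headroom(psu_w: float, cpu_w: float, gpu_w: float,
             other_w: float = 75.0, derate: float = 0.8) -> float:
    """Spare watts after derating a cheap PSU to 80% of its label."""
    return psu_w * derate - (cpu_w + gpu_w + other_w)

for name, gpu_w in (("RX 6600", 132), ("Arc A750", 225)):
    print(f"{name}: {headroom(psu_w=450, cpu_w=65, gpu_w=gpu_w):+.0f} W")

On a hypothetical 450W budget unit, the RX 6600 leaves roughly 90W of spare capacity while the A750 already exceeds the derated budget, which is exactly why the Arc's power draw matters in this class.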
 