Nvidia RTX 3050 vs AMD RX 6600 faceoff: Which GPU dominates the budget-friendly $200 market?

'We're calling pricing a tie since neither GPU costs significantly less than the other. Of course, the AMD GPU was already declared the winner in performance, meaning it's a better value overall at the same price, but we already gave AMD credit for the performance category.'

I'm not one to go all fanboy over faceless corporations, but I can't understand the reasoning used in parts of this comparison. The 3050 was bad value from the moment it was released, and it's still bad value now.

The above part sums it up for me: why the aversion to giving credit to AMD when the performance of the 6600 overall is at times vastly better than the 3050 for a similar or even lower price? Because 'that wouldn't be fair to the 3050' is what it reads like to me. It makes no sense. Any extra 'features' or 'technology' the 3050 has can't make up for that level of discrepancy. You don't seem to have a similar problem with certain other product comparisons. Not to mention that the performance difference between the two cards may equate to going from a jittery mess on one to relatively smooth on the other.

I hope you're not using the overall geomean result as justification for making price a tie - in general, ray tracing performance is too low to be considered worthwhile on either of them. You mention this in the article, notably with the 3050 being 72% faster at 4K; a lot of that can be attributed to just Diablo 4, but it seems the performance margin is still being counted as valid. These are NOT 4K cards, even with rasterisation.
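On geomeans in general: a geometric mean is less swayed by a single outlier title than a straight average, but one big outlier still drags it up. A quick sketch with entirely made-up FPS numbers (nothing here is from the article):

```python
from math import prod

# Hypothetical 4K FPS results (illustrative only, not from the article).
fps_a = [30, 28, 25, 27, 60]   # card A: one outlier title inflates its result
fps_b = [29, 27, 26, 28, 25]   # card B: consistent, no outlier

def geomean(xs):
    """Geometric mean: nth root of the product of n values."""
    return prod(xs) ** (1 / len(xs))

def mean(xs):
    """Plain arithmetic mean, for comparison."""
    return sum(xs) / len(xs)

# The outlier moves the geomean less than the mean, but it still moves it.
print(f"arithmetic mean: A={mean(fps_a):.1f}, B={mean(fps_b):.1f}")
print(f"geometric mean:  A={geomean(fps_a):.1f}, B={geomean(fps_b):.1f}")
```

With these made-up numbers, card A's arithmetic mean (34.0) overstates its typical result far more than its geomean (about 32.1) does, but both still sit above card B, which is why one title like Diablo 4 can tilt an overall verdict.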

---

I've just read all that back to myself and realised I've gone off on a rant. And I don't care. I can't help it: I honestly think the 3050 is one of the worst value products Nvidia has ever spawned, and hate that so many people have blindly flocked to it. It's nonsense like that which enables companies to justify higher prices.
 
I get it, and I'm in agreement with you: the RTX 3050 was always a bad value proposition, and that hasn't changed just because it's now tied in pricing with a card that is still generally faster where it counts. Sure, you can talk about ray tracing all day long, but for cards of this class it doesn't matter; they both suck at it and aren't really usable for that workload. This one should have gone to the 6600 without any reservations, since the people buying these cards are generally just looking to get into 1080p gaming, and neither card can really ray trace. These are the charts that really matter, and almost across the board, with very few exceptions, the RTX 3050 gets its teeth kicked in. In some cases it ties, but even those are few.

[performance chart]

[performance chart]
 
I'd have to say the unmentioned $200 A750 wins this round.
From 2022:
[2022 benchmark chart]

[2022 benchmark chart]

And after 2 years of driver improvements it isn't as close anymore.
Yep, I recently picked up a refurbished Arc A770 16GB for $210 with taxes, shipping, and a 2-year warranty included. Other than definitely needing Resizable BAR and Above 4G Decoding enabled, it's been a decent card (some games were a stutter fest without those). I still can't recommend Arc to brand-new first-time builders, because you definitely need a board that supports those features, and you need to know to enable them. But for experienced builders, yeah, they're a very viable option now, and very well priced. The Arc control center is still buggy, but hey, at least it generally loads now.
 
I still have two lingering issues with Arc.

HDCP at higher resolutions causes the screen to go completely black at every screen change (i.e. tabbing, resizing, etc.), and briefly at lower resolutions at the beginning of content. So any time a copyrighted advertisement or piece of content is playing in a browser, it can do that.

Failing to reconnect to audio over HDMI after the display has been off. This used to be a constant problem; now it only happens every once in a while. It still means going into Device Manager to kill the display audio driver and restart it.

I can't fault the gaming performance of even an A380 in relatively modern games. It still struggles with older DX titles, though, with lots of stuttering.

You are tempting me to pick up an A770 myself. Kind of want to see if I can get an Intel edition one, just to have.
 

If you were interested, this is the one that I picked up. It's an Acer BiFrost from Acer's eBay refurbished store; it comes with a 9% off coupon, and I found a 10% off coupon on top of that. They seem to refresh the stock pretty regularly, so if it runs out, they'll have more soon.

Refurbished Acer Bifrost A770 16GB
 
Could you be more specific? I haven't seen that issue with my A750, but apparently I don't watch a lot of copyrighted content. I watch movies from those free streaming services and sometimes rent one on a whim, plus YouTube and other website stuff, but I dropped Netflix and Prime. Maybe it's that my A750 is hooked up to a Samsung 4K TV?

Lately I've been messing with Lossless Scaling, using the triple-refresh mode on my TV that does 120 Hz. I'm not a fan of the blurring you see around a third-person player character during fast motion, and I can't quite dial in the RivaTuner refresh limit perfectly to completely eliminate tearing, but there aren't a lot of other artifacts. It looks better in first person. It's also not as smooth as locked vsync, but it seems smoother than variable refresh.

Really, I'm saving my irresponsible spending for Battlemage. I hope a high-end one still comes in a 2-slot card.
 
Windows 10 still, HDMI to a 4K TV. If I run it at 4K, the black screen problem appears constantly whenever something with HDCP is playing. It's fine if you just watch statically, but every time an advertisement plays, or any time the aspect ratio changes, it will go black again. Minimizing or maximizing a video will also do it.

At 1080p it does far better, but it will still black out on the likes of a YouTube movie. Usually only once or twice, and for seemingly less time (that might just be draw time, with the scaler running a little faster at 1080p).

It's been a while since I've tried a new driver version. The last few made it far worse, so I had to painstakingly reinstall a slightly older version. The Arc driver install still leaves a lot to be desired; generally I have had to extract the files myself to get anything to work. Letting it download from the Arc control center seems like a failing proposition: it just wastes time downloading, then fails silently.

My laptop also has Iris Xe, and it suffers from similar black screen problems when using the integrated chip. Luckily it has the lowest-tier Nvidia card, and I just run my browsers through that, which solves the problem.
 
I can't replicate it on my A750. I tried that new Fallout show on Amazon at highest quality, 4K on YouTube, and John Wick 4 two weeks ago as a 4K rental, and I minimized and maximized it a couple of times. I also tried alt-tabbing in and out; no display issues there either.
I use Edge, by the way, so maybe that makes a difference? I also have no issues using the integrated graphics on my 13600K doing the same thing.
But I'm on a fresh Windows 11 install with no modifications. I have set a couple of custom resolutions with the Intel Graphics Command Center Beta from the Windows store, but that is about it.

This is also the first I've heard of Intel having video display issues; they've always been rock solid for me. Even my old Windows 10 32-bit Atom tablet (2GB of RAM, 2W) still works great for video, though that's about it; it's slow as a rock at everything else. Well, I do have an issue going from 4K60 to 1440p120 where the TV sometimes needs to be told to use high bandwidth so I can run 120 Hz, but I think that is on the TV, because it happened with AMD as well.
 
This was better than the comparison between the 7900 GRE and the 4070 that they did recently. The 7900 GRE beat it badly in pretty much every title, yet they gave three other categories to the 4070 and declared it the winner. It cost the same money and performed 20-40% worse in gaming, and it was the "winner".
Whoever said the 7900 GRE does not destroy the 4070 is the most untruthful and dishonest journalist on the internet.

Pretty much the same can be said about the 6600 vs the 3050: 30% is a whole tier above. Even the 3060 won't beat the 6600. TH is just an Nvidia-sponsored vendor website, and that's it.
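As an aside on those percentages: "A is 30% faster than B" and "B is 30% slower than A" are not the same claim, which is worth keeping straight when comparing write-ups. A quick illustration with made-up FPS numbers:

```python
# Illustrative helpers (hypothetical, not from the article or forum posts).

def faster_pct(fps_a, fps_b):
    """How much faster A is than B, in percent."""
    return (fps_a / fps_b - 1) * 100

def slower_pct(fps_b, fps_a):
    """How much slower B is than A, in percent."""
    return (1 - fps_b / fps_a) * 100

# Example: card A averages 130 fps, card B averages 100 fps.
print(faster_pct(130, 100))  # 30.0 -> A is 30% faster than B
print(slower_pct(100, 130))  # ~23.1 -> B is only ~23% slower than A
```

So a card that is 30% faster makes its rival about 23% slower, not 30%; the bigger the gap, the further the two figures drift apart.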
 
You must have read a different article than I did.

https://www.tomshardware.com/pc-com...eoff-which-mainstream-graphics-card-is-better

Performance Winner: Tie

Overall, it's a close battle between these two GPUs. The gaming performance gap is less than 5% even at 4K — it's generally less at 1440p and 1080p. If you only care about native rasterization performance, AMD gets a clear win here, but ray tracing in games continues to become more prevalent and Nvidia has strong AI performance as well. We also have to consider upscaling, where DLSS is more widely supported and delivers better image quality than FSR 2/3. Factoring all of those things, we can only conclude that across a wide selection of games, both GPUs deliver a similar experience and are a great choice for 1440p gaming.

Price Winner: Tie

Considering this is a highly contested market segment, it's not surprising that both AMD and Nvidia deliver a similar overall value and experience. There's been some fluctuation over time, with official and unofficial price cuts giving one or the other company a short-lived advantage, but it's difficult to argue against either card.
 
The 6600 seems to be a clear winner against the 3050.
In this price range, raw performance is far more important than anything but price.
You can't really use ray tracing on either card, and neither DLSS nor FSR is great at 1080p.

I am curious to see whether an Intel card would soundly beat both of these cards, or if there are still too many driver issues.
 
Seems like for $200, you get the best bang for the buck on the used market.

A used 5700 XT is still my recommended choice at $150 to $200 CAD.
It's sad that the price-to-performance ratio has barely moved in the last 5 years.
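For anyone who wants to put numbers on that, dollars per frame is a quick way to eyeball value. The prices and average FPS below are entirely hypothetical placeholders, not benchmark data:

```python
# Hypothetical CAD prices and average-FPS figures (illustrative only).
cards = {
    "used 5700 XT": {"price_cad": 175, "avg_fps": 70},
    "RX 6600":      {"price_cad": 270, "avg_fps": 75},
    "RTX 3050":     {"price_cad": 270, "avg_fps": 58},
}

# Lower cost-per-frame = better value.
for name, c in cards.items():
    per_fps = c["price_cad"] / c["avg_fps"]
    print(f"{name}: {per_fps:.2f} CAD per fps")
```

With numbers like these, a cheap used card can undercut both new budget cards on cost per frame even when its raw performance is similar, which is the whole "bang for the buck" argument.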
 
The Arc issue with sound may come from the audio system itself. Here I use the iGPU and a discrete graphics card at the same time: Intel as the primary and Nvidia as the secondary GPU.
I have a Creative sound card in the system...
When I enable the Realtek audio in the BIOS, the DisplayPort and HDMI outputs carry digital audio; disabling the Realtek audio disables the digital audio from those ports.

Maybe Arc GPUs don't work very well with AMD motherboards because of how Intel integrated the audio system directly into the chipset itself, or even the CPU. :)

Running the iGPU alongside the GPU saves about 10W of power at idle; in GPU load scenarios it adds about 3W. :S
 

I actually suspect many of the lingering issues are a Windows 10 problem; my install also started life as a Windows 7 install. At this point it would be silly of them not to focus on Windows 11 compatibility.

Edge and Chrome should be functionally identical when it comes to video playback, and I certainly experienced it on my laptop in both Edge and Chrome. That's an i7-1165G7 with the earliest Xe cores.
 
Seems to me that TH is becoming less and less of a real hardware site; the quality of the articles is questionable, and news stories tend to be more about controversial clickbait headlines than quality reporting... pity.