$200 GPU face-off: Nvidia RTX 3050, AMD RX 6600, and Intel Arc A750 duke it out at the bottom of the barrel

If you can’t spend more than around $200 on a discrete GPU, you have three choices these days. We dug in to see if any of them are worth your hard-earned cash.

Better to go used. You can get an RTX 2070 8GB on eBay for around $175, or an RTX 3060 12GB for around $230. Both perform much better than the new cards reviewed here.

Buy from someone who allows returns in case the used card doesn't work. I have bought numerous used GPUs from eBay, and they all worked.
 
Here in Brazil the kings of graphics cards are the RX 580 and the GeForce GTX 1060.
The GeForce RTX 3060 12GB is still selling very well.
Intel graphics haven't reached this market.
AMD graphics are still priced at a premium.

The average build is an old Xeon 2650 v3 or v4 paired with an RX 580.
 
As per the comparison, why would you prefer a slower card? I'd recommend the RX 6600 if buying new, any time of the day. Low profile? Sure, pick the 3050. But for everything else, pick the Radeon. Budget gamers need, first and foremost, raster performance per dollar, and the GeForce just doesn't have it (the Arc does, but 200W may be too much for budget PSUs).
As per my comment, if you can find a single-slot, low-profile Radeon 6600 XT... I will eat my hat.

Some of the best super-budget gaming rigs are the old Dell/HP workstations you grab from eBay and just drop a cheap GPU into. Unfortunately, more often than not they only take half-height cards... The 6600 XT just won't drop into those rigs.
 
The 6600 doesn't even win at native resolution vs the A750. Why recommend something that is worse at everything but power draw? Also, you would need a sub-300W power supply for the Arc to be a problem. Do they even make those?

Edit: The 9000 series has good ray tracing and AI upscaling, and UDNA is doubling down on this. The 6600 may as well have 4GB of VRAM at this point because, unlike the A750 and the 3050, its performance in new games is going to be kneecapped by mandatory ray tracing and AI upscaling. It was good for its time, but AMD has decreed that its time has passed.
 
Intel's Arc A750 needs a whopping 225W to deliver its strong gaming performance, nearly 100W more than the RX 6600. That's 70% more power for just 6% higher performance at 1080p, on average. Worse, Intel's card also draws much more power at idle than either the RX 6600 or RTX 3050 unless you tweak BIOS and Windows settings to mitigate that behavior.

That kind of extra power draw for that minuscule performance benefit is nothing to just brush off as meaningless.
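
To put rough numbers on it, here's a quick back-of-the-envelope check (assuming Intel's 225W and AMD's 132W board-power specs; actual measured draw will vary by board and game):

# Rough power-vs-performance check for the A750 vs. RX 6600 claim.
# 225 W and 132 W are the vendors' stated board-power figures (assumed, not measured).
a750_tbp_w = 225
rx6600_tbp_w = 132

extra_power_pct = (a750_tbp_w - rx6600_tbp_w) / rx6600_tbp_w * 100
print(f"A750 draws ~{extra_power_pct:.0f}% more board power")        # ~70%

# The article puts the A750 roughly 6% ahead at 1080p on average.
perf_ratio = 1.06
perf_per_watt_ratio = perf_ratio / (a750_tbp_w / rx6600_tbp_w)
print(f"A750 perf/W is ~{perf_per_watt_ratio:.2f}x the RX 6600's")   # ~0.62x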
 
I can see why you think the A750 runs at 225W. That's what TechPowerUp says, but it isn't true: the default TDP of the A750 Limited Edition (the reference model) is 190W.
Also, Arc's TDP isn't comparable to AMD's or Nvidia's, where the cards are limited by it all of the time. With Arc, at least for the first two generations, it is a worst-case ceiling, and you will only see that power draw in synthetic benchmarks like Time Spy.

The Arc software has a TDP slider that goes down to 95W in the case of the A750, and at 130W the performance drop in games isn't that much; it's bigger in synthetics.
In Time Spy at default settings my A750 got 78.77 and 71.36 fps in graphics tests 1 and 2, respectively. At 130 watts the fps dropped to 66.13 and 59.72. That is the worst-case scenario, though. In CP2077 at 1440p ultra, with all game-chosen settings including the game-selected XeSS Quality, the in-game benchmark gave 66.52 fps at 190W and 58.40 fps at 130W.
With Doom: The Dark Ages at 1080p, ultra nightmare settings, XeSS Performance, in the first canned benchmark location (Hebeth), 190W gave 60.06 fps and 130W gave 57.95 fps. All fps figures are averages. Doom runs at about 145W with uncapped framerates at stock GPU settings, by the way.

Not that you would have any way to know this given the limited coverage of the A750, but it isn't that far off the RX 6600 in gaming efficiency.

Idle power is bad and a bit of a hassle to fix, but plenty of people accept a similar idle-power deficit on every Ryzen desktop CPU, and I hear those are still pretty popular.
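
For what it's worth, here is the rough math behind those numbers (a small sketch using the fps figures quoted above; 190W and 130W are the GPU power-limit settings, not measured board power):

# Fps lost when pulling the A750's power slider from 190 W down to 130 W,
# using the numbers quoted above (190/130 are GPU power-limit settings, not wall power).
results = {
    "Time Spy GT1":       (78.77, 66.13),
    "Time Spy GT2":       (71.36, 59.72),
    "CP2077 1440p ultra": (66.52, 58.40),
    "Doom TDA 1080p":     (60.06, 57.95),
}

print(f"Power-limit cut: ~{(190 - 130) / 190 * 100:.0f}%")           # ~32% less GPU power
for name, (fps_190w, fps_130w) in results.items():
    drop_pct = (fps_190w - fps_130w) / fps_190w * 100
    print(f"{name}: -{drop_pct:.1f}% fps")                           # synthetics drop ~16%, Doom ~3.5%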
 
Two questions, then:
  1. Where is this information coming from about Arc's TDP (or TBP?) being different from Nvidia's or AMD's?
  2. Where is that 190W number coming from? I just did a brief search, and Intel specifically states 225W (though the Arc A750 product page I found didn't list specs at all).
In fact, if you hit the little question mark icon, it brings up a modal window specifically stating:
Total Board Power (TBP) represents the total power draw of a graphics card or other add-in card in watts, when it is operating under a typical load such as a gaming workload. The TBP value corresponds to Intel’s reference design. Intel’s partners may choose to productize Intel-based solutions with higher TBP values.
(emphasis mine)
 
1. You misread what I wrote on this one. I have cards from all three vendors and I see how they behave. In most games, Nvidia and AMD cards will max out their power limit when fully utilized. That is why AMD cards generally gain performance when undervolted, and Nvidia cards when undervolted via the voltage/frequency curve. This is well established. Intel cards are rarely power-limited and gain performance when overvolted. Overvolting is so alien to AMD and Nvidia GPU users that you must have thought I meant something different. I did not say that TDP is a different kind of measurement; I meant that it is applied differently. Intel's TDP is set way too high for most uses, though some synthetic benchmarks do hit it.
2. The 190W number came from the TDP, just as I stated. It is listed in my Arc OC software and in GPU-Z under current and default power limit, but it refers only to the GPU chip. I assumed that others were using TDP as well, which was my error. There were also a number of stories three years back claiming that AMD was misrepresenting its numbers for that generation, like this one: https://www.igorslab.de/en/graphics...with-nvidia-and-nearly-impossible-with-amd/5/ which may or may not have been correct.

But since it seems I may have been incorrect in my initial assessment, and since it is an easy problem to fix: 225W - 190W = 35W, and 130W - 35W = 95W. The 35W is conservative, since the power delivery and cooling fans draw less power as the chip draws less, but I can still go 35W below the roughly 130W used by the 6600 and limit the core of my A750 to 95W, since the slider in the software goes that low. That way the A750's 95W core plus the rest of the board at under 35W should come in below the 6600's stated 132W TBP.
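
Spelled out as a rough calculation (the ~35W figure for the non-GPU board components is just my estimate from the TBP/TDP gap):

# Budgeting the A750's GPU power limit so total board power lands under the RX 6600's TBP.
a750_tbp_w = 225       # Intel's stated total board power
a750_gpu_tdp_w = 190   # default GPU power limit shown in the Arc software / GPU-Z
rx6600_tbp_w = 132     # AMD's stated total board power for the RX 6600

board_overhead_w = a750_tbp_w - a750_gpu_tdp_w   # ~35 W for VRAM, VRMs, fans, etc.
gpu_limit_w = 130 - board_overhead_w             # aim for ~130 W total -> 95 W GPU limit
total_estimate_w = gpu_limit_w + board_overhead_w

print(f"Overhead ~{board_overhead_w} W, GPU limit {gpu_limit_w} W, "
      f"estimated board power ~{total_estimate_w} W vs {rx6600_tbp_w} W for the 6600")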

Time Spy at a 95W TDP gave a graphics score of 7858, which is a bit less than the 8071 in Guru3D's 6600 review, so I increased the performance boost slider in the Arc OC panel and got an 8497 graphics score at 95W. (The fps were 56% higher at stock than at 95W, and 45% higher at stock than at 95W with the OC, so the most power-hungry workload suffered the most from the lowered power limit.)

In CP2077 at 95W, 1440p ultra, canned benchmark, the A750 got 46.75 fps, and with the OC it got 51.03 fps at 95W. I don't know what the 6600 gets at these settings, but I do know that the reference A750 gets 39% more fps than the 6600 at 1440p here: https://www.techpowerup.com/review/intel-arc-a750/11.html and that on my PC the stock reference A750 gets 42% more fps than the 95W A750 and 30% more fps than the 95W OC A750, so the 6600 would land in between those two.
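
Working backwards from those percentages, roughly (this assumes TechPowerUp's scaling carries over to my system):

# Bracketing where the RX 6600 lands in the CP2077 canned bench, from the ratios above.
a750_95w_fps = 46.75      # my 95 W result
a750_95w_oc_fps = 51.03   # my 95 W + boost-slider result

stock_fps_est = a750_95w_fps * 1.42      # stock is ~42% faster than the 95 W run (~66.4 fps)
rx6600_fps_est = stock_fps_est / 1.39    # reference A750 is ~39% faster than the 6600 (TPU)

print(f"Estimated RX 6600: {rx6600_fps_est:.1f} fps "
      f"(between {a750_95w_fps} and {a750_95w_oc_fps})")             # ~47.8 fps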

In Doom: The Dark Ages at 1080p ultra nightmare with XeSS Performance, 95W dropped the Hebeth benchmark to 52.49 fps, and the 95W OC brought it to 55.90 fps. The stock A750 is only 14.4% faster than the 95W run and 7.5% faster than the 95W OC run. To compare the 95W TDP (130W TBP) A750 to the 6600 I have to do a little extrapolating. In that earlier CP2077 link both the A770 and the A750 are shown, and the A750 was 92.9% as fast as the A770 at 1440p, which is GPU-limited but not VRAM-limited. Doom: The Dark Ages shares both of those characteristics, so I took the A770 value and multiplied it by 0.929 to compare it to the 6600 at 1080p: https://www.techpowerup.com/review/doom-the-dark-ages-performance-benchmark/5.html
The approximated stock A750 is 30.7% faster than the 6600 there.
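
The extrapolation itself is simple; sketched out (the A770 and 6600 fps values are placeholders to be read off the linked chart):

# Approximating a stock A750 Doom: The Dark Ages result from the A770's number.
# a770_fps and rx6600_fps are placeholders to be read off the linked TechPowerUp chart.
A750_TO_A770_RATIO = 0.929   # A750 was ~92.9% as fast as the A770 in the CP2077 1440p data

def a750_lead_over_6600(a770_fps: float, rx6600_fps: float) -> float:
    """Return how much faster (in %) the approximated stock A750 is than the RX 6600."""
    a750_fps_est = a770_fps * A750_TO_A770_RATIO
    return (a750_fps_est / rx6600_fps - 1) * 100

# Plugging in the chart values worked out to roughly a 30.7% lead for the approximated A750.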

These scenarios favor Intel, but new games will be harder to run than 2022 games were, and 1080p in 2025-2027 titles will likely load GPUs the way 1440p did in 2022 titles. And I'm not trying to prove that the A750 is more efficient than the 6600 (although it is in these uses), just that the difference is not a huge, deal-breaking one. The 6600's continuing relative decline, on top of its bad upscaling and pitiful ray-tracing performance, should count against it when games are only going to use that stuff more and more in the future. Even AMD is pushing it.
 
I have a low-spec older PC in my garage that I've been playing around with the last few days. It has a low-profile RX 550 2GB in it. GeForce Now works great on it. I feel like if you're super broke, that's an excellent route to take for a reasonably good gaming experience, provided your internet doesn't suck.