RTX 3050 vs Arc A750 GPU faceoff — Intel Alchemist goes head to head with Nvidia's budget Ampere

Admin

Administrator
Staff member

Neilbob

Distinguished
Mar 31, 2014
262
348
19,720
Oh my goodness, you did it again. What is with this need to place 3050 pricing on the same level as the A750 (and RX 6600) even though the performance between the two is virtually night and day?

And this time you tried to justify it by suggesting the dwindling 3050 inventory would result in higher prices, so buy it now at the (still utterly atrocious) price of $200. That is not a good enough reason.

This really annoys me. Even with its various issues, the A750 is such a better budget purchase. I don't get why the 3050 isn't getting dolloped on from a huge height, because it really does deserve it.
 
  • Like
Reactions: artk2219

Pierce2623

Prominent
Dec 3, 2023
485
368
560
Oh my goodness, you did it again. What is with this need to place 3050 pricing on the same level as the A750 (and RX 6600) even though the performance between the two is virtually night and day?

And this time you tried to justify it by suggesting the dwindling 3050 inventory would result in higher prices, so buy it now at the (still utterly atrocious) price of $200. That is not a good enough reason.

This really annoys me. Even with its various issues, the A750 is such a better budget purchase. I don't get why the 3050 isn't getting dolloped on from a huge height, because it really does deserve it.
Realistically, against a 50-tier card, the Intel had a pretty horrible showing and even lost in some games, with a 225W TDP no less. I'm far from impressed. One last point: I thought one of Intel's main selling points was supposedly even better ray tracing performance than Nvidia relative to overall performance, but it looks like almost every Nvidia victory was in ray-traced games.
 
Last edited:
Oh my goodness, you did it again. What is with this need to place 3050 pricing on the same level as the A750 (and RX 6600) even though the performance between the two is virtually night and day?

And this time you tried to justify it by suggesting the dwindling 3050 inventory would result in higher prices, so buy it now at the (still utterly atrocious) price of $200. That is not a good enough reason.

This really annoys me. Even with its various issues, the A750 is such a better budget purchase. I don't get why the 3050 isn't getting dolloped on from a huge height, because it really does deserve it.
Have you tested and used an Intel Arc GPU? Because I have, repeatedly, for over 18 months. I still encounter regular quirks and issues. Both cards cost around $200, and even then, while Intel wins big on performance in a lot of cases, I wouldn't recommend an Arc GPU to less experienced PC builders. It's a serious case of Dr. Jekyll and Mr. Hyde, where some things are great and others are terrible.

Arc A750 competes in performance with the RTX 3060 12GB. As you can hopefully guess, that's a very generous comparison if you wanted to pretend Intel is just as fast. For one, it's the rare case where Nvidia gives you more VRAM. But the overall performance ends up, at best, roughly tied... and the quirks and issues with Arc become so much more of a factor in that case.

Walloping the RTX 3050 8GB in performance, most of the time, is enough to make the A750 look pretty good. It uses a lot more power and the drivers and features aren't as good, but this is what Intel needs to do to "win" a matchup: come out way ahead on performance. Matching (or close to matching) ends up as a clear loss because of everything else, and that applies whether looking at AMD or Nvidia GPUs.

Intel still needs to improve its ecosystem of software and drivers. The only way to do that is with lots of time. It's still a big question mark how much better (in the software/drivers area) Battlemage will end up being relative to Alchemist. Five years from now, assuming Intel dedicated GPUs stick around, we'll hopefully not need to worry about that aspect anymore, but in the here and now? The only way to recommend Arc GPUs is with some clear qualifications and disclaimers so that people know what they're getting into.
 

Neilbob

Distinguished
Mar 31, 2014
262
348
19,720
Realistically, against a 50-tier card, the Intel had a pretty horrible showing and even lost in some games, with a 225W TDP no less. I'm far from impressed.

I wouldn't recommend an Arc GPU to less experienced PC builders. It's a serious case of Dr. Jekyll and Mr. Hyde, where some things are great and others are terrible.

and other stuff

Okay, I'll concede some of that, especially the power consumption - that is quite significant over the medium term. Even so, I'd still place the A750 slightly above... with the assumption that Intel aren't suddenly going to abandon fixing and fine-tuning, and that things will continue to improve. That said, I'd agree that inexperienced users should avoid it.

Perhaps I am a little blinded by my hatred of the 3050. I really do detest that card, and personally go out of my way to steer people away from it where possible. If it dropped to somewhere between $125 and $150, then I'd change my tune. But that seems unlikely at this point.
 
  • Like
Reactions: artk2219

jlake3

Distinguished
Jul 9, 2014
137
201
18,960
I bought an A380 close to launch just to see what Arc was about, and while the drivers are much improved, that doesn’t mean they’re good.

After trying a couple different drivers on one of my test systems (and running DDU between each “upgrade”), now the Intel installer just won’t work? I’ve gotta extract the exe file and then point to the folder in device manager.

Same system had an issue with one of the drivers back in the 4xxx range that would hang the system ~30 seconds after boot every time. Tried redownloading and even redownloading on another PC and sneakernetting it over, and that version would bork that PC every time.

Building a system from used parts for someone right now, and thought "Let's throw my A380 in and see what it does on this hardware" and whoops, Cyberpunk won't launch on .5333. It launches on .5333 on my test bench, but there's a 100% crash rate in that system unless I upgrade to .57-something. No other card or driver I tried outright failed, and that included a card each from the GTX 600 and HD 7000 families.

So I think the final 3-3 scoring is pretty spot on. In a current system running the latest/biggest DX12 titles Intel wins performance, both have effectively equal street prices at time of publication, but Intel’s drivers are still rough and as the article points out, the architecture isn’t where you’d expect it to be by die size or power consumption.
 

Gururu

Prominent
Jan 4, 2024
302
202
570
I visited this article a few times and read it through. As someone who has been holding out on a new GPU (still using integrated graphics), it is very helpful. Yes, I have seriously considered the Arcs, and the prices are making them attractive. Very disappointed that everything new looks like 2025. I just need something that can run WoW, Elden Ring, and Baldur's Gate 3 at high settings at 1440p. It's been very challenging to make sense of whether any of these $200 offerings can do that.
 
  • Like
Reactions: artk2219

Nyara

Prominent
May 10, 2023
69
60
610
I visited this article a few times and read it through. As someone who has been holding out on a new GPU (still using integrated graphics), it is very helpful. Yes, I have seriously considered the Arcs, and the prices are making them attractive. Very disappointed that everything new looks like 2025. I just need something that can run WoW, Elden Ring, and Baldur's Gate 3 at high settings at 1440p. It's been very challenging to make sense of whether any of these $200 offerings can do that.
Just buy the RX 6600; it leapfrogs the RTX 3050 in performance, game compatibility, and power efficiency, while having no driver issues.

However, for 1440p I would recommend an RX 7700 XT ($350) instead.
 
Last edited:

Eximo

Titan
Ambassador
I bought an A380 close to launch just to see what Arc was about, and while the drivers are much improved, that doesn’t mean they’re good.

After trying a couple different drivers on one of my test systems (and running DDU between each “upgrade”), now the Intel installer just won’t work? I’ve gotta extract the exe file and then point to the folder in device manager.

I did the same, to keep an eye on the drivers.

I did have a lot of trouble with the last few driver updates and had to extract and point at the folder as well. The most recent install I did worked, though they need to work on the process. The install is too silent; I thought it had messed up again until the screen went black and it prompted for a restart.

The most common issue I still encounter is HDMI audio failing to recover after the screen turns off. I have to go into Device Manager and disable/enable it to get it back. It seems to happen about 1 in 10 times.
 
  • Like
Reactions: artk2219
I bought an A380 close to launch just to see what Arc was about, and while the drivers are much improved, that doesn’t mean they’re good.

After trying a couple different drivers on one of my test systems (and running DDU between each “upgrade”), now the Intel installer just won’t work? I’ve gotta extract the exe file and then point to the folder in device manager.

Same system had an issue with one of the drivers back in the 4xxx range that would hang the system ~30 seconds after boot every time. Tried redownloading and even redownloading on another PC and sneakernetting it over, and that version would bork that PC every time.

Building a system from used parts for someone right now, and thought "Let's throw my A380 in and see what it does on this hardware" and whoops, Cyberpunk won't launch on .5333. It launches on .5333 on my test bench, but there's a 100% crash rate in that system unless I upgrade to .57-something. No other card or driver I tried outright failed, and that included a card each from the GTX 600 and HD 7000 families.

So I think the final 3-3 scoring is pretty spot on. In a current system running the latest/biggest DX12 titles Intel wins performance, both have effectively equal street prices at time of publication, but Intel’s drivers are still rough and as the article points out, the architecture isn’t where you’d expect it to be by die size or power consumption.
I will say, out of all the Arc GPUs, the A380 is the worst I've used. (I haven't used the A310, and I'm not including integrated Arc in Meteor Lake.) In theory, it should basically be something like 1/4 of an A770, but with a bit more bandwidth per GPU core. In practice, it feels like Intel has spent way more time trying to get the A750 to work well, and the 'fixes' there don't always fully trickle down to the A380; testing of bugs and optimizations seems to get done on the A750, but not always on the A380.

For video stuff, it's fine. For gaming, on paper it should be way faster than it is in practice. Like, it's still slower than the RX 6500 XT in everything. The A380 has 4.2 teraflops of FP32 compute, 6GB of VRAM, and 186 GB/s of bandwidth, and it can't beat AMD's card, which has 5.8 teraflops (a lot more compute) but only 4GB of VRAM and 144 GB/s of bandwidth. Also, it's a 107 mm^2 die for AMD vs. 157 mm^2 for Intel. Oops.
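For anyone who wants to sanity-check those teraflops figures, peak FP32 throughput is just shader count × 2 ops per clock (an FMA counts as two) × clock speed. A quick sketch, assuming retail boost clocks of roughly 2.05 GHz for the A380 and 2.815 GHz for the RX 6500 XT (the clocks are my assumption, not from the post above):

```python
# Back-of-the-envelope FP32 throughput: shaders * 2 FMA ops/clock * boost clock.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    """Peak FP32 teraflops, counting an FMA as two operations."""
    return shaders * 2 * boost_ghz / 1000

a380 = fp32_tflops(1024, 2.05)        # Arc A380: 1024 shaders -> ~4.2 TFLOPS
rx6500xt = fp32_tflops(1024, 2.815)   # RX 6500 XT: 1024 shaders -> ~5.8 TFLOPS

print(f"A380: {a380:.1f} TFLOPS, RX 6500 XT: {rx6500xt:.1f} TFLOPS")
```

Both cards have the same shader count here; the compute gap comes almost entirely from AMD's much higher clocks.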
 

Pierce2623

Prominent
Dec 3, 2023
485
368
560
Okay, I'll concede some of that, especially the power consumption - that is quite significant over the medium term. Even so, I'd still place the A750 slightly above... with the assumption that Intel aren't suddenly going to abandon fixing and fine-tuning, and that things will continue to improve. That said, I'd agree that inexperienced users should avoid it.

Perhaps I am a little blinded by my hatred of the 3050. I really do detest that card, and personally go out of my way to steer people away from it where possible. If it dropped to somewhere between $125 and $150, then I'd change my tune. But that seems unlikely at this point.
Oh, I agree on the 3050 pricing. The crazy thing is they can drop the TDP from 120W to 70W, cut the VRAM to 6GB on a 96-bit bus, and even deactivate a few hundred shaders, and it only loses 15-20% performance. In fact, the 6GB model is so close in performance that it just makes the 8GB model look all the worse.
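The bandwidth cost of that 96-bit bus is easy to quantify: bandwidth is bus width (in bytes) times the per-pin data rate. A minimal sketch, assuming both 3050 models use 14 Gbps GDDR6 as the retail cards do (the data rate is my assumption, not stated in the post):

```python
# GDDR6 bandwidth: (bus width in bits / 8) bytes per transfer * data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

full = bandwidth_gbs(128, 14)  # RTX 3050 8GB: 128-bit bus
cut = bandwidth_gbs(96, 14)    # RTX 3050 6GB:  96-bit bus
print(f"8GB model: {full:.0f} GB/s, 6GB model: {cut:.0f} GB/s "
      f"({cut / full:.0%} of full bandwidth)")
```

So the 6GB card keeps three quarters of the 8GB card's bandwidth, which lines up with the modest 15-20% performance loss once the narrower shader count is factored in.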
 
  • Like
Reactions: artk2219

Pierce2623

Prominent
Dec 3, 2023
485
368
560
I visited this article a few times and read it through. As someone who has been holding out on a new GPU (still using integrated graphics), it is very helpful. Yes, I have seriously considered the Arcs, and the prices are making them attractive. Very disappointed that everything new looks like 2025. I just need something that can run WoW, Elden Ring, and Baldur's Gate 3 at high settings at 1440p. It's been very challenging to make sense of whether any of these $200 offerings can do that.
The minimum I'd buy for a solid 60 fps in Elden Ring at 1440p would be in the RTX 3060 Ti tier. A cheap 6700 XT/6750 XT was the best value for a long time at around $300, but I'm unsure if they're still available that cheap. A $250 RX 7600 or a $225 6650 XT should also get you there.
 
  • Like
Reactions: Gururu and artk2219
The minimum I'd buy for a solid 60 fps in Elden Ring at 1440p would be in the RTX 3060 Ti tier. A cheap 6700 XT/6750 XT was the best value for a long time at around $300, but I'm unsure if they're still available that cheap. A $250 RX 7600 or a $225 6650 XT should also get you there.
I'd say the RX 6750 XT still represents your best bang for the buck overall, without concerns about drivers and software. There's one at Amazon (and Newegg) for $299.99, but supplies of Navi 22 do seem to be drying up and may not be around too much longer. Looking at Newegg, only two models (other than Peladn, which I don't think I'd trust) cost under $325, and then there's a big jump to $399. Of course, at $399 you should just get a 7700 XT instead.

Here's the updated "bang for the buck" table, post-Prime Day pricing (though a few sales may still be around). You can see that, outside of the A580 and A770, the 6750 XT takes top honors in terms of overall value.

| Graphics Card | Price | FPS/$ | FPS/W | Eff Rank | 1080p | 1440p | 4K | Power |
|---|---|---|---|---|---|---|---|---|
| Intel Arc A580 | $160 | 0.307 | 0.255 | 42 | 49.1 | 34.6 | n/a | 193 |
| Intel Arc A750 | $200 | 0.266 | 0.265 | 41 | 53.2 | 38.4 | 19.2 | 201 |
| Radeon RX 6750 XT | $300 | 0.233 | 0.274 | 38 | 69.8 | 48.3 | 24.9 | 255 |
| Radeon RX 6650 XT | $213 | 0.232 | 0.291 | 35 | 49.2 | 30.7 | 13.4 | 169 |
| Radeon RX 6800 | $350 | 0.231 | 0.353 | 18 | 80.9 | 57.9 | 30.8 | 229 |
| GeForce RTX 4060 | $290 | 0.228 | 0.523 | 7 | 66.0 | 44.5 | 22.2 | 126 |
| Radeon RX 6600 | $190 | 0.226 | 0.309 | 31 | 42.9 | 27.8 | 11.9 | 139 |
| GeForce RTX 4060 Ti | $370 | 0.221 | 0.577 | 3 | 81.6 | 55.2 | 28.1 | 142 |
| Radeon RX 7600 | $250 | 0.221 | 0.352 | 19 | 55.1 | 35.9 | 14.8 | 157 |
| Radeon RX 6700 XT | $300 | 0.218 | 0.303 | 33 | 65.5 | 45.1 | 23.4 | 216 |
| Intel Arc A770 16GB | $270 | 0.218 | 0.278 | 37 | 58.8 | 43.5 | n/a | 211 |
| Radeon RX 7700 XT | $390 | 0.215 | 0.355 | 17 | 83.7 | 60.4 | 31.5 | 236 |
| Radeon RX 6600 XT | $235 | 0.210 | 0.323 | 22 | 49.3 | 32.2 | n/a | 153 |
| Radeon RX 6800 XT | $450 | 0.204 | 0.310 | 30 | 91.7 | 67.1 | 35.8 | 296 |
| GeForce RTX 3050 | $195 | 0.204 | 0.318 | 26 | 39.7 | 27.1 | 13.8 | 125 |
| Radeon RX 7800 XT | $471 | 0.202 | 0.383 | 12 | 95.2 | 70.2 | 37.9 | 249 |
| GeForce RTX 4070 Super | $580 | 0.201 | 0.578 | 1 | 116.8 | 84.5 | 46.1 | 202 |
| GeForce RTX 3060 | $278 | 0.196 | 0.340 | 20 | 54.4 | 37.9 | 20.1 | 160 |
| Radeon RX 7900 GRE | $530 | 0.195 | 0.390 | 11 | 103.1 | 76.7 | 40.8 | 264 |
| Radeon RX 6700 10GB | $300 | 0.194 | 0.307 | 32 | 58.1 | 38.9 | 18.7 | 189 |
| Radeon RX 6900 XT | $500 | 0.194 | 0.322 | 23 | 96.8 | 71.0 | 38.2 | 301 |
| Intel Arc A380 | $100 | 0.190 | 0.272 | 40 | 19.0 | n/a | n/a | 70 |
| Radeon RX 6950 XT | $530 | 0.190 | 0.311 | 29 | 100.5 | 74.2 | 40.2 | 323 |
| Radeon RX 7600 XT | $320 | 0.188 | 0.317 | 27 | 60.1 | 40.3 | 19.9 | 190 |
| GeForce RTX 4070 | $545 | 0.187 | 0.547 | 5 | 101.9 | 73.5 | 38.9 | 186 |
| GeForce RTX 4060 Ti 16GB | $440 | 0.185 | 0.540 | 6 | 81.4 | 56.1 | 29.2 | 151 |
| GeForce RTX 3080 | $560 | 0.183 | 0.320 | 24 | 102.5 | 74.8 | 40.5 | 320 |
| GeForce RTX 3070 | $453 | 0.178 | 0.372 | 15 | 80.8 | 56.2 | 28.9 | 217 |
| GeForce RTX 3060 Ti | $405 | 0.178 | 0.356 | 16 | 72.1 | 49.9 | 25.3 | 202 |
| Radeon RX 7900 XT | $690 | 0.170 | 0.382 | 13 | 117.5 | 89.2 | 49.3 | 308 |
| GeForce RTX 4070 Ti | $730 | 0.170 | 0.520 | 8 | 124.1 | 90.6 | 50.4 | 239 |
| GeForce RTX 4070 Ti Super | $780 | 0.167 | 0.506 | 9 | 130.6 | 97.8 | 55.1 | 258 |
| GeForce RTX 4080 Super | $959 | 0.153 | 0.577 | 2 | 146.3 | 112.7 | 65.4 | 254 |
| GeForce RTX 3070 Ti | $581 | 0.147 | 0.311 | 28 | 85.6 | 60.0 | 31.3 | 275 |
| Radeon RX 7900 XTX | $910 | 0.142 | 0.372 | 14 | 128.8 | 100.5 | 58.1 | 346 |
| GeForce RTX 4080 | $1,100 | 0.130 | 0.557 | 4 | 143.5 | 110.0 | 63.6 | 257 |
| GeForce RTX 3080 Ti | $899 | 0.127 | 0.319 | 25 | 113.7 | 83.6 | 47.1 | 357 |
| GeForce RTX 3090 | $930 | 0.125 | 0.330 | 21 | 116.3 | 85.7 | 48.9 | 353 |
| GeForce RTX 3080 12GB | $931 | 0.119 | 0.294 | 34 | 111.1 | 81.5 | 45.9 | 377 |
| GeForce RTX 4090 | $1,700 | 0.098 | 0.496 | 10 | 166.7 | 139.2 | 87.4 | 336 |
| GeForce RTX 3090 Ti | $1,440 | 0.086 | 0.289 | 36 | 124.3 | 92.5 | 53.3 | 431 |
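For anyone wondering how the value columns in the table above are derived, FPS/$ and FPS/W appear to be the 1080p average FPS divided by price and by power draw. A minimal sketch using the Arc A750's row as an example (numbers taken from the table, not re-measured; other rows may differ by a rounding digit since the published FPS values are themselves rounded):

```python
# Value metrics from the table: 1080p average FPS per dollar and per watt.
def value_metrics(fps_1080p: float, price_usd: float, power_w: float):
    return fps_1080p / price_usd, fps_1080p / power_w

fps_per_dollar, fps_per_watt = value_metrics(53.2, 200, 201)  # Arc A750 row
print(f"FPS/$: {fps_per_dollar:.3f}, FPS/W: {fps_per_watt:.3f}")
```

Reproducing the A750 row gives 0.266 FPS/$ and 0.265 FPS/W, matching the table.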
 

Nyara

Prominent
May 10, 2023
69
60
610
Oh, I agree on the 3050 pricing. The crazy thing is they can drop the TDP from 120W to 70W, cut the VRAM to 6GB on a 96-bit bus, and even deactivate a few hundred shaders, and it only loses 15-20% performance. In fact, the 6GB model is so close in performance that it just makes the 8GB model look all the worse.
This sounds good until you realize the 8GB one already performed awfully for its price, so doing 20% worse than bad is terrible. Also, the 1% lows can fall as much as 40% further behind at times, and the bus is insufficient for some modern games to boot at all.

I would not recommend any card below the RX 6600 at $190 (except for display/light work), since they are not specced properly for modern gaming, and you can soon get superior compatibility from integrated graphics: Intel's Lunar Lake Battlemage, AMD's RDNA3 780M and upcoming RDNA3.5 880M/890M, or even a Strix Halo next year with theoretical RX 6700 XT performance.

If you really need to go bargain-basement on price, a no-name RX 580 8GB for $85 outperforms the RTX 3050 6GB.
 
Last edited:
  • Like
Reactions: artk2219
"Arguably, the biggest strength of the 3050 is the fact that it's an Nvidia GPU. With Nvidia hardware, you generally know you're getting a lot of extras outside of raw performance. Nvidia cards have more features (e.g. DLSS, Reflex, Broadcast, VSR) and arguably the best drivers of any GPU."

While I agree with the general sentiment that this should be a tie, given that Arc is still a bit buggy and Nvidia does offer some good bonus features, the way that statement is worded does seem to visibly skew the bias in the review, and I would agree that Nvidia's track record on drivers is indeed arguable. They've had more than their fair share of BSOD-causing driver releases, not to mention releases that have bricked GPUs. Thankfully we haven't had many problem releases like that in the last few years. While you can talk about CUDA being usable on an RTX 3050, it's still a walled garden that you can't escape if you go down that path. Ray tracing is a non-starter at this performance level, so I'm just going to leave that alone. But in this price range, unless you need one of the specifically mentioned features, the pecking order really should be the RX 6600, followed by a tie between the A750 and RTX 3050. The A750 would be a recommendation for the more tech-savvy among us who are willing to put up with some quirks, and the RTX 3050 for anyone who is absolutely tied into Nvidia's ecosystem. Everyone else should either save up a bit more for a better card or go with the RX 6600.
 
Last edited:
  • Like
Reactions: Nyara

Pierce2623

Prominent
Dec 3, 2023
485
368
560
This sounds good until you realize the 8GB one already performed awfully for its price, so doing 20% worse than bad is terrible. Also, the 1% lows can fall as much as 40% further behind at times, and the bus is insufficient for some modern games to boot at all.

I would not recommend any card below the RX 6600 at $190 (except for display/light work), since they are not specced properly for modern gaming, and you can soon get superior compatibility from integrated graphics: Intel's Lunar Lake Battlemage, AMD's RDNA3 780M and upcoming RDNA3.5 880M/890M, or even a Strix Halo next year with theoretical RX 6700 XT performance.

If you really need to go bargain-basement on price, a no-name RX 580 8GB for $85 outperforms the RTX 3050 6GB.
I don't think you understand the appeal of the 3050 6GB. It's by miles the most powerful low-profile consumer card that runs solely on PCIe slot power. I don't think anyone would buy it for a "normal" gaming system. You compare it to an RX 580, but it actually outperforms the RX 580 significantly without even needing a power connector. The bit about the 3050 6GB not having enough bandwidth to boot some games is complete fabrication. It has nearly double the bandwidth of the iGPUs you mentioned.
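That "nearly double" figure checks out on paper. A sketch comparing the 3050 6GB's dedicated GDDR6 against the dual-channel system memory feeding a 780M, assuming 14 Gbps GDDR6 and DDR5-5600 (a common desktop/laptop pairing; actual memory configurations vary, so these are illustrative assumptions):

```python
# Dedicated GDDR6 vs. shared dual-channel DDR5 bandwidth.
def gddr6_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

def ddr5_dual_channel_gbs(mts: int) -> float:
    # Two 64-bit channels, 8 bytes per transfer per channel.
    return 2 * 64 / 8 * mts / 1000

rtx3050_6gb = gddr6_gbs(96, 14)      # 96-bit GDDR6 at 14 Gbps
igpu = ddr5_dual_channel_gbs(5600)   # e.g. Radeon 780M on DDR5-5600
print(f"RTX 3050 6GB: {rtx3050_6gb:.0f} GB/s, "
      f"iGPU: {igpu:.1f} GB/s, ratio {rtx3050_6gb / igpu:.2f}x")
```

That works out to roughly 168 GB/s vs. 89.6 GB/s, about a 1.9x advantage for the dedicated card, and the iGPU also shares that pool with the CPU.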
 
  • Like
Reactions: artk2219

Nyara

Prominent
May 10, 2023
69
60
610
I don't think you understand the appeal of the 3050 6GB. It's by miles the most powerful low-profile consumer card that runs solely on PCIe slot power.
Obviously, it is literally the only card released with those aims that isn't a decade old. With small workarounds, you can also use a low-profile RX 6600 without the PCIe power connector, and even power-limited it still outperforms the 3050 6GB by over 30%, but almost nobody does that because...

This is a niche that is no longer relevant. Ultra-small ITX builds (the only case where you would care about skipping a power connector) are being replaced by upgradeable mini-PCs at a fraction of the price. And from a power-efficiency point of view, a mobile card or integrated graphics makes way more sense. An RTX 4060 Mobile almost doubles the performance of the RTX 3050 6GB, and the RX 7600M isn't far off, while the 780M matches the RTX 3050 6GB's performance while using 35W for the whole system.

The RX 580 8GB definitely gives it some competition (and at just a marginally higher 100W TDP from most sellers now); it falls behind in some modern games (which the RTX 3050 6GB cannot run at all unless you drop to 540p), but it will run all older games and most applications just as well as, or at times better than, the RTX 3050 6GB, while costing next to nothing and saving the environment from some e-waste. Unsold RTX 3050 6GB cards will just be dismantled to become Switch 2s next year.

The only reason I would advise somebody to buy an RTX 3050 (6GB or 8GB) is that they need CUDA for work, with low performance requirements and/or a low budget, or, I guess, if you want to build in a niche ITX case that will break and destroy itself unless you avoid using a power connector, and you do not want to deal with tweaking an RX 6600 for the job.
 
Last edited:
  • Like
Reactions: artk2219

Pierce2623

Prominent
Dec 3, 2023
485
368
560
Obviously, it is literally the only card released with those aims that isn't a decade old. With small workarounds, you can also use a low-profile RX 6600 without the PCIe power connector, and even power-limited it still outperforms the 3050 6GB by over 30%, but almost nobody does that because...

This is a niche that is no longer relevant. Ultra-small ITX builds (the only case where you would care about skipping a power connector) are being replaced by upgradeable mini-PCs at a fraction of the price. And from a power-efficiency point of view, a mobile card or integrated graphics makes way more sense. An RTX 4060 Mobile almost doubles the performance of the RTX 3050 6GB, and the RX 7600M isn't far off, while the 780M matches the RTX 3050 6GB's performance while using 35W for the whole system.

The RX 580 8GB definitely gives it some competition (and at just a marginally higher 100W TDP from most sellers now); it falls behind in some modern games (which the RTX 3050 6GB cannot run at all unless you drop to 540p), but it will run all older games and most applications just as well as, or at times better than, the RTX 3050 6GB, while costing next to nothing and saving the environment from some e-waste. Unsold RTX 3050 6GB cards will just be dismantled to become Switch 2s next year.

The only reason I would advise somebody to buy an RTX 3050 (6GB or 8GB) is that they need CUDA for work, with low performance requirements and/or a low budget, or, I guess, if you want to build in a niche ITX case that will break and destroy itself unless you avoid using a power connector, and you do not want to deal with tweaking an RX 6600 for the job.
Actually, there are newer mini-PCs that take PCIe-powered-only GPUs. I also flip old office PCs, with small PCIe-powered GPUs added in, to local soccer moms wanting a cheap 1080p Valorant or Fortnite machine for their little turd. Also, would you care to point me to the low-profile RX 6600 cards?
 
  • Like
Reactions: artk2219