Intel Arc A750 Limited Edition Review: RTX 3050 Takedown

Status
Not open for further replies.
TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.

My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?

I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
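The DX12-share estimate wondered about above is easy to script once the top-100 lists are hand-labeled by primary API. A minimal sketch, using invented placeholder titles and labels rather than real release data:

```python
# Rough sketch of the "what % of recent games are DX12" estimate.
# Game names and API labels are made-up placeholders, not real data;
# in practice you'd hand-label the top-100 list from each year.
from collections import Counter

games_2021 = {"Game A": "DX12", "Game B": "DX11", "Game C": "Vulkan", "Game D": "DX12"}
games_2022 = {"Game E": "DX12", "Game F": "DX12", "Game G": "DX11"}

def dx12_share(games: dict) -> float:
    """Return the percentage of titles whose primary API is DX12."""
    counts = Counter(games.values())
    return 100.0 * counts["DX12"] / len(games)

for year, games in (("2021", games_2021), ("2022", games_2022)):
    print(f"{year}: {dx12_share(games):.0f}% DX12")
```

The hard part isn't the arithmetic, of course; it's deciding what counts as a game's "primary" API when it ships with multiple render paths.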
 
TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.

My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?

I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
Intel will continue to do integrated graphics for sure. That means they'll still make drivers. But will they keep up with changes on the dGPU side if they pull out? Probably not.

I don't really think they're going to ax the GPU division, though. Intel needs high density compute, just like Nvidia needs its own CPU. There are big enterprise markets that Intel has been locked out of for years due to not having a proper solution. Larrabee was supposed to be that option, but when it morphed into Xeon Phi and then eventually got axed, Intel needed a different alternative. And x86 compatibility on something like a GPU (or Xeon Phi) is going to be more of a curse than a blessing.

I really do want Intel to stay in the GPU market. Having a third competitor will be good. Hopefully Battlemage rights many of the wrongs in Alchemist.
 

Giroro

Splendid
So what's the perf/$ chart look like without Ray Tracing results included?

I mean I love Control and everything, but I've been done with it for years. I googled "upcoming ray tracing games" and the top result was still that original list from 2019.
There are so few noteworthy RT games that I'm surprised Intel and the next-gen cards are even bothering to support it.

Also, I'm not really understanding how the hypothetical system cost that was discussed would be factored into the math.
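Recomputing perf/$ without RT is just a matter of which benchmark results go into the average. A hypothetical illustration — the prices and FPS figures below are invented for the sketch, not numbers from the review:

```python
# Hypothetical illustration of how excluding ray-tracing results shifts a
# perf-per-dollar comparison. All prices and FPS values are invented.
cards = {
    #            (price $, raster FPS, RT FPS)
    "Arc A750": (289, 80, 40),
    "RTX 3050": (299, 60, 30),
}

def perf_per_dollar(price, *fps_results):
    """Average the given FPS results and divide by price."""
    avg_fps = sum(fps_results) / len(fps_results)
    return avg_fps / price

for name, (price, raster, rt) in cards.items():
    with_rt = perf_per_dollar(price, raster, rt)
    without_rt = perf_per_dollar(price, raster)
    print(f"{name}: {with_rt:.3f} FPS/$ with RT, {without_rt:.3f} without")
```

Folding a hypothetical system cost into the math would just mean adding a fixed platform price to `price` for every card, which compresses the relative differences between them.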
 
Last edited:

InvalidError

Titan
Moderator
There are so few noteworthy RT games that I'm surprised Intel and the next-gen cards are even bothering to support it.
Chicken-and-egg problem: game developers don't want to bother with RT because most people don't have RT-capable hardware, and hardware designers limit their emphasis on RT for cost-saving reasons since very little software will use it in the foreseeable future.

As more affordable yet sufficiently powerful RT hardware becomes capable of pushing 60+ FPS at FHD or higher resolutions, we'll see more games using it.

It was the same story with pixel/vertex shaders and unified shaders. It took a while for software developers to migrate from hard-wired T&L to shaders, but a few years later, fixed-function T&L hardware was deprecated.

Give it another 5-7 years and we'll likely get new APIs designed with RT as the primary render flow.
 

drajitsh

Distinguished
Sep 3, 2016
136
25
18,720
The Intel Arc A750 goes after the sub-$300 market with compelling performance and features, with a slightly trimmed-down design compared to the A770. We've tested Intel's new value-oriented wunderkind and found plenty to like.

Intel Arc A750 Limited Edition Review: RTX 3050 Takedown : Read more
@jaredwaltonGPU
Hi, I have some questions and a request
  1. Does this support PCIe 3.0 x16?
  2. For the low-end comparison, could you test with a low-end CPU like my Ryzen 5700G? That would tell me three things: support for AMD, support for PCIe 3.0, and usefulness with a low-end CPU.
 
Sep 1, 2022
5
1
15
Really, is this "plenty good"? The drivers aren't working; in this market, AMD and Nvidia are better and cost less. Why such a big chip on a card with performance from 5 years ago? Who will buy it? Other reviewers have said it all about this...
 
I don't need a card at the moment since I've got a 6700 XT, but the new Intel cards are interesting. If they stick around, I might consider buying one on my next upgrade, if they're decent, to help a third player stay in the market.
 
Oct 7, 2022
1
1
15
TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.

My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?

I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
Look at the synthetics other sites have done. These cards appear to be significantly more powerful than the gaming benchmarks suggest. AMD cards were like this for many years: they would come out, and over the next 6 months to a year the drivers would improve until suddenly they were competitive with their Nvidia counterparts. If you look at the hardware, the A770 should be competitive with the 3070 and 6750. My guess is that 6 months from now that will be the case. I'm happy to have new competition in the game and will buy an A770 simply for that reason. We should all be glad for new competition; even if we aren't going to buy one, it will put downward pressure on GPU prices, and the consumer wins. We also need to stop this nonsense talk that Intel is going to pull out. That idea makes no logical sense at this point. They have a decent product here and have invested big money to make this happen.
 
Last edited:
  • Like
Reactions: JarredWaltonGPU
@jaredwaltonGPU
Hi, I have some questions and a request
  1. Does this support PCIe 3.0 x16?
  2. For the low-end comparison, could you test with a low-end CPU like my Ryzen 5700G? That would tell me three things: support for AMD, support for PCIe 3.0, and usefulness with a low-end CPU.
Yes, the Arc cards are backward compatible with PCIe 3.0 and have an x16 link (A380 is PCIe 4.0 x8). But if you put the card in a PCIe 3.0 slot that doesn't have ReBAR, then you'd lose quite a lot of performance. The Ryzen 5700G should be fine, provided it's in a B550 / X570 motherboard that supports ReBAR. I have not yet tested A770 / A750 in an older PCIe 3.0 system, but it's on my todo list. Maybe I'll run those tests in the old testbed while running RTX 4090 tests in the new testbed this weekend! 🙃
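On Linux, the ReBAR state described above can be checked by looking for the "Physical Resizable BAR" capability in `lspci -vvv` output: the BAR's current size should equal the largest supported size. A minimal parsing sketch — the captured fragment below is illustrative sample text, not output from a real Arc card:

```python
import re

# Illustrative lspci -vvv fragment; capture real output with
# `sudo lspci -vvv -s <gpu-bus-id>` on your own system.
sample = """\
Capabilities: [200 v1] Physical Resizable BAR
        BAR 0: current size: 8GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB
"""

def rebar_active(lspci_text: str) -> bool:
    """True if the BAR's current size equals the largest supported size."""
    m = re.search(r"current size: (\S+), supported: (.+)", lspci_text)
    if not m:
        return False  # no Resizable BAR capability reported at all
    current, supported = m.group(1), m.group(2).split()
    return current == supported[-1]

print("ReBAR active:", rebar_active(sample))
```

On Windows, the equivalent information shows up as "Resizable BAR" in Intel's Arc Control or as "Large Memory Range" under the GPU's resources in Device Manager.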
 
  • Like
Reactions: drajitsh
What might be interesting is to test in a pc like mine with a b350 board, but I updated it to accept a 5900x, and the bios updates actually enabled smart access memory.
Which motherboard do you have? I know quite a few B450 boards got ReBAR support, but only a handful of B350/X370 AFAIK. I also suspect the benefits of ReBAR / SAM will be muted with older hardware, but it would be interesting to test. I probably won't ever get around to it, as I don't have all the CPUs and motherboards to do such testing, and it's very time consuming. But I did plan on testing on an i9-9900K, which should be reasonably close to a 5900X running on a B350 board.
 
5900x
AsRock ab350 pro 4
32gb 4x8 gskill trident 3200mhz
MSI mech 6700xt
Samsung 960 evo 240gb
2 1tb sata ssds
1 2tb sata ssd
Cooler Master MB511 case
Corsair rmx 850 watt

The board I really can't complain about. I got it back in about 2018, but it's been a good board, and the way it's going it may have a little life left before it retires.
 
  • Like
Reactions: JarredWaltonGPU
5900x
AsRock ab350 pro 4
32gb 4x8 gskill trident 3200mhz
MSI mech 6700xt
Samsung 960 evo 240gb
2 1tb sata ssds
1 2tb sata ssd
Cooler Master MB511 case
Corsair rmx 850 watt

The board I really can't complain about. I got it back in about 2018, but it's been a good board, and the way it's going it may have a little life left before it retires.
I should have figured it was ASRock, they always seem to push the boundaries on what's "allowed" by both AMD and Intel. LOL.

Like I said, I doubt I'll have any opportunity to test Arc on that particular setup, but it's probably not going to be that far off what I'll get from testing on my Core i9-9900K setup. Both are PCIe 3.0, both are older motherboards, and after tuning the Core i9-9900K and Ryzen 9 5900X are usually pretty close in gaming performance. I don't know when I'll have the testing finished, what with the RTX 4090 prep happening right now, but sometime in the next couple of weeks for sure.
 
  • Like
Reactions: drajitsh

Co BIY

Splendid
Yes, the Arc cards are backward compatible with PCIe 3.0 and have an x16 link (A380 is PCIe 4.0 x8). But if you put the card in a PCIe 3.0 slot that doesn't have ReBAR, then you'd lose quite a lot of performance. The Ryzen 5700G should be fine, provided it's in a B550 / X570 motherboard that supports ReBAR. I have not yet tested A770 / A750 in an older PCIe 3.0 system, but it's on my todo list. Maybe I'll run those tests in the old testbed while running RTX 4090 tests in the new testbed this weekend! 🙃

It sounds like you'll be busy for a while in the GPU area but I think the Arc GPUs need a running article along the lines of - "Arc Drivers : What we know right now!"

I think the driver support will (and should) influence a lot of purchasing decisions. And I think there is a lot of reader interest in this topic.
 
It sounds like you'll be busy for a while in the GPU area but I think the Arc GPUs need a running article along the lines of - "Arc Drivers : What we know right now!"

I think the driver support will (and should) influence a lot of purchasing decisions. And I think there is a lot of reader interest in this topic.
I do intend to revisit the Arc cards on occasion. I retest new drivers on AMD and Nvidia probably every four months, or when a new card comes along and I see some big changes in performance. Arc may need even more frequent updating of drivers to see how things are maturing.

The only problem of course is that if Intel were "smart" it would look at any reviews where performance was lower than expected and focus on improving performance specifically in those games. Hopefully, we get more general improvements rather than just targeted fixes, but testing that will require more manpower than one person (me) can provide. :)
 

Co BIY

Splendid
I do intend to revisit the Arc cards on occasion. I retest new drivers on AMD and Nvidia probably every four months, or when a new card comes along and I see some big changes in performance. Arc may need even more frequent updating of drivers to see how things are maturing.

The only problem of course is that if Intel were "smart" it would look at any reviews where performance was lower than expected and focus on improving performance specifically in those games. Hopefully, we get more general improvements rather than just targeted fixes, but testing that will require more manpower than one person (me) can provide. :)

Thanks for answering!

I'd like to hear from the early adopters too.

Maybe if users could list games where they are having problems and where they have no complaints.
 
D

Deleted member 431422

Guest
I'm entertaining the idea of getting one myself, though I want to find out where the A580 is going to land first to see if Intel can tempt me to spend an extra $60 to get that instead.
I'm waiting for the A580 as well. AMD's RDNA3 wasn't anything special, and Nvidia is too expensive. I'm hoping the A580 will have decent 1080p performance and a better price than the RTX 3050.
 
I'm waiting for the A580 as well. AMD's RDNA3 wasn't anything special, and Nvidia is too expensive. I'm hoping the A580 will have decent 1080p performance and a better price than the RTX 3050.
I'm sort of wondering if the A580 will still happen. At this point, it seems like Intel may choose to skip it, but it could also announce at CES or something. It's just weird to hold it for this long when it's effectively the same parts as the other Arc cards.
 

InvalidError

Titan
Moderator
I'm sort of wondering if the A580 will still happen.
This late, with Arc still having so many teething issues, it may very well have silently gotten unannounced. If Battlemage is really on schedule, as Intel claimed while it was attempting to explain away Alchemist's delays, it may be less than six months away. If Intel truly focuses on the lower-mid range, then the Battlemage lineup should have far more compelling and cost-effective entries to replace the A380 and A580 with.
 
  • Like
Reactions: JarredWaltonGPU
This late, with Arc still having so many teething issues, it may very well have silently gotten unannounced. If Battlemage is really on schedule, as Intel claimed while it was attempting to explain away Alchemist's delays, it may be less than six months away. If Intel truly focuses on the lower-mid range, then the Battlemage lineup should have far more compelling and cost-effective entries to replace the A380 and A580 with.
Exactly. The only other real possibility is that there are enough ACM-G10 chips that can't meet the A750 spec but can hit A580 spec that Intel may still launch it. But Intel also has several laptop SKUs that could use those chips, so again, totally possible it got silently canned.
 