Question: I'm looking at an Intel Arc A750 to upgrade my rig for 1080p ray-tracing. It seems to be good value, but I don't know anything about Intel video cards.

warhammer3025

Most websites I've seen with a "best budget card for ray-tracing" article seem to list cards for $500 or more as "budget". Granted, with the prices of the last few years those could well be on the low end, but they're still more than I'm comfortable spending.

I saw a mention of the Intel Arc A750 being an entry point for ray-tracing at 1080p, but being the miser I am I'd like to get opinions before committing to even a $250 card like that.
 

It depends on your pain point when it comes to FPS. Personally, I'd call the entry-point significantly higher than that given the difficulty of even getting 50 FPS from the Intel GPUs with full ray tracing on at 1080p.

To be perfectly honest, ray-tracing is a lot like 4K was five or six years ago: too early to consider budget-friendly GPUs as a really good entry point.
 
I figure I need to update soon anyway - I don't meet minimum specs on Resident Evil 4 Remake, and if there's one modern game I don't meet the specs for then soon there'll be another and another. My rig is at least 6-7 years old, so it's time to upgrade - I'd also like to check out the ray-tracing buzz that everyone's on about, even if it's just a once-in-a-while thing on games like Cyberpunk.

No way am I going to go for a full $800 card, but I can stretch the budget for a $350 one :)
 
If you are considering Intel, the lowest you should consider is their best. 8GB cards are rapidly becoming obsolete and in some cases unusable.

The Arc A770 16GB is the only card worth considering in the Intel line; $350 is right on the money.

Otherwise I'd find a used 30-series locally that hasn't been mined on.
 
The Arc A770 is better for future-proofing with its 16GB of VRAM. However, I would suggest waiting for Battlemage to see if you can get a better card. If you want to upgrade now then sure, but try to get a card with at least 12GB, as 8GB is becoming obsolete. There have been 8GB consumer cards since as early as 2015, and I have no idea why Nvidia decided to give the 3070s only 8GB. You're better off with the 3060 for ray tracing and DLSS, and the 6600s/6700s for pure rasterization. For ray tracing get the 3060; if not, get the 6700 XT, it's cheap af right now.
 
How much VRAM a video card has doesn't necessarily make it last longer. The only thing more VRAM gets you in the long haul is higher texture quality, because that's the only thing that's free; everything else that gobbles up VRAM also requires more GPU power to use.

Given that most people seem to favor FPS over graphics quality, VRAM shouldn't be the be-all and end-all in the lower to midrange segment. I mean yes, there should be a minimum, but 16GB (at the moment) isn't going to do squat for a lower midrange GPU in terms of longevity.
 
^Future-proofing wasn't the right word for that person to use, but maybe it's a language barrier.

No, it's not future-proofing, but it is absolutely the minimum. HWUnboxed just released a video showing how 8GB struggles in many games, ranging from ugly-looking to unplayable.

If one is considering spending money, you don't buy something obsolete off the shelf.
 
Also, different GPU architectures use VRAM with varying efficiency; that's something many don't consider despite it being proven time and again. Testing over the past couple of years has shown multiple times that Nvidia cards use less VRAM than both AMD and Intel. I have posted the photo a couple of times by now, but the image below illustrates it nicely, taking two newer games as examples.

View: https://imgur.com/gallery/pTPuJLW

It's 4K resolution, with ray tracing for Spiderman and without for Warhammer. The table shows which card, how much VRAM it has, and how much it uses, first at max settings and then with reduced texture quality; the last row states whether there were issues or not, for both maximum and reduced settings. Both the 6650XT and the 7900XTX use more VRAM than any of the Nvidia cards. The result for the 3080Ti in Total War: Warhammer is very interesting, though, especially compared to the 4070Ti: it had stuttering and used more VRAM than the 4070Ti, which had no issues at all. My theory is that the bigger cache really helps the 4070Ti in that game. And 4K isn't even what the card is advertised for; it doesn't use more than 9GB at 1440p.

Btw, the A750 did have stuttering in both Uncharted 4 and A Plague Tale Requiem in that test at 1080p, while all Nvidia and AMD cards with 6GB and more were fine; the 6600 and 7900XTX again used more VRAM than the rest, though, except for the A750. It only used 6.4GB of VRAM in Plague Tale, so that might be an issue with the architecture rather than not enough VRAM. In Uncharted, though, it used 7.55GB, so VRAM was definitely the issue there.
 
And that brings up a few criticisms I have with the video:
  • I would argue the RX 6800 isn't necessarily in the same performance class as the RTX 3070, especially when you consider that the initial MSRP of the RX 6800 was about $80 higher than the RTX 3070's. The RX 6750 XT would've been a better card to test with.
  • I don't buy the conclusion that 16GB is the bare minimum either. 12GB would be a better number here, if that (though they did say a 12GB card would "age better").
  • If you want to talk about VRAM consumption alone, I wouldn't use two different cards to begin with, as there could be other things that affect the results. Instead, I would've taken the RX 6800 and found a way to tie up its VRAM in a predictable, repeatable way, then tested the performance loss at various points (see the sketch after this list). You can do this on an NVIDIA card via a CUDA app, and I can't see it being that hard to do the same on an AMD card.
  • There's also the question of whether the game itself was just a rushed job to get something out the door and more VRAM simply hides those deficits.
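To illustrate the idea, here's a rough sketch of one possible way to do it in Python with PyTorch. This is purely a hypothetical example (the library choice and the occupy_vram helper are my own assumptions, not anything from the video); a plain CUDA program that calls cudaMalloc and holds the allocation would work just as well.

```python
# Hypothetical sketch: grab a fixed chunk of VRAM and hold it so the game or
# benchmark only sees what's left. Assumes a PyTorch build with GPU support
# (CUDA for NVIDIA, ROCm for AMD); amounts and names are placeholders.
import torch

def occupy_vram(gib: float, device: str = "cuda") -> torch.Tensor:
    """Allocate roughly `gib` GiB on the GPU and keep a reference to it."""
    n_bytes = int(gib * 1024**3)
    # uint8 => one byte per element, so the allocation size is predictable.
    return torch.empty(n_bytes, dtype=torch.uint8, device=device)

if __name__ == "__main__":
    held = occupy_vram(4.0)  # e.g. turn a 16GB card into a ~12GB card
    print(f"Holding {held.numel() / 1024**3:.1f} GiB of VRAM; run the benchmark now.")
    input("Press Enter to release the VRAM and exit...")
```

Repeat the benchmark while varying the amount held and you get a rough curve of performance loss versus available VRAM on a single card.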
I also don't really care for Hardware Unboxed's content because it always feels like they have an AMD bias. Heck, I remember in a video card review at TechSpot (their parent site) they said they don't care about ray tracing, so it wasn't going to be tested (though unfortunately I can't remember the exact review).
 

AMD bias? They absolutely destroyed AMD in their latest video, conclusively naming Nvidia king of upscaler tech.

Anyway, the OP said they want to go the Intel route, so I'm not even looking at the video. With Intel you get either 8 or 16GB, and 8GB should not even be up for consideration.
 
There's a reason a lot of us call it 'AMD Unboxed'.
 
I don't think that the RTX 3060 is a good idea. The RX 6700 XT is a massive 27% faster, also has 12GB of VRAM and costs about the same:
MSI GeForce RTX 3060 Ventus 2X 12GB: $360 (-$20 MIR)
ASRock Radeon RX 6700 XT Challenger D 12GB: $350

There's no reason to get the RTX 3060 in this situation.
I kinda discounted ATI cards because I heard that Nvidia generally did better in ray-tracing, which is something I've been looking forward to trying. But you've given me something to consider - I'll do some more comparisons between those cards.
 
On average, yes, nVidia cards do better in RT but the thing is, unless you have a halo-level GeForce card like the RTX 3080 Ti (or better), RT is still pretty bad on them. See, the RTX 3060 is technically better at RT than the RX 6700 XT but it's still terrible at it. We're talking 25FPS instead of 20 and who really cares about that? I guarantee you that if you purchased an RTX 3060, you wouldn't be using RT.

The reason I said you'd need at least an RTX 3090 is because, while the RTX 3070-RTX 3080 are far better at RT because of their more potent GPUs, RT uses a lot of VRAM. Already we're seeing RT crippling the 3060 Ti - 3070 Ti cards at 1080p because of their tiny 8GB frame buffers. Trying RT on an RTX 3080 at 1440p can be sketchy too because it only has 10GB.

Basically, nVidia has set up a no-win scenario for anything below the RTX 3080 Ti. This is because you need at least 12GB of VRAM to properly implement it and the only card with 12GB (RTX 3060) doesn't have a potent enough GPU. Meanwhile, the more powerful cards don't have enough VRAM so it's the ultimate catch-22.

While it's true that nVidia has an RT advantage, that doesn't mean that Radeons can't do RT, they just typically see a much larger drop in FPS when it's enabled. However, as you'll see, that 8GB of VRAM is absolutely crippling to the RTX 3070. Here are some Hogwarts: Legacy RT performance tables from Techspot:

1440p:
[chart: RT_1440p-color-p.webp]

At 1440p, the RX 6800 XT isn't by any means great, with an average of 39 FPS and a 1% low of 33, but it absolutely kills the RTX 3070's average of 17 FPS with a 1% low of 5. The RTX 3070 is a complete stuttery mess. You'll notice that the RTX 3060, with its 12GB of VRAM, is far better. Not better than the RX 6800 XT, but far better than any card with only 8GB. Even the 10GB RTX 3080 is showing signs of struggling, with its 1% low and average FPS much farther apart than on the 16GB Radeons.

1080p:
[chart: RT_1080p-color-p.webp]

Even at 1080p, the 8GB cards are still stuttery as hell with RT on, but maybe somewhat playable (although I wouldn't consider it playable). Again, the RX 6800 XT just destroys every GeForce card below the RTX 3080.

In normal gaming performance, the RX 6800 XT manages to keep its 1% minimum above 60FPS while the RTX 3070 is unable to do so:
[chart: Ultra_1440p-color-p.webp]

Despite all this, the RX 6800 XT is only $10 more expensive:
Gigabyte GeForce RTX 3070 Gaming OC 8GB - $530
ASRock Radeon RX 6800 XT Phantom Gaming 16GB - $540

If you decided you wanted to aim higher, things get even worse for GeForce as the RTX 3070 Ti is only $10 less expensive than the RX 6950 XT ($30 less when the sale is over, which is still ridiculous):
PNY Verto GeForce RTX 3070 Ti 8GB - $600
ASRock Radeon RX 6950 XT 16GB - $630 (On sale for $610)

I don't ask people to "take my word for it", I just pass on information that I've come across. I would recommend not automatically thinking of GeForce when considering a new video card because there is always another option. I think that it's a better option (I've owned ten straight Radeons) because you get much more for the same money spent.

Ultimately, the choice is yours but I think that you know which direction to take. 😉
 
Is that still the old graph from long before they fixed the memory leaks of both VRAM and RAM? Because it sure looks like it, especially considering that no other reviewer gets such terrible results on the 3070 and 3070Ti, and the 3070Ti literally gets the results listed there for 1080p at 1440p for me. Which I told you before, btw, with a link to the article where it was tested, even. And that was also from before the patch fixing the memory leakage... a fix I pointed out to you as well, over a month ago.

I have to admit, I find this post quite ironic. On one side, you act like yours is the only truth and people who run Nvidia cards are "intellectually lazy" and "sheep" for doing so.
https://forums.tomshardware.com/thr...head-of-rtx-4070-launch.3803366/post-22978857

On the other hand, even when faced with evidence several times, you yourself seem unable to amend your stance, even though you claim otherwise. I will present the evidence again here, together with new evidence. I don't really think it will reach you at this point, but hopefully it will reach other people here.

https://forums.tomshardware.com/threads/fps-drops-dramatically-rtx-3070.3796690/post-22936153
To pull the most important result out for you: 1440p, RTX on, DLSS off, 38 FPS average, not 17 FPS, in my own testing of the matter.

Here I quoted the patch notes where they fixed the VRAM use, after the test I did:
https://forums.tomshardware.com/threads/fps-drops-dramatically-rtx-3070.3796690/post-22958830

The third link is from a test of the 4070, sure, but it does have the 3070 and 3070Ti in it and also shows vastly better performance. There is also my time-honored table showing the 6950XT using 15GB of VRAM in certain games while the 4070Ti uses under 11GB, showing that AMD often does need more VRAM than Nvidia, another thing you tend to ignore.

There is also the fact that you completely ignore all evidence that Nvidia cards are more power efficient this generation, which is very important to pretty much the entire western world outside the US. It's great that electricity is cheap for you, but that is far from the truth for many, many others, and ignoring it is, well, intellectually lazy, I guess? AMD is cheaper to buy, yes, especially the old cards. But they often use so much more power that it effectively negates that advantage. Once again I ran the numbers today, and even at a rather moderate electricity price of 0.33€ per kWh, which is low where I live, those cards only take about 2 years to effectively cost you more than a new Nvidia card. Many people use GPUs for longer than that.
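To make the arithmetic behind that rough 2-year figure concrete, here's a tiny break-even sketch in Python. Every input except the 0.33€/kWh tariff is a made-up placeholder; plug in the actual price gap and measured power draw of the cards you're comparing.

```python
# Back-of-the-envelope break-even estimate for "cheaper to buy, dearer to run".
# All values are hypothetical placeholders except the 0.33 EUR/kWh tariff
# mentioned above; substitute real card prices and measured power draw.
PRICE_DIFF_EUR = 150.0  # purchase saving on the more power-hungry card (hypothetical)
EXTRA_WATTS = 150.0     # extra power draw while gaming, in watts (hypothetical)
HOURS_PER_DAY = 4.0     # average gaming hours per day (hypothetical)
EUR_PER_KWH = 0.33      # electricity tariff from the post

extra_kwh_per_year = EXTRA_WATTS / 1000.0 * HOURS_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * EUR_PER_KWH
breakeven_years = PRICE_DIFF_EUR / extra_cost_per_year

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year (~{extra_cost_per_year:.0f} EUR/year)")
print(f"Purchase saving is used up after ~{breakeven_years:.1f} years")
```

With those placeholder numbers the purchase saving evaporates in roughly two years; cheaper electricity or fewer gaming hours stretches that out, while a bigger wattage gap shortens it.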

In summary: the reason I find this very ironic is that you call others intellectually lazy and outright brand wh***s, yet I have only ever seen you throw around a single source, one with a known bias against Nvidia. That makes you seem quite set in your ways as well.
 
DerBauer also concluded (two videos ago?) that 12GB wasn't sufficient and shouldn't be up for consideration.

I have Nvidia and AMD cards. I prefer Nvidia solely because DLSS outperforms FSR and is more widely available (though more games are supporting FSR these days).

Just don't use RT and either Nvidia or AMD will do fine. Or go in blind and hope Intel's A770 16GB performs well enough for your needs.