Though they are hard to come by, there are still RTX 3080 deals to be found this holiday season.
Nvidia GeForce RTX 3080 Deals: Where to Find Them : Read more
Gave up on finding NV GPUs for my Intel rigs. Going with Intel Arc for those. Being able to combine IGP compute units with dGPU will be grand.
More like a moped. LOL. Enlisting a full-sized truck and a Smart car to pull your boat along the highway together isn't going to end well.
Nope, not gonna happen. DG2 is going to be about an order of magnitude faster than the integrated Xe Graphics (UHD 770 or whatever), and completely different feature sets. While it might be possible to have the IGP do something lightweight (XeSS perhaps, or video encoding), anything graphically intensive will not work well trying to balance IGP and DGPU. AMD has basically had an opportunity to make dual graphics work well for over a decade, and it never really happened. The best-case result was slow IGP plus slow dGPU = 50-80% improvement in games with driver support for the tech. The problem is that a single dedicated graphics card could easily beat the dual graphics combination.
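A rough way to see why asymmetric splits are fragile (a back-of-the-envelope sketch with made-up numbers, not anything Intel or AMD has published): if both GPUs render parts of the same frame under a fixed work split, the frame finishes only when the slower partner does, so even a small mismatch between the split ratio and the real performance ratio wipes out the gain or makes things worse.

```python
# Toy model: frame time for a dGPU+iGPU pair sharing one frame's work.
# All speeds and splits are illustrative assumptions, not measured figures.

def frame_time_ms(split_dgpu, dgpu_speed, igp_speed, total_work=1.0):
    """Both GPUs work in parallel; the frame is done when the slower one finishes."""
    t_dgpu = total_work * split_dgpu / dgpu_speed
    t_igp = total_work * (1.0 - split_dgpu) / igp_speed
    return max(t_dgpu, t_igp)

# Suppose the dGPU is 10x the iGPU (the "order of magnitude" above).
solo = frame_time_ms(1.0, 10.0, 1.0)      # dGPU alone
ideal = frame_time_ms(10 / 11, 10.0, 1.0) # perfect 10:1 split, ~9% faster at best
off = frame_time_ms(0.85, 10.0, 1.0)      # split guessed slightly wrong: slower than solo

print(solo, ideal, off)
```

Even the perfect split only buys ~9% here, and an 85/15 guess is already slower than the dGPU alone, which is roughly the microstutter story with SLI-style pairings.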
Pat Gelsinger said: "And even better than that, we're going to make integrated and discrete work together. So if you have three [execution units] worth in the integrated [GPU], then you have 10 EUs worth in the discrete, we're going to give you 13 EUs worth, and you're only going to buy 10 EUs worth in the discrete GPU, and you're going to qualify one product that [works] seamlessly between those two." Well, that's pretty differentiated. And that's just one example.
I'm not saying he's a fool; I'm saying asymmetrical GPUs configured in any way similar to CrossFire or SLI have never worked out well in practice. There are a few edge cases where you get "perfect" scaling, but most of the time drivers and other stuff get in the way and you end up with a worse overall experience, with microstutter and framerate inconsistencies usually being the best case. As I noted, doing something like XeSS, Quick Sync, or some other secondary workload on the iGPU is feasible and might even be interesting. Doing actual real-time 3D graphics rendering where you're trying to split the workload 80/20 or 90/10 ends up being bad, because every frame differs.

Care to make a friendly wager on your view that iGPU compute units won't be combined with the dGPU on Intel Arc platforms? Reach out via PM; I have $100 US on it.
We'll know soon which of us is right. I have an 11900K and a 12900K (the latter with no DDR5) and intend to put Arc into my 11900K rig. Its iGPU is 720p-capable, which is significant power. I was quoting Intel's CEO, BTW. Intel CEO vows to challenge Nvidia, market is "hungry" for alternative GPUs | TechSpot
So for me to be wrong, you're saying the CEO of Intel, an engineer, is a fool and that it's "not gonna happen." You guys can laugh it up all you want, but I'll be taking your money soon. Comparing AMD's efforts to Intel's is your second mistake here. It's generous to even mention those two companies in the same sentence, let alone look to AMD for what's possible or what will happen at Intel.
Look, Arc will have the second-generation Xe architecture; the UHD 770 has first-gen. That right there is enough to create a host of issues for anything more than post-processing or XeSS. Intel can try to do more than that, but at most it will end up very limited in both benefit and adoption among games.

Not sure about all the additional qualifications for a simple bet, but regardless, a top-of-the-line Arc will have 512 EUs and an 11900K has 32 EUs. That's potentially a ~6% performance boost (32/512) at best. Being on the CPU is an advantage over SLI, so I don't know if it's comparable. I'd definitely never use AMD as my example of what's possible with GPUs, though. I pretty much consider this a fresh start with no precedent, given AMD is our only precursor for dGPU+iGPU interaction, and they can barely get anything working at all. I don't like wasted silicon, and I welcome Intel's efforts. We'll see.
The type of dissimilar-GPU performance scaling you're referring to is a pipe dream.