News Nvidia GeForce RTX 3080 Deals: Where to Find Them

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
Except for the pre-builts, this is my answer to all the other so-called deals on the RTX 3080: ROFL :ROFLMAO:

Also, even pre-builts are 2x MSRP or more in many places, including the EU.
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Gave up on finding NV GPUs for my Intel rigs. Going with Intel Arc for those. Being able to combine IGP compute units with dGPU will be grand.
 
kinney said:
Gave up on finding NV GPUs for my Intel rigs. Going with Intel Arc for those. Being able to combine IGP compute units with dGPU will be grand.
Nope, not gonna happen. DG2 is going to be about an order of magnitude faster than the integrated Xe Graphics (UHD 770 or whatever), and completely different feature sets. While it might be possible to have the IGP do something lightweight (XeSS perhaps, or video encoding), anything graphically intensive will not work well trying to balance IGP and DGPU. AMD has basically had an opportunity to make dual graphics work well for over a decade, and it never really happened. The best-case result was slow IGP plus slow dGPU = 50-80% improvement in games with driver support for the tech. The problem is that a single dedicated graphics card could easily beat the dual graphics combination.
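The dual-graphics math above can be sketched with some hypothetical numbers (my own illustration, not benchmark data): even when iGPU+dGPU hits the historical 50-80% uplift range in a supported title, a single faster card wins outright.

```python
# Illustration of the dual-graphics argument above.
# All numbers are hypothetical, chosen only to show the shape of the math.

igpu_fps = 30           # slow integrated GPU on its own
dgpu_fps = 45           # slow discrete GPU on its own

# Ideal pooling would simply add the two throughputs...
ideal_fps = igpu_fps + dgpu_fps

# ...but real dual-graphics setups historically topped out at roughly
# a 50-80% gain over the dGPU alone, and only in titles with driver support.
realistic_fps = dgpu_fps * 1.65      # midpoint of the 50-80% range

# Meanwhile, a single mid-range card can beat the combination outright.
single_card_fps = 80

print(f"ideal combined:     {ideal_fps} fps")
print(f"realistic combined: {realistic_fps:.1f} fps")
print(f"single faster card: {single_card_fps} fps")
```

Swap in whatever numbers you like; as long as the single card is faster than ~1.8x the slow dGPU, the combination loses.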
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Nope, not gonna happen. DG2 is going to be about an order of magnitude faster than the integrated Xe Graphics (UHD 770 or whatever), and completely different feature sets. While it might be possible to have the IGP do something lightweight (XeSS perhaps, or video encoding), anything graphically intensive will not work well trying to balance IGP and DGPU. AMD has basically had an opportunity to make dual graphics work well for over a decade, and it never really happened. The best-case result was slow IGP plus slow dGPU = 50-80% improvement in games with driver support for the tech. The problem is that a single dedicated graphics card could easily beat the dual graphics combination.

Care to make a friendly financial wager on your view that IGP compute units won't be combined with a dGPU via Arc on Intel platforms? Reach out via PM; I have $100 US on it.

We'll know soon which of us is right. I have an 11900K and a 12900K (the latter with no DDR5) and intend to put Arc into my 11900K rig. It's a 720p-capable iGPU, which is significant power. I was quoting Intel's CEO, BTW: Intel CEO vows to challenge Nvidia, market is "hungry" for alternative GPUs | TechSpot

Pat Gelsinger said:
And even better than that, we're going to make integrated and discrete work together. So if you have three [execution units] worth in the integrated [GPU], then you have 10 EUs worth in the discrete, we're going to give you 13 EUs worth, and you're only going to buy 10 EUs worth in the discrete GPU, and you're going to qualify one product that [works] seamlessly between those two. Well, that’s pretty differentiated. And that's just one example.

So for me to be wrong, you're saying the CEO of Intel, an engineer, is a fool who is "not gonna make it happen". You guys can laugh it up all you want, but I'll be taking your money soon. Comparing AMD's efforts to Intel's is your second mistake here. It's generous to even mention those two companies in the same sentence, let alone look to AMD for what's possible or what will happen at Intel.
 
kinney said:
Care to make a friendly financial wager on your view that IGP compute units won't be combined with a dGPU via Arc on Intel platforms? Reach out via PM; I have $100 US on it.

We'll know soon which of us is right. I have an 11900K and a 12900K (the latter with no DDR5) and intend to put Arc into my 11900K rig. It's a 720p-capable iGPU, which is significant power. I was quoting Intel's CEO, BTW: Intel CEO vows to challenge Nvidia, market is "hungry" for alternative GPUs | TechSpot

So for me to be wrong, you're saying the CEO of Intel, an engineer, is a fool who is "not gonna make it happen". You guys can laugh it up all you want, but I'll be taking your money soon. Comparing AMD's efforts to Intel's is your second mistake here. It's generous to even mention those two companies in the same sentence, let alone look to AMD for what's possible or what will happen at Intel.
I'm not saying he's a fool; I'm saying asymmetrical GPUs configured in any way similar to CrossFire or SLI have never worked out well in practice. There are a few edge cases where you get "perfect" scaling, but most of the time drivers and other stuff get in the way and you end up with a worse overall experience, with microstutter and framerate inconsistencies usually being the best case. As I note, doing something like XeSS or Quick Sync or some other workload on the secondary iGPU is feasible and might even be interesting. Doing actual real-time 3D graphics rendering where you're trying to split the workload 80/20 or 90/10 ends up being bad, because every frame differs.

I would love for Intel to somehow prove AMD and Nvidia wrong. But to be clear, doing compute on both iGPU and dGPU or any other non-3D-gaming workload does not count. EMIB with dual GPUs to get the experience of a single large GPU might work, but that won't be between iGPU and dGPU. Anyway, is that $100 for Intel to just make anything work between iGPU and dGPU? Because if so, yes, I think that will probably happen in some form. But if it's $100 to get universal gaming support with current UHD 770 and Arc GPUs with split-frame rendering balancing the workload and a resulting performance boost that doesn't have a bunch of potential negative side-effects? That is not going to happen and I'd take your bet. XeSS seems the most likely candidate, or some other form of post processing.
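As a rough sketch of why a static split-frame balance misbehaves (a hypothetical model with made-up numbers, not anything Intel has shipped): the frame only completes when both GPUs finish their slice, so whichever partner runs long on a given frame sets that frame's time.

```python
import random

# Hypothetical 90/10 split-frame model. The split exactly matches the
# nominal throughput ratio, i.e. a "perfectly" balanced static split.
IGPU_SHARE, DGPU_SHARE = 0.10, 0.90   # fraction of each frame rendered
IGPU_SPEED, DGPU_SPEED = 0.10, 0.90   # relative throughput of each GPU
SYNC_MS = 1.0                          # per-frame compositing/sync cost
TARGET_MS = 16.7                       # nominal 60 fps frame time

random.seed(42)
frame_times = []
for _ in range(100):
    # Every frame differs: assume the iGPU's slice load swings more
    # (it shares power and memory bandwidth with the CPU).
    igpu_ms = TARGET_MS * (IGPU_SHARE / IGPU_SPEED) * random.uniform(0.90, 1.30)
    dgpu_ms = TARGET_MS * (DGPU_SHARE / DGPU_SPEED) * random.uniform(0.95, 1.05)
    # The frame is done only when BOTH slices are done.
    frame_times.append(max(igpu_ms, dgpu_ms) + SYNC_MS)

avg = sum(frame_times) / len(frame_times)
print(f"avg: {avg:.1f} ms, worst: {max(frame_times):.1f} ms, best: {min(frame_times):.1f} ms")
```

Whenever the iGPU slice runs long, the dGPU sits idle waiting on it; the spread between best and worst frame times is exactly the microstutter described above, even though the static split was "correct" on average.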
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Not sure about all the additional qualifications for a simple bet, but regardless, a top-of-the-line Arc will have 512 EUs and an 11900K has 32 EUs. That's potentially a ~6% performance boost. Being on the CPU is an advantage over SLI, so I don't know if it's comparable. I'd definitely never use AMD as my example of what's possible with GPUs, though. I pretty much consider this a fresh start with no precedent, given it's AMD as our precursor for dGPU+iGPU interaction. They can barely get anything working at all. I don't like wasted silicon, and I welcome Intel's efforts. We'll see.
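A quick sanity check on the EU ratio (assuming the 32-EU iGPU in the 11900K and a 512-EU top Arc part, as cited above):

```python
# Back-of-envelope check on the EU ratio mentioned above.
igpu_eus = 32     # UHD graphics in the 11900K
dgpu_eus = 512    # rumored top-end Arc part

# Best case: the iGPU's EUs pool perfectly on top of the dGPU's.
uplift = igpu_eus / dgpu_eus
print(f"theoretical best-case uplift: {uplift:.2%}")   # ~6%
```

And that's the ceiling with perfect pooling, before any sync or driver overhead eats into it.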
 
kinney said:
Not sure about all the additional qualifications for a simple bet, but regardless, a top-of-the-line Arc will have 512 EUs and an 11900K has 32 EUs. That's potentially a ~6% performance boost. Being on the CPU is an advantage over SLI, so I don't know if it's comparable. I'd definitely never use AMD as my example of what's possible with GPUs, though. I pretty much consider this a fresh start with no precedent, given it's AMD as our precursor for dGPU+iGPU interaction. They can barely get anything working at all. I don't like wasted silicon, and I welcome Intel's efforts. We'll see.
Look, Arc will have the second-generation Xe architecture; UHD 770 has first-gen. That right there is enough to create a host of issues for anything more than stuff like post-processing or XeSS. Intel can try to do more than that, but at most it will end up very limited in both benefit and adoption among games.
 
kinney said:
Care to make a friendly financial wager on your view that IGP compute units won't be combined with a dGPU via Arc on Intel platforms? Reach out via PM; I have $100 US on it.

We'll know soon which of us is right. I have an 11900K and a 12900K (the latter with no DDR5) and intend to put Arc into my 11900K rig. It's a 720p-capable iGPU, which is significant power. I was quoting Intel's CEO, BTW: Intel CEO vows to challenge Nvidia, market is "hungry" for alternative GPUs | TechSpot

So for me to be wrong, you're saying the CEO of Intel, an engineer, is a fool who is "not gonna make it happen". You guys can laugh it up all you want, but I'll be taking your money soon. Comparing AMD's efforts to Intel's is your second mistake here. It's generous to even mention those two companies in the same sentence, let alone look to AMD for what's possible or what will happen at Intel.
The type of dissimilar-GPU performance scaling you're referring to is a pipe dream.
Edge cases in datacenters, with very specific data being processed, may eventually work, but in gaming? Not gonna happen for a long while, if ever.
 
