News Nvidia Ada Lovelace and GeForce RTX 40-Series: Everything We Know

One nitpick with this way of phrasing: "That means the big Infinity Cache gave AMD a 50% boost to effective bandwidth".

The cache on the GPU doesn't give the card higher bandwidth, much like AMD's 3D V-Cache doesn't magically give DDR4 more bandwidth. I know what the implied point is, but I don't think it should be explained that way at all. Avoiding trips over the GDDR/DDR bus to fetch data is not the same as increasing its effective bandwidth. Saturate that cache and you're back to using the slow lane. On initial load, you still use the slow lane. Etc...

Other than that, thanks for the information. I do not look forward to 600W GPUs. Ugh.

Regards.
 

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
I reckon a 1000W GPU isn't that far away...

Not a good thing for power consumption to keep going up when people are all talking about climate change and going green.
 
  • Like
Reactions: PEnns

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
Other than that, thanks for the information. I do not look forward to 600W GPUs. Ugh.
Unless you're shopping for a $2000+ GPU, you're not going to have to worry about 600W any time soon. These new flagships are going to be the equivalent of SLI setups from years ago, minus the headaches of needing SLI profiles for proper performance. You'll only need one physical slot, but the cooler is going to take up four, like an old-school dual-slot-card SLI setup.
 
  • Like
Reactions: Sluggotg

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
I have been using a 1000W PSU for many years, but it's getting really long in the tooth, so I purchased a new 1200W unit that's waiting to be installed one of these days. I don't really like the idea of needing so much power, but I remember the days of the 2- and 4-card SLI setups I used to run, and that was excessive. Now a single card can run circles around all that without the driver and game compatibility issues, so it is better.
 
  • Like
Reactions: Tom Sunday

DougMcC

Commendable
Sep 16, 2021
115
79
1,660
What are GPUs going to do in the next (50-series) generation? If power increased this much again, we'd be bumping up against the maximum wattage for a North American wall outlet.
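For reference, here's the rough ceiling on a standard 15 A / 120 V North American circuit; the 80% figure below is the usual continuous-load derating rule of thumb, so treat the numbers as ballpark:

```python
# Rough power budget for a standard North American wall outlet.
volts, amps = 120, 15
peak_watts = volts * amps               # 1800 W absolute circuit maximum
continuous_watts = 0.8 * peak_watts     # ~1440 W sustained (typical 80% derating)
print(peak_watts, continuous_watts)     # 1800 1440.0

# A 600 W GPU plus CPU, drives, and monitor already eats a big chunk of
# that budget, before even counting PSU efficiency losses.
```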
 
What are GPUs going to do in the next (50-series) generation? If power increased this much again, we'd be bumping up against the maximum wattage for a North American wall outlet.
Until the current generation, power had been decreasing slightly or staying about the same for several generations. It seems that after all other consumer electronics have upped their game in decreasing their products' power requirements, the GPU industry has gone the other way.
 
  • Like
Reactions: RodroX
One nitpick with this way of phrasing: "That means the big Infinity Cache gave AMD a 50% boost to effective bandwidth".

The cache on the GPU doesn't give the card higher bandwidth, much like AMD's 3D V-Cache doesn't magically give DDR4 more bandwidth. I know what the implied point is, but I don't think it should be explained that way at all. Avoiding trips over the GDDR/DDR bus to fetch data is not the same as increasing its effective bandwidth. Saturate that cache and you're back to using the slow lane. On initial load, you still use the slow lane. Etc...

Other than that, thanks for the information. I do not look forward to 600W GPUs. Ugh.

Regards.
Cache hits reduce trips to GDDR6/GDDR6X memory, which means you get the equivalent of more bandwidth. That's what "effective bandwidth" means. Or put another way, AMD's large L3 cache had a hit rate of something like 50%, which means roughly half of the memory accesses that formerly went to the GDDR6 didn't need to go there. AMD even said "effective bandwidth" in some of its RDNA 2 presentations, so if it's good enough for AMD, I figure it's good enough for us.

It's the same as saying 16Gbps for GDDR6 memory speeds, with an "effective clock" of 8GHz. Technically, the base GDDR6 clock is actually 2GHz. It then sends eight bits per clock (quad-pumped and DDR), which gives us 16Gbps. That's the data rate, but the clock speed is nowhere near 8GHz. And yet, that's what most GPU utilities will report as the memory speed.
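For illustration, here's the back-of-the-napkin arithmetic behind both of those points; the 256-bit bus width and the hit rate below are just example values, not figures for any specific card:

```python
# Raw GDDR6 bandwidth: per-pin data rate times bus width.
base_clock_ghz = 2.0        # actual GDDR6 base clock
transfers_per_clock = 8     # quad-pumped and DDR
bus_width_bits = 256        # example bus width

data_rate_gbps = base_clock_ghz * transfers_per_clock    # 16 Gbps per pin
raw_bandwidth_gbs = data_rate_gbps * bus_width_bits / 8  # 512 GB/s

# With a large on-die cache, only the misses go out to GDDR6.
hit_rate = 0.5                                           # example value
dram_traffic_fraction = 1.0 - hit_rate                   # 0.5

print(data_rate_gbps, raw_bandwidth_gbs, dram_traffic_fraction)
# 16.0 512.0 0.5 -> half the accesses never touch the GDDR6 bus,
# which is where the "effective bandwidth" framing comes from.
```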
 
Cache hits reduce trips to GDDR6/GDDR6X memory, which means you get the equivalent of more bandwidth. That's what "effective bandwidth" means. Or put another way, AMD's large L3 cache had a hit rate of something like 50%, which means roughly half of the memory accesses that formerly went to the GDDR6 didn't need to go there. AMD even said "effective bandwidth" in some of its RDNA 2 presentations, so if it's good enough for AMD, I figure it's good enough for us.

It's the same as saying 16Gbps for GDDR6 memory speeds, with an "effective clock" of 8GHz. Technically, the base GDDR6 clock is actually 2GHz. It then sends eight bits per clock (quad-pumped and DDR), which gives us 16Gbps. That's the data rate, but the clock speed is nowhere near 8GHz. And yet, that's what most GPU utilities will report as the memory speed.
I know how it works, but it's just not factually correct to portray it like that, just like MT/s vs. MHz when reporting RAM speeds. It is misleading, that is all. In that light, if you have a big enough cache, do you then have infinite "equivalent" VRAM bandwidth, since you'll never use it? That sounds iffy, even if you can draw that parallel.

As I said, it's just a nitpick.

Regards.
 

blppt

Distinguished
Jun 6, 2008
569
89
19,060
I will be happy snagging a basic RTX 3090 (dreaming of it costing me around $700 for a GPU generation almost two years old) perhaps next year in January or so? Then a 4K TV as more money becomes available. The RTX 40-series is totally crazy in my view; how much power can one ever need? Besides, you are right, it will also not be cheap.

The problem with the 3090 is that even now it's not a great performer with its "killer feature" (that being ray tracing). Better than the AMD RX 6900 XT, but AAA games that implement ray tracing, like CP2077 and DL2, pretty much require you to use DLSS to get good framerates at 4K with RT on.

Honestly, if you're rocking a 2xxx-series Nvidia right now, I wouldn't even bother with the 30-series. It's just not that much better.
 
The problem with the 3090 is that even now it's not a great performer with its "killer feature" (that being ray tracing). Better than the AMD RX 6900 XT, but AAA games that implement ray tracing, like CP2077 and DL2, pretty much require you to use DLSS to get good framerates at 4K with RT on.

Honestly, if you're rocking a 2xxx-series Nvidia right now, I wouldn't even bother with the 30-series. It's just not that much better.
If he means getting "RTX 3090-like performance" rather than an actual RTX 3090, I think waiting for RTX 4070 or 4080 or whatever would be a good plan. The 3070 basically matches or exceeds the performance of the 2080 Ti, at theoretically half the cost. I hope we'll get a card for around $700 that will be faster in most games than an RTX 3090 with Ada. The card might only have 10–12GB of memory, but that should be sufficient.
 
Wow, a 600-watt consumer-grade GPU on the horizon. This is nuts.

Got to wonder: if the cooler on some RTX 3090s can take up to 3.5 slots, will this 600W monster require a hybrid design or a 5-slot cooler? And how hot will the VRAM run? ..... Crazy times.
 
  • Like
Reactions: PEnns and Sluggotg
Wow, a 600-watt consumer-grade GPU on the horizon. This is nuts.

Got to wonder: if the cooler on some RTX 3090s can take up to 3.5 slots, will this 600W monster require a hybrid design or a 5-slot cooler? And how hot will the VRAM run? ..... Crazy times.
The "fix" for the hot running GDDR6X is simply to get them covered by the main vapor chamber that cools the GPU. The memory isn't consuming anywhere near as much power (and thus generating as much heat) as the GPU itself. But most cards only have thermal pads and basically heatspreaders on the memory. The RTX 3090 Ti cards seem to have been tweaked to cool the memory better. Asus has a larger heatsink bracket for just the memory, that has relatively large fins. It's still not going to be as effective as the heatpipes and vapor chambers used on the GPU itself, but it's better than nothing. The RTX 3090 for example had 12 chips on the back of the cards that basically just pressed up against the backplate of the card. There was no active cooling, and while the memory definitely got hot, a wraparound heatsink would have done wonders.

Basically, 3-4 slots with a large heatsink and multiple fans, plus a vapor chamber, should be capable of handling 600W of heat. But skip out on the vapor chamber and it will be problematic.
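To put rough numbers on why the memory is the smaller slice of the thermal load, here's a ballpark split; the per-chip wattage is an assumption (roughly 2-3 W per GDDR6X chip is the commonly cited range), not a measured figure:

```python
# Ballpark split of board power between the memory and everything else,
# using RTX 3090-style numbers. The per-chip wattage is an assumption.
gddr6x_chips = 24                  # 12 front + 12 back on the RTX 3090
watts_per_chip = 2.5               # assumed ~2-3 W per chip
board_power_watts = 350            # RTX 3090 total board power

memory_watts = gddr6x_chips * watts_per_chip               # ~60 W
everything_else_watts = board_power_watts - memory_watts   # ~290 W for GPU, VRMs, fans

print(memory_watts, everything_else_watts)   # 60.0 290.0
```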
 
The "fix" for the hot running GDDR6X is simply to get them covered by the main vapor chamber that cools the GPU. The memory isn't consuming anywhere near as much power (and thus generating as much heat) as the GPU itself. But most cards only have thermal pads and basically heatspreaders on the memory. The RTX 3090 Ti cards seem to have been tweaked to cool the memory better. Asus has a larger heatsink bracket for just the memory, that has relatively large fins. It's still not going to be as effective as the heatpipes and vapor chambers used on the GPU itself, but it's better than nothing. The RTX 3090 for example had 12 chips on the back of the cards that basically just pressed up against the backplate of the card. There was no active cooling, and while the memory definitely got hot, a wraparound heatsink would have done wonders.

Basically, 3-4 slots with a large heatsink and multiple fans, plus a vapor chamber, should be capable of handling 600W of heat. But skip out on the vapor chamber and it will be problematic.

Yeah, I know, too bad it's only now, many months after the launch, that some makers are putting decent cooling on the VRAM.
 
$350 for an RTX 4050? I'll pass; better off getting an RX 6600 for ~$120 less if that is really going to be Nvidia's best offer at entry-level.
As noted in the text elsewhere, all of the stuff on RTX 4070/4060/4050 is pretty speculative. No one has hard details yet, and ultimately it will depend on performance and pricing. I’m just figuring, based on what’s happening right now, that most of the Ada Lovelace GPUs will cost more than their Ampere equivalents. I’m still a bit baffled that RTX 3050 has been selling for well over $300 since its introduction.
 

Math Geek

Titan
Ambassador
I’m still a bit baffled that RTX 3050 has been selling for well over $300 since its introduction.

honestly it's not that hard to believe. so many people are so brand biased they don't even consider the other options. intel and nvidia seem to get the bulk of this blind following so they get to charge more for the "privilege". if everyone simply went with bang for the buck and bought accordingly, i'm sure the pricing would be massively different.
 
  • Like
Reactions: PEnns and King_V
honestly it's not that hard to believe. so many people are so brand biased they don't even consider the other options. intel and nvidia seem to get the bulk of this blind following so they get to charge more for the "privilege". if everyone simply went with bang for the buck and bought accordingly, i'm sure the pricing would be massively different.
RTX 2060 has been readily available for $250 or less for the past two months. It’s still faster than RTX 3050. That’s my real point.
 

Math Geek

Titan
Ambassador
oh yah i get it. it really does not make any sense for it to exist much less at the current price.

i guess the 3000 series being bigger than the 2000 series makes it better?? not really sure what the logic is there :)

the 2060 at $250 or the 6600 at similar or less makes much better sense at this time.
 
  • Like
Reactions: PEnns

InvalidError

Titan
Moderator
I’m still a bit baffled that RTX 3050 has been selling for well over $300 since its introduction.
Are they actually selling, though? It could very well be that they are remaining on store shelves at inflated prices simply because AIBs got screwed over on parts costs at the time the designs were being readied for manufacturing, and they refuse to take the loss to clear inventory... at least for now.

Looking at the Steam survey, though, I am shocked that the RTX 3050 has 6X the RX 6600's Steam user share, or 3X if you lump the XT and non-XT together.
 

lmcnabney

Prominent
Aug 5, 2022
192
190
760
The top 25 on Newegg are all Nvidia cards, and the only 3050/Ti is all the way down at #17. The highest-ranked 3050 on Amazon is at #23. A 3090 Ti is surprisingly in 1st, with a second one in 4th.

That looks like miner numbers.

The hidden message in that data suggests that gamers aren't buying much (and 2022 PC sales numbers back it up). That is a very bad trend to be in. The miners aren't buying now. It will be interesting to see what the Q4 sales data shows.