News Nvidia RTX 4090 Ti and Titan RTX Ada: Everything We Know

JDJJ

Commendable
Sep 23, 2022
12
3
1,515
Disappointing that they didn’t discuss the DisplayPort issue. The RTX 4090 is already able to saturate the bandwidth of its DP 1.4a interface. A more powerful card would be aimed at either gamers willing to pay for a top-tier experience, or professionals needing to work at high resolutions and in highly accurate (large) color spaces. Nvidia will need to upgrade to the current DP 2.1 standard for either card to make sense for either use case, and to justify the increased cost over the 4090 or the RTX 6000 Ada.

Also, in terms of cooling a 4090 Ti, which would primarily be aimed at gamers, it seems unlikely that anyone would actually air-cool these. If you are willing and able to splash out on a 4090 Ti at its nosebleed prices, you will probably be fine with the relatively low cost of cooling it with a custom water-cooling loop. This would give you a single-slot card without any concern over thermals. Ideally, Nvidia would release the 4090 Ti with a preinstalled water block. This would align the card with its most likely use case, and also reduce a lot of waste by not making massive coolers that nobody will use and that will end up in landfills. I know there will be idiot reporters who will freak out about the implications of releasing a card that requires additional parts to complete the cooling solution, but for a halo card, that really shouldn’t be a concern. Plus, it is reasonable to assume it would be far cheaper to make a water block than the monstrous air cooler such a high-TDP card would require. That might enable Nvidia to keep the price a bit lower, or increase margins, or both. Board partners could always choose to make an AIO solution for the even tinier niche of people with enough money to buy this halo card, but neither the brains, the time, nor the initiative to set up a custom loop.

Finally, air cooling it could prove controversial if many cases can’t handle all that heat being dumped into their internal space. Judging by case reviews from the likes of Gamers Nexus, enclosed cases with thermals good enough to handle exhausting that much hot air are very rare, perhaps non-existent. Nvidia could sidestep this potential “scandal” completely by just making it a water-cooled card.
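To put a rough number on "all that heat": here is a back-of-the-envelope airflow estimate. The ~600 W TDP and the 10 °C case-air temperature rise are assumptions for illustration, not Nvidia figures.

```python
def required_airflow_cfm(heat_w: float, delta_t_c: float) -> float:
    """Case airflow (CFM) needed to carry `heat_w` watts of GPU heat
    out of the case at a `delta_t_c` rise of case air over ambient,
    using Q = m_dot * cp * dT."""
    cp_air = 1005.0   # J/(kg*K), specific heat of air
    rho_air = 1.2     # kg/m^3, air density near room temperature
    m_dot = heat_w / (cp_air * delta_t_c)   # kg/s of air required
    m3_per_s = m_dot / rho_air              # volumetric flow rate
    return m3_per_s * 60 / 0.0283168        # m^3/min -> CFM

# Hypothetical 600 W card, tolerating a 10 C rise inside the case:
print(f"{required_airflow_cfm(600, 10):.0f} CFM")  # ~105 CFM
```

Roughly 105 CFM of *net* through-case airflow just for the GPU, before the CPU and other components add their share, which is why enclosed cases struggle and why exhausting the heat straight into a radiator is attractive.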

Last, a Founders Edition waterblock, taking styling cues from the shrouds on their FE air coolers, might look pretty sweet!
 
  • Like
Reactions: UnCertainty08

btmedic04

Distinguished
Mar 12, 2015
486
383
19,190
What Nvidia will launch will be highly dependent on what AMD does with a potential RX 7950 XTX. The thing is, the 7900 XTX isn't as competitive with the 4090 as the 6900 XT was with the 3090, and is positioned by AMD as a competitor for the 4080. A Titan-class GPU would only compete with RTX 6000 Ada parts and would cut into the profits associated with that class of GPU, and believe me, Jensen would much rather sell that silicon at $6,000 instead of $3,000. With that said, I don't see a 4090 Ti or Titan releasing this year at all. It will only come to be if AMD launches a 7950 XTX that offers comparable or greater performance to the current 4090.
 

_dawn_chorus_

Distinguished
Aug 30, 2017
563
56
19,090
Disappointing that they didn’t discuss the DisplayPort issue. The RTX 4090 is already able to saturate the bandwidth of its DP 1.4a interface. A more powerful card would be aimed at either gamers willing to pay for a top-tier experience, or professionals needing to work at high resolutions and in highly accurate (large) color spaces. Nvidia will need to upgrade to the current DP 2.1 standard for either card to make sense for either use case, and to justify the increased cost over the 4090 or the RTX 6000 Ada.

Also, in terms of cooling a 4090 Ti, which would primarily be aimed at gamers, it seems unlikely that anyone would actually air-cool these. If you are willing and able to splash out on a 4090 Ti at its nosebleed prices, you will probably be fine with the relatively low cost of cooling it with a custom water-cooling loop. This would give you a single-slot card without any concern over thermals. Ideally, Nvidia would release the 4090 Ti with a preinstalled water block. This would align the card with its most likely use case, and also reduce a lot of waste by not making massive coolers that nobody will use and that will end up in landfills. I know there will be idiot reporters who will freak out about the implications of releasing a card that requires additional parts to complete the cooling solution, but for a halo card, that really shouldn’t be a concern. Plus, it is reasonable to assume it would be far cheaper to make a water block than the monstrous air cooler such a high-TDP card would require. That might enable Nvidia to keep the price a bit lower, or increase margins, or both. Board partners could always choose to make an AIO solution for the even tinier niche of people with enough money to buy this halo card, but neither the brains, the time, nor the initiative to set up a custom loop.

Finally, air cooling it could prove controversial if many cases can’t handle all that heat being dumped into their internal space. Judging by case reviews from the likes of Gamers Nexus, enclosed cases with thermals good enough to handle exhausting that much hot air are very rare, perhaps non-existent. Nvidia could sidestep this potential “scandal” completely by just making it a water-cooled card.

Last, a Founders Edition waterblock, taking styling cues from the shrouds on their FE air coolers, might look pretty sweet!

Ackchyually...
The 4090 has fantastic cooling already. What makes you think a ~10% beefier version would necessitate being water cooled?
 
Last edited:

Warrior24_7

Distinguished
Nov 21, 2011
35
19
18,535
Rumors are swirling about an upcoming Nvidia GeForce RTX 4090 Ti, or potentially a new Titan RTX — maybe even both! Here's everything we know about the future uber-GPUs of the Ada Lovelace generation.

Nvidia RTX 4090 Ti and Titan RTX Ada: Everything We Know : Read more
What Nvidia will launch will be highly dependent on what AMD does with a potential RX 7950 XTX. The thing is, the 7900 XTX isn't as competitive with the 4090 as the 6900 XT was with the 3090, and is positioned by AMD as a competitor for the 4080. A Titan-class GPU would only compete with RTX 6000 Ada parts and would cut into the profits associated with that class of GPU, and believe me, Jensen would much rather sell that silicon at $6,000 instead of $3,000. With that said, I don't see a 4090 Ti or Titan releasing this year at all. It will only come to be if AMD launches a 7950 XTX that offers comparable or greater performance to the current 4090.

No it won’t. Nvidia doesn’t care about AMD. Nvidia commands over 90% market share! AMD actually LOST market share! That IS NOT competitive. AMD is not a true rival. Nvidia does what it wants, and charges what it wants, because it owns the market. This whole Nvidia-vs-AMD debate is mainly an AMD fanboy argument. AMD’s cards and drivers are horrible. AMD fanboys use Nvidia cards! Thinking that AMD dictates what Nvidia does is absolutely absurd.
 

jp7189

Distinguished
Feb 21, 2012
532
303
19,260
If they launch a 48GB card for under $3,000, it will fly off the shelves. VRAM is in high demand among small companies and small budgets tinkering with AI training. For bigger budgets this still won't compete with the A6000, because, ironically, with a 300W power cap you can cram 4x A6000s into a single chassis, which is better than 2x 600W cards for most work.
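A quick sketch of why the 48GB figure matters for AI training. The per-parameter byte counts below are rule-of-thumb assumptions for mixed-precision training with Adam (weights, gradients, fp32 master copy, and two optimizer moments), ignoring activations and framework overhead:

```python
def training_vram_gb(params_billions: float) -> float:
    """Approximate VRAM (GB) for weights + gradients + Adam optimizer
    state in mixed precision. Activations and overhead not included."""
    n = params_billions * 1e9
    bytes_per_param = (
        2 +   # fp16 weights
        2 +   # fp16 gradients
        4 +   # fp32 master weights
        8     # Adam moments (two fp32 values per parameter)
    )
    return n * bytes_per_param / 1024**3

for b in (1, 3, 7):
    print(f"{b}B params: ~{training_vram_gb(b):.0f} GB")
```

By this rough accounting, a ~3B-parameter model already eats most of 48GB before activations, and a 7B model doesn't fit at all, which is exactly the niche where a cheap high-VRAM card would sell.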
 
  • Like
Reactions: JarredWaltonGPU
Disappointing that they didn’t discuss the DisplayPort issue. The RTX 4090 is already able to saturate the bandwidth of its DP 1.4a interface. A more powerful card would be aimed at either gamers willing to pay for a top-tier experience, or professionals needing to work at high resolutions and in highly accurate (large) color spaces. Nvidia will need to upgrade to the current DP 2.1 standard for either card to make sense for either use case, and to justify the increased cost over the 4090 or the RTX 6000 Ada.
A lot of people are trying to make the DP1.4a limitation into more of a problem than it is. The facts are that, with Display Stream Compression, you can do 4K 240Hz with "visually lossless" results. I actually have a monitor, the Samsung Neo G8 32, that supports 4K 240Hz via DSC. In fact, the monitor explicitly does not support DP2.x or non-DSC operation at that resolution. So you'd need to find a monitor that explicitly supports DP2.1 for it to even matter. That's part of the issue.

The other part of the issue is perhaps even more telling. The Neo G8 32 works fine with the RTX 40-series, RTX 30-series, and RTX 20-series (and I think GTX 16-series) at 4K 240Hz with DSC. I haven't tested every older GPU with it, but what I can say for certain is that Intel's Arc GPUs (supposedly DP2.x with 54Gbps) have problems with 4K 240Hz and can only work properly at 4K 120Hz. I don't know if their DSC support somehow fails at 240Hz or what (it seems like there's a gap of maybe ten pixels in the center that gets omitted at 240 Hz). Likewise, many of the previous generation AMD GPUs also have issues with doing more than 4K 120Hz.

Now, perhaps it's the monitor's DSC that's at fault, but the fact that it works with Nvidia GPUs suggests it can at least work. Why does it have issues with some AMD and all Intel Arc GPUs, even those that support DSC and DP1.4? Probably because, like the older GTX 10-series (I think that's correct), the DSC support is at least partially broken.

Theoretically, yes, this gets "solved" if you have a native DP2.1 monitor. Except, to do more than 4K at 240Hz, you again need functional DSC support. Which means if you have a DP2.1 monitor that's backward compatible with DP1.4 plus DSC, maybe you can get the best of both worlds: non-DSC 4K 240Hz on AMD, and 4K 240Hz with DSC on Nvidia. Intel would still need DSC to do 4K 240Hz, though, because even 8-bit color at 4K 240Hz needs 60Gbps of bandwidth, and the Arc cards only support 54Gbps.

TL;DR: DSC support generally makes up for not being native DP2.1 on Nvidia, and apparently non-working DSC (or partially working at least) means AMD and Intel GPUs may be more problematic at high resolutions and high refresh rates.
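The bandwidth math behind that TL;DR can be sketched in a few lines. The HBR3 link rates are standard DisplayPort figures; the ~10% blanking overhead and the 3:1 DSC ratio are simplifying assumptions (real timings such as CVT-R2 vary, and can push the uncompressed figure higher, as the 60Gbps number above reflects):

```python
def stream_gbps(w: int, h: int, hz: int, bpc: int, overhead: float = 1.10) -> float:
    """Approximate video stream bandwidth in Gbps:
    pixels * refresh rate * bits per pixel, plus an assumed
    ~10% blanking overhead on top of the active pixels."""
    return w * h * hz * (bpc * 3) * overhead / 1e9

# DP 1.4a HBR3: 32.4 Gbps raw, 25.92 Gbps payload after 8b/10b coding.
hbr3_payload_gbps = 25.92

uncompressed = stream_gbps(3840, 2160, 240, 8)   # ~52.6 Gbps with this overhead
with_dsc = uncompressed / 3                      # DSC targets up to ~3:1

print(f"4K 240Hz 8bpc uncompressed: {uncompressed:.1f} Gbps")
print(f"with ~3:1 DSC:              {with_dsc:.1f} Gbps")
print(f"fits HBR3 payload ({hbr3_payload_gbps} Gbps)? {with_dsc < hbr3_payload_gbps}")
```

With working DSC, 4K 240Hz compresses comfortably into the DP 1.4a payload; without it, even a 54Gbps link is marginal at best for 8-bit and clearly short for 10-bit.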
 

ManDaddio

Honorable
Oct 23, 2019
121
64
10,660
The sooner they come out with the Ti, Titan, or both, the better. Waiting as long as they did for the 3090 Ti was a mistake, although the crypto craze and the virus were likely responsible for that.
AMD has nothing. Drivers won't make their cards competitive at the top end unless they want to cram a lot more chiplets onto a new GPU and saturate that with ridiculous amounts of power.

Chiplets on a GPU do not act the same as on a CPU. One size doesn't fit all. Nor one method.
Chiplets were not necessarily introduced to make something stronger or more powerful. They were made for efficiency and cost savings.
As we already see, there is a limit to shrinking an architecture, and memory shrinks even less readily. As things get smaller, chiplets may not work as well as everyone wants to believe. Hybrid approaches are likely to still work better.
Nvidia is a master at graphics and AI hardware. If chiplets would somehow have made everything perfect, they would have done that already.

Plus, hardware solutions are always better when doing specialized operations using AI, RT, and other graphical things. That costs more and forces a design to fit that bill. Monolithic, as everyone likes to say, is working as of now.
The 5000 series will bring a newer design, I think. But it will still be a hybrid of solutions to make everything work well and perform optimally. Nvidia always prides itself on reliability. That's why people pay more for what seems to be expensive.
All my Nvidia GPUs are going strong, all the way back to the 1000 series. I gave away my 700-series GPUs, which worked well up to a couple years ago. I still even have an old 650 Ti Boost running in my media computer. No problems with drivers or anything.

Sorry for any bad grammar. Writing fast. 😁👍
 
  • Like
Reactions: TR0767