Info: 4090 Ti cancelled.

Math Geek

Titan
Ambassador
With the 4090 hitting close to 600 W, I can't even imagine what a 4090 Ti would use.

It's just pushing the limits of physics to expect a few small fans to dissipate so much heat. Even a 5 lb, 2-foot-long heatsink/fan combo is not enough, it seems. All the little parts are just not made for that kind of heat generation. A custom waterblock seems the only way if they insist on making such power hogs. At least get the heat away from the PCB ASAP.
 
The 4090 already burns the connector.
I don't know what magic the Ti uses to be better than a GPU that's already pulling almost the max.
I recently touched on this topic in a previous post.
There's now so much power going through these cables that the actual connection points in the receptacle and bends in the cables make a difference. I believe the 12VHPWR spec does list minimum clearances for cable bends and connector seating, but uninteresting news like that usually gets buried. Well, now that cables and connectors are literally melting, it's got everyone's attention. This could also be mfgs. (including NVIDIA) cutting corners on wire and connector specifications. ;)
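To put rough numbers on why the contact points matter, here's a quick back-of-the-envelope sketch (my own assumptions, not official spec figures: six current-carrying 12 V pins, each commonly cited as rated around 9.5 A):

```python
# Back-of-the-envelope: per-pin current on a 12VHPWR connector.
# Assumptions (mine, not from the spec sheet): 6 current-carrying 12 V pins,
# each commonly cited as rated for ~9.5 A.
VOLTAGE = 12.0          # volts
PINS = 6                # 12 V pins sharing the load
PIN_RATING_A = 9.5      # amps per pin (commonly cited figure)

for watts in (450, 600):                    # 4090 stock limit, connector max
    total_amps = watts / VOLTAGE
    per_pin = total_amps / PINS
    headroom = PIN_RATING_A - per_pin
    print(f"{watts} W -> {total_amps:.1f} A total, "
          f"{per_pin:.2f} A per pin ({headroom:.2f} A headroom per pin)")
```

At 600 W that works out to roughly 8.3 A per pin, so if even one pin seats poorly and shoves its share onto the neighbours, there isn't much margin left.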



I'd guess that the 4090 Ti was going upwards of 800 W, which is beyond even the new 12VHPWR spec, with only one 16-pin supplying 600 W and the PCIe slot giving 75 W.
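Just to put numbers on that (a quick sketch using the figures above; the 800 W is only my guess, not an announced spec):

```python
# Rough power-budget check for a hypothetical ~800 W card
# against one 16-pin (12VHPWR) connector plus the PCIe slot.
CONNECTOR_W = 600   # max per 12VHPWR 16-pin
SLOT_W = 75         # PCIe x16 slot
GUESS_W = 800       # my guess for a 4090 Ti, not an official figure

available = CONNECTOR_W + SLOT_W
print(f"Available from one 16-pin + slot: {available} W")
print(f"Shortfall at {GUESS_W} W: {GUESS_W - available} W")
# -> 675 W available, so roughly 125 W short of 800 W.
```

So with 675 W available you'd be about 125 W short, which means a second 16-pin or a stack of 8-pins on top.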
 

Eximo

Titan
Ambassador
AMD has stated that their next big card will have a high power draw. Unavoidable if they want to stay competitive. Since they are both using TSMC, they don't have a lot to differentiate on. It basically comes down to architecture efficiency and size. Since AMD is going chiplet, they can always win the transistor count game.

Winter is here for me, so I think I will leave my GPU output alone, but it certainly makes sense to lower the power limit in the summer months. It's been a while since I have had a 500 W computer; now I remember why I liked my GTX 1080 so much.
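If anyone wants to do the same come summer, capping the card is quick; a minimal sketch, assuming an NVIDIA card, a recent driver, and admin/root rights (the 300 W target is just an example, pick your own number):

```python
# Minimal sketch: read and lower the GPU power limit via nvidia-smi.
# Assumes an NVIDIA card, recent drivers, and admin/root privileges.
import subprocess

def show_power_info():
    # Prints current, default, and max enforceable power limits.
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

def set_power_limit(watts: int, gpu_index: int = 0):
    # Sets a new software power limit (does not persist across reboots).
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    show_power_info()
    set_power_limit(300)   # example summer limit
```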
 

Zerk2012

Titan
Ambassador
The rumours about the 7000 series seem to indicate they aren't messing with the power cables. Otherwise I wouldn't be looking at the new ones and would just get the current generation.

They would be smart to avoid anything similar.
Yep, from this it's either 2 or 3 x 8-pin, and that is a great selling point.
I have used several AMD cards with good performance and never had the dreaded driver issues everybody always talked about, all the way back to Xfire 6970s.
 
If AMD can swoop in and be competitive, performance-wise, all while sticking to 3x8-pin for power for their top-tier card, then that's definitely a win for AMD.

3 things AMD needs to greatly improve -
Ray Tracing. They need to catch up to NVIDIA in ray tracing performance with either this gen or the next.
Streaming/recording. They really need to put some extra focus on content creators and those who just like to stream/capture their gaming.
FidelityFX Super Resolution (equiv. DLSS) - Upscaling/superscaling needs to get as good as DLSS. I don't like the direction of DLSS 3.0 (generated frames) as I'm very sensitive to artifacting in games. This is the chance for AMD to really catch up.
 

Colif

Win 11 Master
Moderator
I rarely see AMD drivers causing problems. There is the factor that we simply see fewer of them compared to Nvidia, but they're not bad.
Not sure I like how they have legacy drivers, but I guess it beats trying to get all your cards from the last 10 years to work with the same drivers... which is what Nvidia does.
 

g-unit1111

Titan
Moderator
AMD has stated that their next big card will have a high power draw. Unavoidable if they want to stay competitive. Since they are both using TSMC, they don't have a lot to differentiate on. It basically comes down to architecture efficiency and size. Since AMD is going chiplet, they can always win the transistor count game.

Winter is here for me, so I think I will leave my GPU output alone, but it certainly makes sense to lower the power limit in the summer months. It's been a while since I have had a 500 W computer; now I remember why I liked my GTX 1080 so much.

Oh man really? This is getting to be quite old. I need to get a new GPU to replace my old 1080, but I think a 3070 is as high as I will go. The power draw on new GPUs is insane.
 

Eximo

Titan
Ambassador
Yep, 450W or so for the 7900XT. So like 500W after a little tweaking from the AIBs. That information came straight from AMD, but they weren't specific, just saying that it's the only way to significantly improve performance over last generation.

Unless they lean hard into being the power efficiency leaders and lock them down artificially.

They are already at 335W for the 6950XT, which is basically the 3080/3080 Ti power range. It's only the silly 4090 and 3090 Ti that are really pushing things to the extreme.
 
I think that is their intention, yes. But the AIB partners might have their own ideas. I can see the argument either way: standardize on the new 16-pin for the high-end crowd, or stick with the old school triple 8-pin.
Whenever new specifications come along, there's always a generation that has to try to straddle them. I'm all for AMD sticking with the old ATX 2.5 spec for the RX 7000 cards.
Most, if not all, PSU mfgs. will continue to ship the old 8-pin cables for a while, as it's only a tiny fraction of users who will require the new 12VHPWR cable. Also, at least for the 3090, 3090 Ti, and 4090, GPU mfgs. are including the required adapter cable with every GPU shipped.

Since the 7000-series is the one 'straddling' the new standard, the top tier of AMD's next gen will certainly use the new 12VHPWR connector. But that's for another day. :D
 

Phaaze88

Titan
Ambassador
"Cancelled - for now.", I see. So, 4090 Super about a year later when they get it right?

The 12VHPWR connector looks more like a PEBCAK issue to me. It is preventable.
"But muh aesthetics!"

Some are hoping AMD competes this time as well, in the hopes of getting cheaper Nvidia GPUs, which is pretty F'd up.

3 things AMD needs to greatly improve -
Ray Tracing. They need to catch up to NVIDIA in ray tracing performance with either this gen or the next.
Streaming/recording. They really need to put some extra focus on content creators and those who just like to stream/capture their gaming.
FidelityFX Super Resolution (equiv. DLSS) - Upscaling/superscaling needs to get as good as DLSS. I don't like the direction of DLSS 3.0 (generated frames) as I'm very sensitive to artifacting in games. This is the chance for AMD to really catch up.
The first one I still don't care about - if I'm immersed in a game (via story or game mechanics), I'm not paying attention to details like that anyway. Adoption rate is still doo-doo.
#2 doesn't apply to me, but I still understand the concerns there. There are also applications that require CUDA. [These areas are most likely AMD's largest shortcomings with their GPUs, IMO.]
As for the 3rd, just native all the way. Until it comes natively like PhysX or whatever, it isn't much different from #1 to me.
 
The first one I still don't care about - if I'm immersed in a game (via story or game mechanics), I'm not paying attention to details like that anyway. Adoption rate is still doo-doo.
#2 doesn't apply to me, but I still understand the concerns there. There are also applications that require CUDA. [These areas are most likely AMD's largest shortcomings with their GPUs, IMO.]
As for the 3rd, just native all the way. Until it comes natively like PhysX or whatever, it isn't much different from #1 to me.
I was making more of a blanket statement for AMD - not necessarily what I want to see. Although I do think RT is going to become more and more popular and mainstream in AAA games over the next 5 years. I agree with you on immersive stories/mechanics - my favorite genre is open-world RPG - but spectacular RT at acceptable FPS isn't a negative if you can throw that in too. ;)
Number 2 doesn't apply to me either; I just know it's a pain point for many. These are the people who can literally sell an extra 5,000 units just by mentioning, "Wow! AMD has really improved their streaming capabilities!" It's more of a plus for AMD.
I agree on native frames. That's the way to go if you're getting good FPS. Again, it will help AMD more if they can truly get FSR as good as DLSS 2 (and NOT go the fake frame DLSS 3 route).