I recently touched on this topic in a previous post. The 4090 already burns the connector.
I don't know what magic the Ti uses to be better than a GPU that's already drawing almost the max.
Yep, from this it's either 2 or 3 x 8-pin, and that is a great selling point. The rumours about the 7000 series seem to indicate they aren't messing with power cables. Otherwise I wouldn't be looking at the new ones and would just get the current generation.
They would be smart to avoid anything similar.
That is the way it reads. I guess the 4090 Ti was on fire. Literally…
Same. That old trope has been dead for 15+ years. I rarely see AMD drivers causing problems.
AMD have stated that their next big card will have a high power draw. Unavoidable if they want to stay competitive. Since they are both using TSMC they don't have a lot to differentiate on. Basically comes down to architecture efficiency and size. Since AMD is going chiplet, they can always win the transistor count game.
Winter is here for me, so I think I will leave my GPU's power limit alone, but it certainly makes sense to lower it in the summer months. Been a while since I have had a 500W computer, now I remember why I liked my GTX 1080 so much.
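For anyone wanting to try the summer power cap, here's a rough sketch. The 450 W stock limit and the 70% cut are just example numbers, and the commented-out `nvidia-smi` calls assume an NVIDIA card on Linux with the driver tools installed:

```shell
# Hypothetical example: cap a 450 W card (e.g. a 4090) at ~70% for summer.
DEFAULT_LIMIT=450                             # stock board power in watts
SUMMER_LIMIT=$((DEFAULT_LIMIT * 70 / 100))    # 70% cap
echo "Summer power limit: ${SUMMER_LIMIT} W"

# Applying it needs root; uncomment to actually set the cap:
# sudo nvidia-smi -pm 1                    # persistence mode so the setting sticks
# sudo nvidia-smi -pl "${SUMMER_LIMIT}"    # set the power limit in watts
```

The setting resets on reboot unless you script it, so a lot of people just drop it in a startup unit for the hot months.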
This can still be done with 3x8-pin. Hopefully they don't go much higher than 500W. Yep, 450W or so for the 7900 XT. So more like 500W after a little tweaking from the AIBs.
Whenever new specifications come along there's always a generation that has to try to straddle it. I'm all for AMD sticking with the old ATX 2.5 spec for the RX 7000 cards. I think that is their intention, yes. But the AIB partners might have their own ideas. I can see the argument either way: standardize with the high-end crowd on the new 16-pin, or stick with the old school and use triple 8-pin.
The first one I still don't care about - if I'm immersed in a game (via story or game mechanics), I'm not paying attention to details like that anyway. Adoption rate is still doo-doo. 3 things AMD needs to greatly improve -
Ray Tracing. They need to catch up to NVIDIA in ray tracing performance with either this gen or the next.
Streaming/recording. They really need to put some extra focus on content creators and those who just like to stream/capture their gaming.
FidelityFX Super Resolution (equiv. DLSS) - Upscaling/superscaling needs to get as good as DLSS. I don't like the direction of DLSS 3.0 (generated frames) as I'm very sensitive to artifacting in games. This is the chance for AMD to really catch up.
I was making more of a blanket statement for AMD - not necessarily what I want to see. Although, I do think RT is going to become more and more popular and mainstream over the next 5 years of AAA games. I agree with you on immersive stories/mechanics, my favorite genre is open-world RPG, but spectacular RT at acceptable FPS isn't a negative if you can throw that in too. The first one I still don't care about - if I'm immersed in a game (via story or game mechanics), I'm not paying attention to details like that anyway. Adoption rate is still doo-doo.
#2 doesn't apply to me, but I still understand the concerns there. There are also applications that require CUDA. [These areas are most likely AMD's largest shortcomings with their GPUs, IMO.]
As for the 3rd, just native all the way. Until it comes natively like PhysX or whatever, it isn't much different from #1 to me.