Nvidia uses AD102-301 GPU for new GeForce RTX 4090 Founders Edition boards.
Nvidia Quietly Rolls-Out GeForce RTX 4090 with New Die
I doubt Nvidia would spend the millions of dollars needed to re-tape-out an IC only to integrate ~$1 worth of external circuitry. There must be more to it. Having a new chip and board ID to drop max Vcore by 30mV seems excessive, unless Nvidia expects RMA liabilities to exceed costs or has other things wrapped into that presumed re-spin too.
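For a sense of scale, here is a minimal back-of-envelope sketch of that argument in Python. The respin cost is an assumed round number for illustration, not an actual Nvidia figure; the ~$1 saving is from the post above.

```python
# Break-even sketch: how many boards a die respin would need to ship
# before a ~$1 BOM saving pays for it. The respin cost is an assumption
# for illustration only; the $1 figure is from the post above.
respin_cost = 5_000_000     # USD, assumed cost of a new tape-out/respin
bom_saving_per_board = 1.0  # USD, external circuitry integrated on-die

break_even_boards = respin_cost / bom_saving_per_board
print(f"Boards to break even: {break_even_boards:,.0f}")
# ~5,000,000 boards -- implausibly high for a $1,600 halo product,
# which is the poster's point: a $1 saving alone can't justify a respin.
```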
"Exactly this. They found some kind of defect, vulnerability, or something major. If it was only the BOM they would simply have done a super refresh with said changes."

Doesn't have to be something 'major', as that would definitely have to make it into notices somewhere. Could be more benign stuff like improved yields, such as the new chips being able to hit the same performance with 30mV less Vcore, which could pave the way for a 4090 Ti/Super with the old 1.1V Vcore and 10% higher stock clocks.
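To put rough numbers on why 30mV matters, here is a minimal sketch using the standard CMOS dynamic-power relation P ∝ C·V²·f. The 1.1V and 30mV figures are from the post above; the 450W power target is the stock 4090 spec; treating all of it as core dynamic power is a simplifying assumption.

```python
# Dynamic-power scaling sketch: P ∝ C·V²·f, so at a fixed clock the
# power ratio goes as (V_new / V_old)². Treating the whole 450 W board
# power as core dynamic power is a simplification for illustration.
v_old = 1.10           # V, old max Vcore (from the post above)
v_new = v_old - 0.030  # V, after the reported ~30 mV drop
board_power = 450.0    # W, stock RTX 4090 power target

ratio = (v_new / v_old) ** 2
print(f"Power at the same clock: {ratio:.1%} of before "
      f"(~{board_power * (1 - ratio):.0f} W of headroom on a {board_power:.0f} W budget)")
# ~5.4% less power at the same clock -- real headroom for a bin like
# a hypothetical 4090 Ti/Super, though far from proof of one.
```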
"And we're talking about a company that's dedicated to video graphics. It is their specialty. It's unlikely they're going to have a defect."

Chips launch with defects all of the time, many of which don't get in-silicon fixes until generations later, if ever, which is why we get firmware, driver, OS, application and compiler-level workarounds.
"Whatever the reason for the revision, the high cost of the 4090 means that, for the overwhelming majority of GPU buyers, the change is irrelevant."

I was going to comment something very much in line with yours, haha. Like... "Look, the beach has one less grain of sand in it!"
"Maybe someone can check the naked dies and compare if there are more changes than just power regulation?"

If you don't mind sacrificing two GPUs for science to strip their dies down to the signal layers.
My opinion: I highly doubt it was because of a defect. Revisions are made to GPUs all the time; AMD does the same thing. Imagine doing years of testing on a GPU (not a simple device) and then releasing it with a defect.
And don't bring up memory and capacitors. Sometimes that's just bad quality control on the other end.
The GPU itself would most likely be working near perfection before they would consider releasing it to the public.
And we're talking about a company that's dedicated to video graphics. It is their specialty. It's unlikely they're going to have a defect, at least not one that would force a change like this.
It could simply mean they're now using different parts from different manufacturers and had to make some slight changes to accommodate them.
It's funny how people just default to "it must be a defect".
Only an obvious AMD shill would say that. But if I'm wrong, then I'm not sure what kind of mind you have.
Simply saying something without any kind of verification is kind of misinformation.
NVIDIA would never release a defective product and then not back it up... like the defective nForce motherboard chipsets... oh wait... or the bad memory on 2080 Tis... cringe. Hold on, hold on, wait, they make great laptop chipsets that don't... da*n...
"If the $1 external circuit is difficult to source and holds production of a $1600 board hostage..."

Comparators are just about as "jellybean" a part as you can possibly get; there are usually 10+ different sources of mostly interchangeable parts.
Are the design tools and their integration with the whole "fab system" making it easier/cheaper to make these incremental changes?
"-30mV can't melt the connector 😉"

The current design can't either, if you are not stupid enough to yank the connector halfway out at an angle. Get over it already, or should I start in about the faulty vapor chambers any time I can, too?
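For anyone curious about the numbers behind the joke: connector heating scales with I²R, so a core-voltage drop only nudges the 12VHPWR current. A minimal sketch, assuming the ~5.4% saving from the V² scaling above applies to the full 450W board power (an overestimate, since not all board power is core power):

```python
# I²R sketch for the 12VHPWR connector. Assumes the ~5.4% core-power
# saving from the V² scaling above applies to the whole board power,
# which overestimates the effect (not all board power is core power).
board_power = 450.0   # W, stock RTX 4090 power target
v_rail = 12.0         # V, 12VHPWR supply rail
power_saving = 0.054  # fraction, assumed from the (1.07/1.10)² scaling

i_old = board_power / v_rail
i_new = board_power * (1 - power_saving) / v_rail
heating_drop = 1 - (i_new / i_old) ** 2
print(f"Connector current: {i_old:.1f} A -> {i_new:.1f} A "
      f"({heating_drop:.1%} less I²R heating)")
# Either way, a properly seated connector is the real variable here.
```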