Does Nvidia use defective chips for its GPUs that have disabled SMXs?


OWEN10578

Just out of curiosity: for example, the GTX 760 has 2 disabled SMXs, so do the chips in the GTX 760 actually have 2 defective SMXs, or did Nvidia just disable a perfectly good chip? Same question for the GTX 670, GTX 660 Ti, GTX 650 Ti, etc.
 
This is from Tom's review of the new GeForce 760:
"Nvidia enables six of GK104’s eight Streaming Multiprocessors across three or four of its Graphics Processing Clusters. This is similar to the approach taken on GeForce GTX 780, equipped with a trimmed GK110 GPU. In essence, the company doesn’t always know which of its chips’ resources are going to be defective. So, it can turn off two SMXes in one GPC or one SMX in two different GPCs. "
 


I can see that. So what I got out of the article is that Nvidia does use defective chips to make lower model cards. They turn off the defective SMXs and sell the card at a lower cost. It's a smart move for the company; less goes to waste.
 
This is called binning and is not specific to Nvidia.

But yes, sometimes when yields of the higher-end chip exceed demand, manufacturers will disable perfectly good chips to meet the demand for a lower-end one.

An example: a while back, AMD made a bunch of 6970s. A lot of them were fine, but AMD needed to produce more 6950s, so they disabled the extra shaders in the BIOS and sold those 6970s as 6950s.
 


And whilst it made a good selling point, they nobbled it later by laser-cutting the extra shaders so the cards couldn't be unlocked to full 6970s, but people still bought them in the vain hope that they could be.
 


Yeah it is kind of funny because those reference 6950s sold like crazy so I'm sure they were raking in the dough.
 

It was a rather slick, albeit underhand, move that went over really well with the AMD faithful, although I can't help but wonder what kind of slagging off those same people would have given Nvidia had they done the same.
 


Yes, that's right! I was reading the article when the question came into my mind.
 


Yes I've read that, and that's the reason this thread exists.
 
Boy does it make no sense to deliberately cripple a perfectly good chip.
If they can be sold at the lower price, then they should be. It just means they are overpriced.

But it is true that a good portion of lower end cards may actually be partially defective higher end cards. If the bad part can be safely isolated, why not sell it at a discount?
 


So Nvidia does use defective chips, but has Nvidia done something like you said AMD did with the 6950?

 


+1 to that. If they're not moving higher end models, why not drop prices? Even if just temporarily.
 
Evil corporations... Evil corporations everywhere! It's just plain business optimization. If you can't guarantee a full batch of the high end card (which you can't), it makes perfect sense to make them in such a way that you can guarantee a full batch of the lower end cards through some simple modifications of the defective high end chips.
In fact it explains (some of) the "overprice", since they have to cover the cost of making high end cards while selling some of them as low end ones.
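To put some (completely made-up) numbers on that: a rough back-of-the-envelope Python sketch of how partial yields push the effective break-even cost of a fully working die above the raw per-die cost. Every figure below is a hypothetical assumption, purely to illustrate the point about high-end manufacturing costs being carried partly by dies sold as the cheaper part.

    # Every number here is a made-up assumption, purely to illustrate the idea.
    wafer_cost = 5000.0          # hypothetical cost per wafer
    dies_per_wafer = 100         # hypothetical gross dies per wafer
    fully_good = 0.60            # assumed fraction good enough for the high-end SKU
    salvageable = 0.30           # assumed fraction usable as the cut-down SKU
    salvage_price_ratio = 0.7    # assumed cut-down price as a fraction of high-end price

    raw_cost_per_die = wafer_cost / dies_per_wafer
    # The wafer's cost has to be recovered by what actually sells; salvaged dies
    # bring in less per unit, so the fully working dies carry more of the cost.
    sellable_equivalents = dies_per_wafer * (fully_good + salvageable * salvage_price_ratio)
    breakeven_per_full_die = wafer_cost / sellable_equivalents

    print(f"raw cost per die:              ${raw_cost_per_die:.2f}")       # $50.00
    print(f"break-even per fully-good die: ${breakeven_per_full_die:.2f}")  # ~$61.73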
 
Yup, and this kind of thing is perfectly normal.
Back when AMD were selling tri-core CPUs, if a quad core went bad, all they did was disable the defective core and sell it off as a tri-core. In some cases they just disabled a working fourth core and sold it off, which meant enterprising individuals could re-enable the core and basically get themselves a quad core at tri-core prices.
Nvidia did this with the Titan, actually: the chips used in them are basically chips that weren't suitable for use in their Tesla cards, so they went into the Titans instead.

 
If it's not defective, why lower the price on it at all?
And the real benefit of this whole process isn't that they can cripple their strong cards to make weaker ones, but rather that they can build their whole range from the same components. Nvidia doesn't manufacture 770s and then, if they happen to be defective, sell them off as 760s; they manufacture GK104 GPUs and then the 770/760 PCBs to put them in.

If a chip is good, use it as a 770; if it's bad, use it as a 760. If it's good but you're selling more 760s than 770s, or it just so happens that you don't have any defective chips to use, then just disable the cores.
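That decision rule boils down to something like the little Python sketch below. The function name, thresholds and SKU labels are mine, purely illustrative of the logic described above, not Nvidia's actual test flow.

    # Hypothetical binning sketch of the logic above; thresholds and names are mine.
    def bin_gk104_die(defective_smxs: int, need_more_760s: bool) -> str:
        """Decide which SKU a (hypothetical) GK104 die ships as."""
        if defective_smxs == 0:
            # Fully working die: normally a 770, but it can be cut down
            # to meet demand for the cheaper SKU.
            return "GTX 760 (good SMXs fused off)" if need_more_760s else "GTX 770"
        if defective_smxs <= 2:
            # Up to two bad SMXs can be disabled and the die still meets the 760 spec.
            return "GTX 760"
        return "scrap / some lower bin"

    print(bin_gk104_die(0, need_more_760s=False))  # GTX 770
    print(bin_gk104_die(1, need_more_760s=False))  # GTX 760
    print(bin_gk104_die(0, need_more_760s=True))   # GTX 760 (good SMXs fused off)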
 


Exactly my point. You just used the example of tri-core Phenoms unlockable to quad-cores. Why were they not sold as quad-cores? And if the quad-cores weren't selling so well, why not drop their price to something still higher than the tri-core Phenoms? I understand disabling defective cores, but why disable functional cores?
 
Ahh, I see where you're coming from now.
The reason quad cores were disabled to tri-cores is that the tri-cores were selling more. From AMD's perspective (and it would be the same for Nvidia), it costs the same to manufacture each CPU. Once it's made, their goal is to get it out the door any way they can; they're not going to stop people from buying their hot product because they don't have any tri-cores while sitting on an inflating pile of quad cores.

Ultimately I don't think it's AMD that sets market prices; they set MSRPs, but it's really the retailers that decide.
And even if they did have complete control over how much they sell for, they couldn't lower prices that much. A $20 difference isn't going to sway anyone when you're talking CPUs and GPUs; that's why the price tiers are in $50 and $100 increments. They couldn't drop the price significantly enough without impacting the sales of their lower end chips (and worsening the value of the higher end ones), which in this scenario are your hot product and a way to recoup manufacturing losses.

Business and Economics, messy things.
 
Agreed! I was thinking it's as simple as "tri-cores are selling more purely because they're cheap, so simply make quad-cores almost as cheap, even if just temporarily", but I suppose the way people make purchasing decisions isn't always that straightforward. If advice has been going around telling people to buy a tri-core because it's more cost-effective, then I suppose many people will do exactly that without considering the specific prices at the time of purchase.
 


The GTX 465 springs to mind; I think some of them could be BIOS-flashed to 470s, but it was very hit or miss as to whether it worked, IIRC.
 




So if I'm lucky I could flash a GTX 760 to a 780? Or a 670 to a 680?
 
The 670 and 680 are already pretty much the same card; the 680 just has some more CUDA cores. If you clock a 670 to 680 speeds, you will get 680 performance in games.

760 to 780, nope. Entirely different GPUs, memory bus and VRAM. For flashing to work, the hardware needs to be identical. That's why 6950s could be flashed into 6970s: they were the same thing except some shader cores were disabled. AMD later started cutting the shader cores off the 6950s, and then suddenly people couldn't flash the cards anymore.
 