Report: Nvidia GTX 480 to Have Disabled Cores

Status
Not open for further replies.
I have to agree with a few posters above me: if AMD can cripple a card, why can't Nvidia do the same?

I don't understand why all the AMD fanboys post on Nvidia threads just to brag about how well their 5 series is doing. This thread is not about the 5 series. If you are in love with your AMD GPU, good for you, but don't run down a GPU that hasn't been released yet just for the heck of it. IMO you make AMD look bad, especially when you have no clue what you are talking about.
 
[citation][nom]teeth_03[/nom]I think you're getting AMD confused with VIA. I'm not an expert, but I would think GlobalFoundries does AMD's stuff...[/citation]
No, GlobalFoundries makes AMD's CPUs. As of now, AMD sends its GPUs off to be made at TSMC.
 
[citation][nom]Ciuy[/nom]nvidia following amd`s core reenabling technique ??? )[/citation]

You are absolutely right; it's sad/funny to see people buy the same CPUs for $1000 and think they made a great deal 😉
 
[citation][nom]madass[/nom]You really are an idiot. THEY ARE THE SAME CHIP!!! Ever heard of speed binning? As in a 3.3 GHz proccy being sold as a 2.4 GHz proccy, because few people buy 3.3s? It is because of this fact that OC'ing is even possible. And no i7 will get to 5 GHz unless you are willing to use liquid nitrogen or something. They top out at the 4.2 mark. ALL Lynnfield/Clarkdale/Nehalem processors. Yes, even the $200 i5 750 will OC as hard as an i7 975. Most "enthusiast" hardware is, simply put, a waste of money.[/citation]

You are absolutely right; it's sad/funny to see people buy (essentially) the same CPUs for $1000 and think they made a great deal 😉
 
My GTX 260 Core 216 runs StarCraft II fine. Spending $429.99-$599.99 on a video card doesn't make much sense; all this is merely a bunch of babble. I'm more curious how the card will perform, and whether it will have the issues I heard rumored for anything above a GTX 275. These cards use a lot of power, so they will get hot; I'm assuming that means they may have heat issues. I'll just be waiting for another sub-$200 card. My GTX 260 was a nice price at $135 brand new.
 
I HOPE THIS THING WORKS!

Is Nvidia selling defective cards under the name of "disabled cores", with the reduced price and all that marketing stuff? To say Nvidia is compromising quality for the sake of price is one thing, but ATI will likely pull the same trick from now on: a GTX 480 with cores disabled, and us "geniuses" re-enabling them with a BIOS hack, as usual.
 
If lots of cards have a couple of defective cores, then I think it is a good idea to disable those cores and get the card to market now.

Core reduction is 6.25%, and performance will drop just a few percent. The 100% working chips can be sold as Tesla cards for a higher price.

Would you guys prefer to wait several months for better availability of 100% working chips? By then everybody would be talking about the Radeon HD 6000 series. Nvidia needs competitive cards on the market ASAP.
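For the record, that 6.25% figure is just the disabled fraction of Fermi's shader cores. A trivial sketch, assuming the full GF100 design has 512 cores and 32 are fused off (the numbers reported at the time):

```python
# Fermi/GF100 core counts: 512 in the full design, 32 disabled on the
# shipping GTX 480 (reportedly one streaming multiprocessor fused off).
FULL_CORES = 512
DISABLED_CORES = 32

reduction = DISABLED_CORES / FULL_CORES
print(f"{reduction:.2%} of cores disabled")  # → 6.25% of cores disabled
```

Performance rarely drops by the full 6.25%, since games are seldom bottlenecked purely by shader count.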
 
The 'F' in Fermi stands for 'fail'.
It's not enough that the cards have ridiculously higher power consumption than Cypress; they go ahead and have some cores axed as well!
How the hell do they expect to be competitive against ATI at this rate?
I have been counting on Nvidia to help bring down the prices of ATI's 5000 series cards, but it seems it will be a longer wait before Nvidia comes up with something that gives ATI a run for its money.
 
Unlocking the defective or deactivated cores on the Furby cards could cause problems.

1 - You'd need to flash a hacked BIOS.
2 - If the cores are defective, the card would fail until it's restored.
3 - If the cores do work, the card will also generate more heat and may have power issues.

The issues with the GF 480 have long been known: too hot, not much performance. The loss of 32 cores allows Nvidia to have actual products to sell.

Someone asked why Nvidia is having problems when ATI's chips are made in the same factories.

TSMC manufactures the chips based on the designs that ATI, Nvidia, and other customers hand over to them. Remember the failure of Nvidia's G92/G94 chips in notebooks and graphics cards (discrete video cards have better cooling and thermal control), which has kept lawsuits flying for the past two years?

GeForce 480 / 470 ("GTX" is pointless. Are there any GTX 220s or GT 285s?) differs from ATI in:
1 - Experience: ATI's 4770 showed ATI how to better handle 40nm design and manufacturing.
2 - GF 480 dies are HUGE. They didn't learn from the GT200 chips used in the GeForce 260~295s... but then again, GPU design takes about three years! The ATI 4800s kicked Nvidia in the nuts... they weren't quite as fast in some games, but they were a WHOLE lot cheaper.

If GPU A is twice the size of GPU B in each linear dimension (so four times the area), then four GPU Bs fit in the footprint of a single GPU A. Both designs are made on the same manufacturing process, so a wafer costs the same either way, but design B gets four chips out of every one from design A.

3 - Size again: wafers are typically 300mm across (going by memory). Any partial chips on the edges are useless, and the smaller the dies, the less area you lose near the edges.

Simple example: say a typical wafer has 15 defects. Bigger chips such as those in the GF 480 are more likely to be rendered useless or put in a lower bin. Chip A (GF 480) is 529mm^2 vs. chip B (5870) at 334mm^2... almost double the size!

The Fermi/GF100 can fit up to 94 GPUs per wafer... but 100% yield isn't possible (edges and defects). Nvidia may be getting 23~25 workable GPUs per wafer (this doesn't mean 23~25 GF-480s, just working chips, so some of these could be binned as a GF-460). AMD gets about 4x as many dies per wafer, and defects don't hurt as much: perhaps close to 100 working 5870 GPUs per wafer (and 3x as many 5770/5750 GPUs).

Keep in mind that each wafer costs about $5000 to make.

Here is a good drawing of what GF-100s would look like on a wafer.
http://www.brightsideofnews.com/print/2010/1/21/nvidia-gf100-fermi-silicon-cost-analysis.aspx
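The back-of-the-envelope wafer math above can be sketched in a few lines. This is only illustrative: the edge-loss correction and the zero-defect (Poisson) yield model are standard textbook approximations, and the inputs (die areas, 15 defects per wafer, $5000 per wafer) are just the numbers from this post.

```python
import math

WAFER_DIAMETER_MM = 300   # standard wafer size mentioned above
WAFER_COST_USD = 5000     # per-wafer cost cited in the post
DEFECTS_PER_WAFER = 15    # example defect count from the post

def gross_dies(die_area_mm2, diameter=WAFER_DIAMETER_MM):
    """Rough dies-per-wafer estimate with a simple edge-loss correction."""
    wafer_area = math.pi * (diameter / 2) ** 2
    # The second term approximates dies lost along the circular edge.
    return int(wafer_area / die_area_mm2
               - math.pi * diameter / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects=DEFECTS_PER_WAFER,
                  diameter=WAFER_DIAMETER_MM):
    """Fraction of dies with zero defects, assuming defects land randomly."""
    wafer_area = math.pi * (diameter / 2) ** 2
    defect_density = defects / wafer_area  # defects per mm^2
    return math.exp(-defect_density * die_area_mm2)

def cost_per_good_die(die_area_mm2):
    good = gross_dies(die_area_mm2) * poisson_yield(die_area_mm2)
    return WAFER_COST_USD / good

# GF100 (~529 mm^2) vs. Cypress/5870 (~334 mm^2)
for name, area in (("GF100", 529), ("Cypress", 334)):
    print(f"{name}: ~{gross_dies(area)} dies/wafer, "
          f"~{poisson_yield(area):.0%} defect-free, "
          f"~${cost_per_good_die(area):.0f} per good die")
```

Note that with only 15 random defects per wafer this simple model predicts far better yields than the 23~25 good dies rumored above, which suggests the real defect density on TSMC's 40nm process at the time was much higher; the point of the sketch is only the relative cost gap between a big die and a small one.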

4 - Bigger die (chip) = more heat and more power requirements. A single GF 480 will use about the same power as a dual-GPU ATI 5970... ouch. With all the heat and power issues, overclocking this baby is going to be minimal... and problematic.

5 - Scaling down the GF100/Fermi isn't looking so good either, so it will be a long time before Nvidia has $50 DX11 parts. Yeah, I know - they are weak.

The G92/G94/GT215 will be in production a long time. Yes, Nvidia did an excellent job with the G92: 8800GT ~ 9800GTX ~ GTS 250... the card that keeps on going.
 
The "GTX 480 will outnumber the 5870 2:1" (not exact quote).

That's funny! Comparing a card that has been shipping for six months to another card that hasn't sold a single unit... since it's not out yet.

Der... disabling the cores brings the performance down to 5870 levels, and ATI will always win there since they have cost on their side. Oh yeah, they have low-yield cards too... the 5830... which will be really nice ONCE it starts selling for $170~190.

So if or when the 512-core Furby comes out, that will only add another 5~7% performance boost. Perhaps the GeForce 485~490 that comes out six months later, with higher yields, higher clock rates, and more cores, will squeeze another 10~15% out of today's GF-480... which, we'll know in a few days, may itself be only 0~5% faster than a 5870.

I wouldn't be surprised if a 5880 or 5890 comes out sometime in April~May that just outruns the GF-480. All it would need is to be about 20% faster than a stock 5870.

Also... 7~8 months from now, if on time... the ATI 6800 series ships. If all Nvidia has is the GTX 480, 470, GTS 320, and whatever 8800/9800 card gets re-badged into the "3" series... Nvidia will be in a mess.

Seriously... I bet the GTS 250 will get a speed bump and be called the GTS 350... just like the 220 > 320, even though there was no difference between those.

 
Does anyone really still hold any sort of hope for these cards anymore? Every single announcement has been disappointing: performance, power consumption, price, of course the long-ass time it took them to get it out, and now this.

Guess this answers why the card took so long to get here: it's crippled!
 
[citation][nom]steiner666[/nom]does anyone really still hold any sort of hope for these cards anymore? every single announcement/news has been disappointing. Performance, power consumption, price, of course the long ass time it took for them to get it out, and now this.Guess this answers why the card took so long to get here, It's crippled![/citation]

I have to say that after following this since, oh, September or August of 2009, I concur with some other people who have said Nvidia was waaaay over-optimistic about what they could do with the DESIGN of Fermi, and now it seems that's true. This card has (by Nvidia's own admission) been really f$&*en hard to make. I know there are people out there who don't believe or hate on Charlie, but the truth is he has been right about Fermi being really hard to make, too hot, and not really profitable. I think Nvidia should just bite the bullet and MOVE ON to their NEXT chip; this incarnation of Fermi is not going to be good for them at all.
 
These cards, the 480 GTX and 470 GTX, are power-hungry and run hot even with disabled cores; maybe that is the very reason those cores are disabled. If I were at Nvidia, I would do the same: ship these cards and start perfecting the drivers for them, meanwhile refining the architecture for lower power consumption and less heat, and then, when AMD launches new cards, launch my better single-GPU and dual-GPU solutions, probably named 485 GTX and 495 GTX.
 