Report: Nvidia GTX 480 to Have Disabled Cores

Status
Not open for further replies.
[citation][nom]SchizoFrog[/nom]PMSL at the hypocrisy of the AMD/ATi fanboys... Why use GPUs with disabled cores? I'll just say... Phenom X3.[/citation]
Why not add all these to the list?
Kuma CPUs, Phenom II X2, Phenom II X3, Athlon II X2, and Athlon II X3
While you're at it, add the HD4830, 9600GSO (along with numerous other G92 variations), and my aforementioned GTX260.

This sort of thing should be of no shock to anyone who's paid attention over the last few years. It simplifies production tremendously. You design one chip, and one chip only. You take the ones that work perfectly and sell them properly as the flagship. You take the ones that are slightly imperfect, disable the imperfections, then sell them as lesser models.
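The salvage-binning flow described above can be sketched roughly in code. The core counts below match the Fermi rumor, but the defect rates and bin thresholds are made up purely for illustration:

```python
import random
from collections import Counter

TOTAL_CORES = 512      # hypothetical Fermi-class die
SALVAGE_CORES = 480    # cut-down SKU with 32 cores fused off

def bin_die(defective_cores: int) -> str:
    """Sort a die into a SKU based on how many cores failed test.
    Every die comes off the same design; only the fusing differs."""
    if defective_cores == 0:
        return "flagship (512 cores)"
    if defective_cores <= TOTAL_CORES - SALVAGE_CORES:
        return "salvage (480 cores)"   # disable the bad block(s), sell cheaper
    return "scrap"

# Simulate a wafer run with a made-up defect distribution.
random.seed(1)
wafer = [random.choices([0, 16, 64], weights=[20, 50, 30])[0] for _ in range(100)]
print(Counter(bin_die(d) for d in wafer))
```

The point is that the salvage bin turns partially defective dies into sellable product instead of scrap, which is why every vendor does it.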

The problem with Nvidia's Fermi is that they don't seem to be getting any "perfect" yields yet. And I can't imagine a card that already draws almost 300W with 32 of its 512 cores disabled staying within the PCIe 300W standard once all 512 cores are enabled. Even if it barely slipped within the limit, there would be zero overclocking headroom.
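A quick back-of-envelope check of that power claim. This naively assumes board power scales linearly with active cores, which ignores fixed costs like memory and VRM losses and so overstates the delta, but it shows why the headroom looks tight:

```python
# Rumored figures from the comment above; the linear-scaling model is an assumption.
board_power_480 = 300.0        # W, alleged draw with 480 of 512 cores active
full_cores, active_cores = 512, 480

naive_full_power = board_power_480 * full_cores / active_cores
print(f"naive 512-core estimate: {naive_full_power:.0f} W")   # 320 W
print("within PCIe 300 W spec:", naive_full_power <= 300.0)   # False
```

Even this crude estimate lands over the 300W ceiling, which is consistent with the commenter's skepticism.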
 
Disabling cores makes no sense to me. I don't mind it; it just sounds stupid.
 
[citation][nom]jimishtar[/nom]I wouldn't buy a 5850, a GTX 480, or an AMD X3. I don't care about brand/manufacturer; what I care about is 100% healthy, 100% able hardware. It's like buying a V6 car with 2 disabled pistons. Just doesn't sound right, does it?[/citation]

By your logic, virtually every CPU is defective, because only a few are binned as EE parts. You go on paying $1000 for CPUs while I pay $200, overclock, and get equal or better performance.
 
[citation][nom]victomofreality[/nom]Stop fluffing Fermi! I can't think of anything else that's had this many articles without any actual substance! When the benches come out, then I'll be interested, and if there are benchmarks with the disabled cores unlocked, I'll be extremely interested. Till then, give me a break, please.[/citation]
Duke Nukem.
 
Nvidia certainly has every right to release defective GTX 480/470s. What causes concern is: where are the fully functional 512SP parts? ATI released the fully working 5870 alongside the cut-down 5850 SKU. With Nvidia, all we're seeing are defective 480s and 470s being passed off as flagship parts. Are you guys really buying this BS?
 
Everyone, calm down. This same rumor has cropped up several times over the past few months, and Nvidia has NOT stated that the 480 will have any fewer than 512 cores.
 
You'll never get equal or better performance from a $200 processor vs. a $1000 processor, and if you think you can, then please post your OC so I can see it, because I doubt you're going to squeeze 5GHz out of an i7 920. That was done back in '08 on an i7 965. Those $1000 CPUs you're talking about have more potential than ever before, whether they're Intel or AMD, so your $200 OC is nothing more than a stable processor with headroom to overclock. You can only squeeze so much performance out of any processor, so lower end = lower yields, upper end = higher yields. True enthusiasts buy the best hardware to stay competitive and so they can tweak for maximum performance/processing power and bragging rights.

I personally prefer not to upgrade my PC every year or two, so I tend to spend a large amount of money every 5 or so years, depending on the technology and how much faster the newer hardware is. Nothing against overclocking lower-end stuff, but OC an i7 980 vs. an i5 or even an i7 920 and you'll never come close to the 980 in terms of stable performance and processing capability at the same speeds (you can thank Intel's QPI for that). But then again, I do love a good OC.

As far as the video card issue goes, Nvidia will remain solid in its price/performance even after disabling cores, like all the manufacturers have been doing for years. Objectively, I always go with performance specs when I decide on my PC purchases, so more than likely, at the rate it's going, I will retire my old 9800 GTXs this year, depending on how much faster the benchmarks are on the newest cards.
 
Wow, another rumor. I would like to suspend my judgment on this matter until Nvidia releases these cards and they're reviewed. Whether it has disabled cores or not, actual performance will still be the basis for whether this card sucks or not. IMO, Tom's could compress this Nvidia Fermi news into a single article instead of scattered little tidbits.
 
Didn't we see a screen cap from someone in Australia that showed the 512 cores already? It might have been an engineering sample, but still, there are 512-core versions out there. Maybe they disabled some of the cores to cut down on power consumption so it doesn't suck up 295W on one GPU. Maybe this is all BS (Barbra Streisand) and rumor spread by AMD fan sites like smear-accurate.

Here's a pic from someone somewhere in Asia (note the Asian-language characters in the left-hand window):

http://i43.tinypic.com/14lsln7.jpg

 
I'm no fan of Nvidia products (mainly because they're beyond my budget), but damn. Hurry up and make something happen to lower 5870 prices so I can buy one already. I'm cheering you on; don't disappoint me!
 

What if they're the same price and the AMD card doesn't come down in the first couple of months? What if AMD releases a 5990 or a 5890?
 
IMO, technology comes down to two main factors: architecture and manufacturing.
Nvidia plays a lot on architecture while sacrificing the manufacturing process, which is why Fermi is so late and costs more than the 5xxx series.
Architecture has proven to give better performance, but using an old manufacturing process means a high price for the end user.
Nowadays a graphics card like the 5850 delivers great performance even at 1920 res, and how many gamers can afford the kind of 30-inch monitor that would actually require a 5970 or GTX 480?
How much performance do we actually need to be satisfied with a graphics card?
Well, I don't really care whether it's the new Fermi architecture or maybe the 6xxx series, as long as I can play with the latest DX version at a decent framerate on my monitor.
 
[citation][nom]builderbobftw[/nom]Like the 5850?[/citation]

Yeah, but not like the 5870. The top model should have all the cores enabled; cutting down for lower-end models is fine.
 
[citation][nom]manitoublack[/nom]It's odd that Nvidia is having trouble with 40nm yields, since both AMD and Nvidia chips are MADE IN THE SAME BUILDING at the Taiwan Semiconductor Manufacturing Company (TSMC). Maybe Nvidia should contract Intel, which seems to be having no such issues at 32nm...[/citation]
[citation][nom]brendano257[/nom]The TSMC yield issues were also apparent in the earlier Radeons; I suggest you do more reading. They most likely did a similar thing to their cards to correct the issues, or worked something out with TSMC; that I'm not sure about.[/citation]

Read up:
http://www.anandtech.com/video/showdoc.aspx?i=3740&p=1

To us common folk, designing a chip seems like a straightforward process, just like putting up a building: create blueprints, get materials, build the house.

Unfortunately for the chip industry, you can design all you want, but the chip production facilities have guidelines and limitations on what you can draw up.

Based on the AnandTech article, Nvidia is struggling now because they transitioned to 40nm a little too late. ATI already had a 40nm part long ago (the 4770), so they had experience building chips on the 40nm process.
 
I've known about the issue of fewer cores for a while... it's not a piece of info that's suddenly appeared; it's been out there for the last 3 weeks! Come on, Tom's Hardware, get your arse into gear!
 
[citation][nom]myocingskills[/nom]You'll never get equal or better performance from a $200 processor vs a $1000 processor, and if you think you can then please post your oc so I can see it because I doubt your going to squeeze 5ghz out of a i7 920. [...] True enthusiast buy the best hardware to stay competitive and so they can tweak for maximum performance/processing power and bragging rights.[/citation]
You really are an idiot. THEY ARE THE SAME CHIP!!! Ever heard of speed binning? As in, a 3.3 GHz proccy being sold as a 2.4 GHz proccy because few people buy 3.3s? It's because of this that OC'ing is even possible. And no i7 will get to 5 GHz unless you're willing to use liquid nitrogen or something; they top out around the 4.2 mark. ALL Lynnfield/Clarkdale/Nehalem processors. Yes, even the $200 i5 750 will OC as hard as an i7 975. Most "enthusiast" hardware is, simply put, a waste of money.
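The speed-binning point can be sketched like this. The bin thresholds and the demand-driven down-binning are illustrative assumptions, not actual Intel policy:

```python
def ship_as(max_stable_ghz: float, top_bin_full: bool) -> str:
    """Assign a retail speed grade to a die from its tested ceiling.
    When the premium bin is oversupplied, a die capable of the top
    grade still ships as the cheap part; that gap is OC headroom."""
    if max_stable_ghz >= 3.3 and not top_bin_full:
        return "3.3 GHz SKU"
    if max_stable_ghz >= 2.4:
        return "2.4 GHz SKU"
    return "reject"

# A die stable at 3.6 GHz, down-binned because the top bin is full:
print(ship_as(3.6, top_bin_full=True))    # 2.4 GHz SKU
print(ship_as(3.6, top_bin_full=False))   # 3.3 GHz SKU
```

This is why the cheap part and the expensive part often overclock to the same ceiling: the silicon is identical, and only the label differs.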
 
Well, the difference between Fermi's disabled sections and AMD's X2/X3 variants is that AMD disables cores to fill market price segments, while Nvidia is disabling sections of the Fermi GPU because it's running extremely hot, soaking up too much power, or is actually faulty.
 
[citation][nom]mousemonkey[/nom]What, like a 5830? Do you like the idea of a 5830 then, as it's kind of the same thing?[/citation]

And the 5850 as well... so they're saying the only card we should buy is the 5870... haha, okay.
 