Nvidia Briefly Makes Mention of Secret GTX 580

Geeks3d managed to capture the image before it was taken down:
It's probably not the case here, but still, one could fake such an image in MS Paint. My point is: the image alone means absolutely nothing.
 
Meh, don't really care anymore. Over the past year I've lost a lot of respect for them. The icing on the cake is charging people $40 to unlock their card's built-in ability to work with 3D TVs. I don't have or particularly want a 3D TV, but the card can already do it; it should just work at the driver level. It's like Intel charging people to unlock disabled cores on their processors.
 
[citation][nom]fjjb[/nom]i hope it doesnt incinerate itself from constant use[/citation]

Yes, true, the common joke of "The GTX is just SO DAMN HOT!" will carry over to the 580 if it runs just as hot and power hungry, or worse. But perhaps, being GF110, they solved the heat issues; perhaps not. Hold the jokes until we actually see some numbers. It's interesting to note that early reports on the Radeon 6970 indicate a power requirement over 225 W. If it ends up being a hot, power-hungry beast, will these kinds of jokes persist?

[citation][nom]elel[/nom]Rebranding? Again? I'd be so shocked![/citation]

"Rebranding" is taking an old card and changing the name, and nothing else. This card has a different number of cores, making it a different card. Not rebranded.
 
[citation][nom]elel[/nom]Rebranding? Again? I'd be so shocked![/citation]
Lol, doesn't look like people follow logic.
Perhaps one should shed some light on the fact that the 5870, 5850, and 5830 are all the exact same chip: a 5850 is one that couldn't meet the grade of a 5870, and a 5830 is one that couldn't meet the grade of a 5850, so they are uniformly crippled and sold as different cards.

A full GF100 with all 512 cores functioning would perform differently and likely feature different clocks. I can't see why people would be voting down your comment when it's absolutely correct. That also assumes the GF100 architecture that GF110 is likely based on wasn't altered in any way; and in that case, why would Nvidia rename the chip, which is uncharacteristic of them, if they haven't altered a damn thing?

If you want to look at rebranding, look at your little 5770, which is the most current card likely to be rebranded.

Would you call a first-gen G92 from an 8800 GT equal to a full GTS 250, which is still a G92? I wouldn't; they perform quite differently. Haters are just gonna hate.
 
Uh oh! NOBODY BUY ANYTHING!

The GTX 580 Ultra is going to come out any day now! Buying any ATI card, or perhaps even the 480, would be a waste of money...

Ohhhhh, I also predict the GeForce 680 and the ATI 7850 cards... those are the real cards to wait for!
 
I don't know what's been going on with Nvidia this past year or so. They need to take their heads out of where the sun don't shine and get back to being the top company they used to be. All they've done for the past couple of years is a lot of renaming. WTF.
 
So, re-branding to create a new series that is "newer" that just recycles old technology? Sounds like what Nvidia did for the GT 300 series. Given that the GT 100 series were retroactive rebadges as well, I'm partly inclined to think that Nvidia plans on making this a trend: actual GENERATIONS would only be the even-numbered ones, with subsequent odd-numbered series being re-badges of the prior proper series. So something like this:
- 100 - Re-badges introduced in 2009.
- 200 - Based around GT200 design starting 2009. (plus re-badges for the GTS and below range)
- 300 - Re-badges of G92 and low-end GPUs starting 2010.
- 400 - Based around GF100 (Fermi) design starting 2010.
- 500 - Re-badges of Fermi?
- 600 - Next actual series? (perhaps 1024 CUDA cores?)
[citation][nom]rmmil978[/nom]But perhaps, being GF110, they solved the heat issues; perhaps not. Hold the jokes until we actually see some numbers. It's interesting to note that early reports on the Radeon 6970 indicate a power requirement over 225 W. If it ends up being a hot, power-hungry beast, will these kinds of jokes persist?[/citation]
225 watts would still be less than the GTX 480; it might still be disconcertingly high, but it's less than the TDP numbers for both the GTX 280 and 480. And solving the heat problem would require more than just design work: it flat-out needs a die shrink. Given that TSMC (the company that fabricates ALL of Nvidia's GPUs) is still on 40nm (the same process used for GF100), there's no chance of seeing that.

My guess is that GF110 is only a mild re-tooling of GF100: possibly a reduced transistor count (and hence silicon area) from streamlining some parts, and possibly from removing some unused GPGPU-specific capability that only really sees use in Tesla cards. Chances are the TDP will remain the same, as that extra 1/16th will now be in use, canceling out any (slight) gains from trimming the circuits.

[citation][nom]rmmil978[/nom]"Rebranding" is taking an old card and changing the name, and nothing else. This card has a different number of cores, making it a different card. Not rebranded.[/citation]
Er, not quite. It's basically the same GPU: the original Fermi GF100 had a total of 512 cores, just with 1 of its 16 sections disabled on the GTX 480, bringing it down to 480. From what's been seen, this isn't a genuinely new GPU design; a more proper name would be "GF100b," kind of like the change from G92 to G92b between the GeForce 8800 GT and 9800 GT.

Though, keep in mind that a different number of cores HAS been used for a re-release of the exact same model number. Remember the 8800 GTS? The first flavor, on G80, came with only 96 stream processors enabled, while the later G92 version (also identified by its 512MB of RAM) enabled more of them, though both used dies that had a total of 128. It's a similar case here: GF100 actually DOES have 512 CUDA cores on it, just no card has them all in use, partly because yields are so bad.
 
[citation][nom]stm1185[/nom]Like anyone though they would not have a new high end card. Like anyone thought it wouldnt be called the GTX 580. How about some actual specs.[/citation]
It won't be here for another six months. Remember the GTX 400 series? Paper-launched, with a faked card shown at a conference in Q4 2009, but only actually launched at the end of Q1 2010.
 
[citation][nom]WarraWarra[/nom]Would be nice to see something new from NV as a lot of hardened ATI followers might be thinking "AMD twice" and might just switch to NV.[/citation]
In your dreams.
 
GF110 sounds like a GF100 at 32nm instead of 40nm. The GeForce 3xx wasn't the Fermi replacement for the GeForce 2xx; Fermi first arrived as the GeForce 4xx.
 
[citation][nom]liveonc[/nom]GF110 sounds like a GF100 that's 32nm instead of 40nm.[/citation]

Who's gonna fab it? Both TSMC and GloFo dropped 32nm bulk. The only 32nm production I know of outside Intel's fabs is GloFo's Super-High-Performance High-K Metal Gate process, used for next-gen AMD CPUs. Producing a 3 billion+ transistor GPU on that would be rather cost-prohibitive.
 
It's either a stupid error or (my money's on this) a marketing ploy. Wave your hands, jump up and down... do anything to distract from what your opponents are doing, especially when they have something better. Pretty brilliant: they don't even have to do any work, and they get free publicity. Then, when it comes out that they won't have anything for another six months, they'll claim the 580 "leak" was a typo and that "we never said *anything* about it... it was you guys putting words in our mouths."
 
[citation][nom]dertechie[/nom]Who's gonna fab it? Both TSMC and GloFo dropped 32nm bulk. The only 32nm production I know of outside Intel's fabs is GloFo's Super-High-Performance High-K Metal Gate process, used for next-gen AMD CPUs. Producing a 3 billion+ transistor GPU on that would be rather cost-prohibitive.[/citation]

Isn't that what NVIDIA has to figure out anyway if they want to compete with AMD/ATI and make lower power consumption their new mantra for Kepler and Maxwell? Now they want greater flexibility over performance for CUDA. I also feel it's too late to change lanes, so don't ask me how they're going to make a 32nm Fermi and still turn a profit at a price people are willing to pay, because I don't understand that either. ;-)
 
[citation][nom]schizofrog[/nom]512 CUDA cores? Not much of a jump in the number of cores for a new high end piece. Is there any info on a change in the architecture over the GTX480?[/citation]There are two sets of rumors about this card. The specific rumor in this news item is that GF110 is a 512-CUDA-core "fixed" version of the GTX 480, on a smaller die process.

Now, before anyone goes off on a rant: if those rumors are true, this could be one heck of a fast card. Die shrinks usually allow much higher clock speeds, as witnessed on the Radeon 6800 series.
 