Rumored Specs of Upcoming GeForce GTX 880 Appear Online

Status
Not open for further replies.

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
Why would they give it the GTX 880 designation if it isn't the full-featured upper-end model? So what if the performance cannibalizes the Titan series? That's old architecture that needs to go away.
 

Frank Tizzle

Reputable
Apr 8, 2014
6
0
4,510
I would hope that they researched this enough to find some legitimacy with some of the specs, rather than just re-posting it.
 

Bif Turkle

Reputable
Apr 11, 2014
2
0
4,510
I can't wait for more inflated performance on my 1920x1080 60Hz monitor. I think $825.99 would be a good price to start this card at, too.
 

hannibal

Distinguished
Maybe this will come in below Titan... and Titan would be the next uber model... but early rumors are always best served with a grain of salt...
The memory bus width is actually quite believable, because Kepler seems to be reasonably well fed even with narrower memory bandwidth.
 

WithoutWeakness

Honorable
Nov 7, 2012
311
0
10,810
Why would they give it the GTX 880 designation if it isn't the full-featured upper-end model? So what if the performance cannibalizes the Titan series? That's old architecture that needs to go away.
For the same reason that the GTX 680 used the GK104 chip instead of the full GK110 chip. Nvidia's mid-range GK104 was performance-competitive with AMD's high-end Tahiti chip found in the HD 7970. Nvidia was able to take their mid-range chip and sell it at high-end prices because it outperformed the competition and would still sell at that price. Then, while AMD evolved their GCN architecture into the Hawaii chips in the R9 290 series, Nvidia was able to sell off their high-end GK110 chips for top dollar as Tesla compute cards and eventually roll those chips into GeForce cards as the 780, 780 Ti, Titan, and Titan Black.

My guess for this generation is that it's the same deal. Nvidia feels that their mid-range GM104 chip will be competitive with AMD's offering, so they will sell the GM104 as the GTX 880 and hold onto the larger GM110 chips for high-margin Tesla cards, rolling them out later as the GTX 900 series.
 

The_One_and_Only

Distinguished
Feb 22, 2009
99
0
18,660
If this is not some teenager's wet dream and is somewhat credible, it could be that they plan on having Titans from here on out and want the 780 Ti guys to pony up more cash for the performance. Just a thought on probably fake info...
 


Since it's Nvidia, they'll probably have a "Titan 2" down the road to bleed fanboys of their money later on, haha.

If the specs are actually true, they sound more like an updated revision than a higher-tier Maxwell GPU (was it Maxwell?). The 4GB of VRAM is actually hurting them in 4K territory, given what the R9 295X showed, so they might be cooking up something in between to justify the stupid price tags they've been asking as of late.

Cheers!
 

soldier44

Honorable
May 30, 2013
443
0
10,810
Those who say it's a waste of money clearly can't afford one or two. These cards are for people like me who upgrade every year or 18 months because they can; it's a hobby. I've been gaming at 2560x1600 for over three years now. Next on the list is a 4K display and maybe one or two of these bad boys.
 

nolarrow

Distinguished
Mar 27, 2011
87
0
18,640
That card was a goddamn RAID BOSS. Surprisingly, my "aging" GTX 570 is right up there among my all-time favorite and longest-lasting graphics cards at my current 1920x1080 120Hz res.

In no particular order:

1. 3dfx Voodoo 2s in SLI
2. GeForce 256
3. 8800 GT
4. GTX 570
 

nolarrow

Distinguished
Mar 27, 2011
87
0
18,640
I had to log in and copy-paste my message, and lost the first part.

It was supposed to start with "I hope it lasts as long as my original 8800 GT."

Sorry for the double post.
 

jrharbort

Distinguished
Jun 17, 2009
215
1
18,695
One detail not mentioned is how the Maxwell architecture uses a much larger L2 cache, allowing it to do far more on-die instead of fetching data as often from the higher-latency GDDR5 memory. This lets them get away with a narrower memory bus while still offering even higher performance (read up on the details and benchmarks of the GTX 860M, which has already been released).
 

neon neophyte

Splendid
BANNED
One detail not mentioned is how the Maxwell architecture uses a much larger L2 cache, allowing it to do far more on-die instead of fetching data as often from the higher-latency GDDR5 memory. This lets them get away with a narrower memory bus while still offering even higher performance (read up on the details and benchmarks of the GTX 860M, which has already been released).

that's a bold strategy cotton. let's see if it pays off for 'em.
 

MasterMace

Distinguished
Oct 12, 2010
1,151
0
19,460
If it's not a successor to the 780 Ti (15 SMX), then it's likely a Titan successor (14 SMX) or a 780 successor (12 SMX). Using the Maxwell model, a 15-SMX successor would likely have 25 SMMs and 5 GPCs. What gets tricky is the ROPs, as we haven't seen a multi-GPC Maxwell yet. I believe they are scaling directly with the GM107, which would give 16 ROPs, 2MB of L2 cache, and a 128-bit memory slice per GPC. Disable 1 SMM and you have:

5 GPCs
24 SMMs
10MB L2 Cache
3072 Shader Cores
192 Texture Units
80 ROPs
640-Bit Memory Bus
6GB GDDR5 RAM

If it's a GTX 780 successor, disable 1 GPC:

4 GPCs
20 SMMs
8MB L2 Cache
2560 Shader Cores
160 Texture Units
64 ROPs
512-Bit Memory Bus
3GB GDDR5 RAM

I personally would love this. But if we're throwing out rumors, then here's a rumor via logic.
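The scaling logic in the post above can be sketched as quick arithmetic. This is purely speculative, mirroring the poster's assumption that GM107's per-unit resources (128 CUDA cores and 8 texture units per SMM; 16 ROPs, 2MB of L2, and a 128-bit memory slice per GPC) scale linearly in a bigger Maxwell chip; none of these configurations are confirmed:

```python
# Speculative per-unit scaling, following the assumption above that a big
# Maxwell chip scales GM107's resources linearly:
#   per SMM: 128 shader cores, 8 texture units
#   per GPC: 16 ROPs, 2 MB L2 cache, 128-bit memory bus slice
def maxwell_specs(gpcs, smms):
    return {
        "shader_cores": smms * 128,
        "texture_units": smms * 8,
        "rops": gpcs * 16,
        "l2_cache_mb": gpcs * 2,
        "memory_bus_bits": gpcs * 128,
    }

# Titan successor: 5 GPCs, 25 SMMs with 1 disabled
titan_successor = maxwell_specs(gpcs=5, smms=24)
# 3072 cores, 192 TMUs, 80 ROPs, 10 MB L2, 640-bit bus

# GTX 780 successor: 1 GPC disabled
gtx780_successor = maxwell_specs(gpcs=4, smms=20)
# 2560 cores, 160 TMUs, 64 ROPs, 8 MB L2, 512-bit bus
```

Matching the numbers in the two lists above, which suggests the poster applied the same linear scaling.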
 

Not this stupid myth again...

The GK104 was a high-end GPU. It's almost as big as AMD's Tahiti, and much bigger than AMD's mid-range GPU at the time, Pitcairn.

If you want to get into the discussion of who got the most out of each square mm of die, then it's AMD: the R9 290X is only slightly slower than the GTX 780 Ti, even though Hawaii is much smaller than GK110.

The size difference between GK110 and Hawaii is 123 square mm. The difference between Tahiti and GK104 is only 58 square mm. And the difference from what you call Nvidia's high-end GPU, the GK110, to what you call AMD's high-end GPU, the Tahiti, is a whopping 209 square mm. There's no way these GPUs are in the same league. It's like comparing a Humvee to a tank.
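The differences quoted above are consistent with the die sizes commonly cited in press coverage of these chips. The absolute figures below (roughly 561mm² for GK110, 438mm² for Hawaii, 352mm² for Tahiti, 294mm² for GK104) are assumptions on my part, not numbers from the post:

```python
# Approximate die sizes in mm^2, as commonly reported in press coverage.
# These absolute figures are assumptions; only the differences appear above.
die_mm2 = {"GK110": 561, "Hawaii": 438, "Tahiti": 352, "GK104": 294}

gk110_vs_hawaii = die_mm2["GK110"] - die_mm2["Hawaii"]   # 123 mm^2
tahiti_vs_gk104 = die_mm2["Tahiti"] - die_mm2["GK104"]   # 58 mm^2
gk110_vs_tahiti = die_mm2["GK110"] - die_mm2["Tahiti"]   # 209 mm^2
```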
 
G

Guest

Guest
There was some earlier mention by Nvidia of a built-in ARM processor in the top Maxwell chips, which was rumored to be for processing high-def audio for HDMI. I'm wondering if this processor could also be programmed with CUDA to do other tasks, like a compute boost, or at least partially offloading physics overhead. I guess we'll have to wait a while after the review samples ship. No matter what, new hardware releases from the big three are always pretty exciting for me :).
 

kiniku

Distinguished
Mar 27, 2009
246
68
18,760
I'm using a GTX 580. It's a wattage vampire for sure, but still a very fast card for my purposes... mostly MMOs, but some others. But an "860" series card may be my next GPU upgrade, depending on how it pans out.
 

Lessthannil

Honorable
Oct 14, 2013
468
0
10,860


GK110 had such poor yields that it couldn't be used for the GTX 680. GK100 was also cancelled. Keep in mind that GK110 was around a 550mm^2 die with a new architecture put on a relatively new 28nm process. That is pretty much the perfect storm for getting bad yields.
 