Nvidia 7XX series this December


Guest

Guest
http://www.geeks3d.com/20120830/geforce-gtx-780-gk110-in-preparation/ :eek: :eek:

.......... :D
 

dudewitbow

Dignified
I doubt it will be released in December (holiday season shopping?). They're currently letting the lower models get pushed out and saturate the market a bit. If they released it too early, some people would postpone their GTX 600 purchases knowing they're about to be replaced, especially the upper-end models like the x70s and x80s. I'd expect March or April.
 
AMD might throw something new out to market before the end of the year (probably not, but maybe), but I highly doubt that Nvidia will. They haven't even completed their current line-up, and it hasn't even been a year since the GTX 680 launch, let alone since supply of Nvidia's cards finally caught up. It also wouldn't make much sense to me for Nvidia to devalue the low-end and lower mid-range Kepler cards right after their launch with the implication of GeForce 700 cards launching in the following months.
 
A lot of sources for releases like this are from Europe or Asia, regardless of where the company is located. They seem to get info like this earlier than the Americas in many cases (although they also seem to spread a lot more lies than truth, and when they're accurate it may have simply been lucky guessing).
 

Guest

Guest
I always see funny news like this from some of these sites. I like reading them, especially the readers' comments accompanying them. Always hilarious.
 
Just another site trolling for traffic.
There have recently been a few sites that should be ignored; wccftech is the main one that I have seen repeatedly linked as a source for info on these and other boards.
wccftech does nothing more than parrot news and rumour that has already been around for a while.

On the subject of the cards in question: well, as with wccftech news, this is nothing more than what has been speculated on forum boards for a good long while now.
It's quite likely that there will be cards based on GK110 chips with disabled units at some point; where and when is all speculation at this time.

Mactronix :)
 
For most people now, the buy-or-wait inner debate has shifted from needing more performance to needing to justify the expenditure. I know it has for me, anyway.

Chances are that if you already have a top end rig then that's not an issue you need to worry about.
If, like the vast majority, you have a PC that you like to game on but that is not really a gaming rig, then what you have is perfectly acceptable performance-wise, so you don't need to upgrade and certainly can't justify spending more than you paid for your last upgrade. Especially when what you're going to get is only a small increase in performance over what you already have.

I may be wrong, but I feel that Redcode is not entirely sincere about being about to buy a 680. Humour doesn't transfer too well in text form.

Mactronix :)
 

Kari

Splendid

Actually it's Swedish...
SweClockers and all. :sol:


Anyway, when they release a card based on the GK110 chip I don't think they're going to change the rest of the line-up; they'll just price the GK110 cards accordingly (i.e. really expensive)...
 
For me the timing of the release of any GK110 powered cards will depend on yields. It also depends on AMD being competitive.

Right now things are pretty even, and also damn expensive at the top end. Last round AMD released first and set the baseline for pricing. Due to stupidly high AMD pricing, and the people who bought at those prices, that is where we are now.

Nvidia really lucked out in this scenario: because AMD is so uncompetitive, Nvidia has been able to release a card designed to be mid-range at what represents top-end performance. Nvidia's actual top-end card has been very problematic, which is why I say they were lucky.
Had AMD made a performance improvement this round on the level that Nvidia managed, Nvidia would be all at sea at the top end.

This has allowed Nvidia to use 100% of the chips they are getting from GK110 wafers for the professional market and the huge profit they represent.
While this is going on, I have no doubt they are binning the defective chips for use as either lower-end professional chips or retail chips. What exactly they use those retail chips for is quite open; with no pressure from the competition, they could develop new marketing strategies.

Mactronix :)
 

Kari

Splendid
^^Well, the GK110 chip is huge; it seems Nvidia has been quite optimistic about TSMC's process for a few generations already. Had AMD designed a similarly sized chip, they would be in the same boat with yield issues etc...
 

redeemer

Distinguished
May 30, 2004
2,470
0
19,960
90



Is TSMC ready for 500-550mm² 28nm dies? Either way the GTX 780 will be a hot, hungry beast: maybe a 15-20% increase in gaming performance over the 680, but for sure a massive increase in compute! If AMD launches the 8000 series soon, then Nvidia will paper launch, guaranteed!
 


AMD isn't being non-competitive. The GF110/GF100 chips, for example, were huge 530mm² behemoths, but they barely beat out the far smaller Cayman (370mm² or something like that). Nvidia abandoned hot-clocking, compute performance, memory bandwidth, and more so that they could fit as many cores on a GPU as possible. So they now have far more GPU throughput per mm² of die area, but they also have inferior efficiency in tessellation, AA, and compute-oriented features (such as PhysX and DirectC lighting features). Nvidia doesn't need huge dies to compete with AMD's smaller ones anymore, so Nvidia's huge dies can be reserved for the professional markets, where the high prices can make up for the yield problems.
 

redeemer

Distinguished



Yeah, but we are not really sure about the direction Nvidia is going to go with the GK110, assuming that the consumer part will be scaled down from 7 billion transistors. Wider memory bus, more SMX units... I mean, the next refresh will still be Kepler-based. You're right that Nvidia doesn't need huge dies to compete, but only for a gaming part; if they want to compete with AMD's GCN, then yeah, their dies will have to grow for the compute aspect.
 


The dies might have to grow for compute competition, but my point was that Nvidia no longer needs far larger dies for gaming competition with AMD.
 

redcode

Honorable
Sep 7, 2012
50
0
10,640
2
Hey Mactronix.. I was going to sell my $330 GTX 560 Ti for $200.. and add about $530 to that to get the GTX 680.. Now with this news and these specs I don't think I should spend that much money on a card that'll be outperformed by next year.. Get me now? Hope you do!
 

redcode

Honorable

Well, that makes sense.. I'll see what to do.. Buying a GFX card needs a lot of consideration, especially when you're a video editor..
 



Comparing GF100/110 and how they only just beat out Cayman is just stating that I am right. I really don't understand how you can argue otherwise.
You're saying Nvidia used to need huge great big chips to beat AMD. Correct.

Key phrase here: "used to". That means they have made an improvement. I'm stating that AMD has not kept pace with this improvement, and this has made them uncompetitive this round.

I'm saying that this round AMD is not competitive, not when you compare performance increases from generation to generation.
The fact that Nvidia has a chip that is 23% smaller (well, it would be, as it's only Nvidia's mid chip) and still outperforms AMD's top chip should be enough proof for even the biggest AMD fan that they are just not competitive this time around.

AMD had a ~30% performance increase from their old top-end chip to their new top-end chip.
Nvidia had a ~40% increase from a mid-range chip that outperforms their old top card by 20% and outperforms AMD's top-end card.
And they made a chip that is smaller than Cayman, Tahiti, or the GF114 that was their old mid-range.
They actually achieved a greater reduction in die size from generation to generation than AMD and still outperformed them.
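A quick back-of-the-envelope check of the die-size side of that argument. The figures below are commonly cited approximate die areas (GF110 ≈ 520 mm², GK104 ≈ 294 mm², Cayman ≈ 389 mm², Tahiti ≈ 365 mm²) and should be treated as rough, not official:

```python
# Rough generation-to-generation die-area comparison.
# All die sizes are approximate, commonly cited figures.
dies_mm2 = {
    "GF110": 520,   # Nvidia top chip, Fermi
    "GK104": 294,   # Nvidia Kepler chip (GTX 680)
    "Cayman": 389,  # AMD top chip, HD 6900 series
    "Tahiti": 365,  # AMD top chip, HD 7900 series
}

def reduction(old, new):
    """Percent shrink going from the old die to the new die."""
    return 100.0 * (old - new) / old

nvidia_shrink = reduction(dies_mm2["GF110"], dies_mm2["GK104"])
amd_shrink = reduction(dies_mm2["Cayman"], dies_mm2["Tahiti"])
gk104_vs_tahiti = reduction(dies_mm2["Tahiti"], dies_mm2["GK104"])

print(f"Nvidia GF110 -> GK104: {nvidia_shrink:.0f}% smaller")
print(f"AMD Cayman -> Tahiti:  {amd_shrink:.0f}% smaller")
print(f"GK104 vs Tahiti:       {gk104_vs_tahiti:.0f}% smaller")
```

With these numbers Nvidia's generational shrink (~43%) does come out far larger than AMD's (~6%), supporting the point above, although GK104 works out to roughly 19% smaller than Tahiti rather than 23%; the exact percentage depends on which die-size figures you trust.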

If GK110 did exist as a retail chip it would mean AMD had nothing with a single chip that could even get close.

Chip for chip, AMD is way behind. As an example, so you can't misread my meaning: GK104 is the equivalent of Pitcairn, chip for chip. I'm sure you can progress up and down the line-ups from there.

I mean, come on, you think AMD cards are better than Nvidia cards at PhysX, for Christ's sake.

Nvidia abandoned hot-clocking, compute performance, memory bandwidth, and more so that they could fit as many cores on a GPU as possible. So they now have far more GPU throughput per mm² of die area, but they also have inferior efficiency in tessellation, AA, and compute-oriented features (such as PhysX and DirectC lighting features).
Mactronix :)
 


Nvidia did not improve... Nvidia sacrificed features to pretend to improve. Your performance numbers are unbelievably off and far too basic to be accurate.

I never said anything about AMD having better PhysX performance. That wouldn't make sense considering that AMD doesn't support it unless you use a supplementary Nvidia card in unison with an AMD card. I said that they sacrificed all of that relative to Fermi.

Kepler has less tessellation efficiency, less AA efficiency, less PhysX efficiency, and less DirectC efficiency than Fermi. When you use these features and make comparisons, it soon becomes obvious that Nvidia didn't improve; they sacrificed pretty much everything in order to pretend to. Kepler is optimized for BOM, not for modern performance. That's why AMD has consistently higher minimum FPS (far more important than maximum FPS), consistently higher efficiency with any of these features (obviously excluding PhysX), and far more overclocking headroom and efficiency. Also, Nvidia doesn't outperform AMD anymore. They only did when AMD didn't have proper driver support.
 
