Report: Nvidia GeForce Titan Might Outperform GTX 690


childofthekorn

Honorable
Jan 31, 2013
359
0
10,780
The comments from fanboys stating that the 600 series was set to clocks well below what it could have handled just to keep the competition alive make me giggle. So why did you pay so much for something that was set to perform under its possible specs? Both companies do this to get back as much of the money they spent on research as they can.

I am both a Radeon and a GeForce user. I'll be interested to see the specs, but I'm glad I held out this generation if this is true.
 

ikyung

Distinguished
Apr 17, 2010
566
0
19,010
[citation][nom]A Bad Day[/nom]But will it outperform 8990 and 790 considering the fact that the Radeon 8000s and GTX 700s series are coming in the spring?[/citation]
This is a single GPU. The 8990 and 790 will be dual-GPU cards, and dual-GPU cards come later, after the single-GPU line like the 8950/8970 and 770/780. So, what we would have to see is:
1) Can the 8970 match this card? If it can match the performance, will it be cheaper?
2) By the time the 8970 comes out, will Nvidia have released single-GPU cards that outperform the Titan?

Now, the above information is all rumor, so we'll have to wait and see in February whether these beasts can really outperform a 690 by a 15% margin. If they really can... then we'll have to see what AMD throws out to match them.
 

Luscious

Distinguished
Apr 5, 2006
525
0
18,980
Let me guess - putting four of these in your rig for F@H will translate into some wicked PPD.

Now all we need is for Asus to put two GK110s on a triple-slot card and slap the Mars III name on it. Can you imagine scoring X27,000 in 3DMark 11?
 

spagalicious

Distinguished
This means nothing but more power for enthusiasts. There's no need for anything more powerful; the majority of games use the same amount of graphics power that was available two years ago.

Obviously there are a few that are demanding, but there are very few people who spend over $500 on GPUs, let alone $900!

Now instead of playing BF3 at 105 fps with your 690, you can play at 125! Hahaha, Nvidia laughs all the way to the bank.
 

RADIO_ACTIVE

Distinguished
Jan 17, 2008
897
0
18,990
Now we just need developers to create something that will actually push these cards to their limits. I still see no reason to upgrade with the state of games today (unless you're an enthusiast).
 

redeemer

Distinguished
Each company, in this case Nvidia and AMD, has an idea of where the other is going to be performance-wise every generation. So the notion that a mid-range part like the GK104 just happened to compete with AMD's high-end 7970 is absurd. The GK110 was always going to be a compute part this time around; Nvidia wanted compute stripped out of its 600 series so it could prop up its Tesla line of products.

 

lpedraja2002

Distinguished
Dec 8, 2007
620
0
18,990
[citation][nom]spagalicious[/nom]This means nothing but more power for enthusiasts. There's no need for anything more powerful. The majority of games use the same amount of graphics power available 2 years ago. Obviously there are a few that are demanding but there are very few people that spend of over $500 on GPU's let alone $900!Now instead of playing BF3 at 105 fps with your 690, you can play at 125! Hahaha Nvidia laughs all the way to the bank.[/citation]

What a foolish assumption. This is the same as saying that high-performance cars are not needed because a 3-cylinder vehicle can take me anywhere I want at 60 mph. If some people like to buy the most expensive graphics card each generation to power their 6-monitor setups, then of course more power is needed to run everything fluidly.

I can't afford such things, but more power means cheaper "old-gen" products for me, so I say to Nvidia and ATI: bring whatever the hell you can, so I can upgrade from my GTX 550 Ti lol.
 

wdmfiber

Honorable
Dec 7, 2012
810
0
11,160
[citation][nom]dragonsqrrl[/nom]This would only have been the case if gk110 had been ready at the time, it was not. It was still in development. If the HD7970 had performed a step better, in other words if an HD7870 had performed competitively with a GTX680, Nvidia would've been in trouble. Nvidia was fortunate that they could compete with the HD7970 with their mid-range GPU, because that's all they had at the time.[/citation]

Nvidia wasn't actually fortunate. They had to hack the compute performance out of the GK104 to do it. I'm not sure what went wrong with the GK110; after all, TSMC makes the 28nm GPU parts for both Nvidia and AMD. But whatever the case, the yields are good now and the Titan is coming!

And the GTX 780 and Radeon 8970 shouldn't be much of a worry. They're more of a refresh/name change on the same 28nm tech. New GPUs won't be out till ~2014, on 20nm.
 

spagalicious

Distinguished


Ah, but the foolish assumption is that everyone shares your views. Cars are a silly comparison to computers. It's not like a 6950 will "just barely cut it". The vast majority of people play at 1080p, and if you think the market thrives on those who have "6 monitors" and want to push things to the limit, that's like saying keep pushing out Ferraris because it lowers the price of my Audi.
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
SLI scaling is 85-90% faster than a single card, but the GK110 doesn't have double the specs of the GK104. The GK104 is a Kepler and the GK110 is also a Kepler, so how could it scale beyond 100% when the specs aren't doubled? The GTX 680 is a 175W TDP part; twice that is a 350W TDP. I don't believe TSMC's 28nm has matured enough to keep such a chip running below 300W.

Don't believe this sh*t. Rumors were saying the GK110 was the GTX 680; look what happened.
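
A quick back-of-the-envelope sketch of that scaling argument, written as Python for clarity. The 175 W figure and the 85-90% SLI scaling are the poster's assumptions, not confirmed specs (Nvidia's listed GTX 680 TDP is 195 W):

# Back-of-the-envelope check of the SLI-scaling argument above.
# The numbers are the poster's assumptions, not confirmed specs.
gtx680_tdp_w = 175        # poster's figure; Nvidia lists 195 W
sli_scaling = 0.875       # ~85-90% gain over a single card

# A GTX 690 is roughly two GTX 680s with SLI scaling applied, so a
# single card beating it needs ~1.85-1.9x the performance of a GTX 680.
gtx690_relative_perf = 1 + sli_scaling

# If performance scaled linearly with power on the same 28 nm process,
# the required TDP would land well above the ~300 W practical limit:
required_tdp_w = gtx680_tdp_w * gtx690_relative_perf
print(f"~{required_tdp_w:.0f} W at GTX 680 performance per watt")   # ~328 W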
 

Star72

Distinguished
Aug 5, 2010
179
0
18,690
[citation][nom]samuelspark[/nom]If it outperforms the 690, it'll outperform the 7990[/citation]

It will still be too expensive for a large share of consumers, though; that's the point.
 

wdmfiber

Honorable
Dec 7, 2012
810
0
11,160
@Tomfreak, that is a brilliant deduction. Thanks for taking a closer look at the math.

Benchmarks:
http://www.tomshardware.com/reviews/radeon-hd-7970-ghz-edition-review-benchmark,3232-7.html
So indeed, when taking TDP into account the gigaflops per watt seem too high for 28nm.

Maybe it's a prototype Maxwell GTX 880? lol... unlikely, someone photoshopped it :(
(GTX 680 TDP is 195 watts | 690 is 300 watts | Titan is 235 watts)
My rough math says that to score 7100 in 3DMark 11, the TDP would have to be ~375 watts!
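
A minimal sketch of that rough math, assuming the 3DMark 11 score scales linearly with board power from a GTX 680 baseline. The linear-scaling assumption and the helper function are illustrative only, not a measured relationship:

# Rough perf-per-watt check of the ~375 W estimate above, assuming the
# 3DMark 11 score scales linearly with board power on the same process.
gtx680_tdp_w = 195           # figure quoted above
rumored_titan_score = 7100   # rumored Titan 3DMark 11 score
estimated_titan_tdp_w = 375  # the poster's rough estimate

# GTX 680 baseline score implied by that estimate under linear scaling:
implied_gtx680_score = rumored_titan_score * gtx680_tdp_w / estimated_titan_tdp_w
print(f"Implied GTX 680 baseline: ~{implied_gtx680_score:.0f} points")  # ~3692

# For whatever GTX 680 score you trust, the same linear-scaling TDP estimate:
def linear_tdp(baseline_score, target_score, baseline_tdp_w=gtx680_tdp_w):
    return target_score / baseline_score * baseline_tdp_w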
 
Simply amazing. Current and last-gen cards are already 'good enough' to keep up with the console ports coming with the next-gen consoles, and now we find ourselves in the midst of a graphics arms race between AMD and Nvidia BEFORE the consoles are released.

What does this mean? It means that by the time the consoles are released, launched at prices that are already sold at a loss, you will be able to purchase a low- to mid-range GPU for less money and still manage just fine in future games.

Personally, I am still waiting another year before upgrading my GTX 570.
 

quark004

Honorable
May 18, 2012
37
0
10,530
I hope AMD will have an answer for this card, because let's be honest, the only thing that will drive down the cost of this thing is some competition from the red team.
 

Guest

Guest
I have no need to upgrade my 5870 ATM; unless you play tech demos like Crysis, there is no need.

But since 1080p is and will be the sweet spot for quite some time, I may upgrade to this card, as it should produce a solid 60+ FPS in upcoming games... though I may wait for the Source engine 2 announcement first. I don't like the idea of dual video cards; it's a shame they still have micro-stuttering. The companies know this, because with triple SLI/CrossFire there is barely any, which is why they only allow dual-GPU cards to have two links on a single card... pretty pathetic.
 

knowom

Distinguished
Jan 28, 2006
782
0
18,990
I bet 4K displays had something to do with this sudden, unexpected bump in performance, knowing graphics cards will have a lot more pixels to push a few years down the pipeline.
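
For scale, the raw pixel counts involved (resolutions only; this says nothing about how much GPU power any particular game needs):

# 4K UHD pushes exactly four times the pixels of 1080p.
pixels_1080p = 1920 * 1080          # 2,073,600
pixels_4k_uhd = 3840 * 2160         # 8,294,400
print(pixels_4k_uhd / pixels_1080p) # 4.0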
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780
[citation][nom]swordrage[/nom]So I guess the much(!!) hyped unlimited detail from Euclideon thing can be released now.http://www.youtube.com/watch?v=DrBR_4FohSE[/citation]

Unfortunately, if you didn't already know, point cloud data is mainly for artsy-type stuff. If you look for the videos where it's shown in action, any animations are extremely clunky or literally fall apart. It sounds awesome in theory.
 