Nvidia’s GF100: Graphics Architecture Previewed, Still No Benchmarks


JonathanDeane

Distinguished
Mar 28, 2006
1,469
0
19,310
[citation][nom]randomizer[/nom]Call up NVIDIA and complain. They control what is and what isn't given to reviewers, and what is and isn't still under NDA.[/citation]

Usually when hardware is late and they keep everything close to the chest, in my opinion there is a problem with the product. I hope this is the exception and they just have a monster on their hands, and the performance is going to be even better than the new ATI line.

My crystal ball sees a stripped-down version that performs well now, then several revisions leading up to the "real" version being released 12 months from launch, when they get all the issues solved. (Hardly a prediction, since they do this a lot.)
 

mayne92

Distinguished
Nov 6, 2009
743
0
18,980
[citation][nom]JonathanDeane[/nom]Usually when hardware is late and they keep everything close to the chest, in my opinion there is a problem with the product. I hope this is the exception and they just have a monster on their hands, and the performance is going to be even better than the new ATI line. My crystal ball sees a stripped-down version that performs well now, then several revisions leading up to the "real" version being released 12 months from launch, when they get all the issues solved. (Hardly a prediction, since they do this a lot.)[/citation]
+1 Took the words right out of my mouth...
 

ethaniel

Distinguished
Jul 10, 2005
151
0
18,680
Guys, I know you're gonna get angry at this analogy, but if someone shows me a cannon, I want to see that cannon being fired. Have you seen the demos? Non-moving cars? Water? Nvidia's whitepaper was a joke. For all we know, the GF100 could be cancelled tomorrow...
 

mjello

Distinguished
Jun 4, 2009
72
0
18,630
With the drastic changes, especially out-of-order execution, I think the first drivers will be buggy as hell. Or rather, slow at any generic workload compared to the horsepower behind them.
 

dingumf

Distinguished
Apr 20, 2009
313
0
18,780
Fermi: half a year late, still no specs, no price, no real benchmarks, a laughable TDP, wood screws (the fake Fermi), and 1.7% yields?

You'd think I'm lying, but you all know it's true. FERMI IS A FAILURE

 

anonymous x

Distinguished
Apr 22, 2008
121
0
18,680
[citation][nom]dingumf[/nom]Fermi: half a year late, still no specs, no price, no real benchmarks, a laughable TDP, wood screws (the fake Fermi), and 1.7% yields? You'd think I'm lying, but you all know it's true. FERMI IS A FAILURE[/citation]
Does your name happen to be Charlie? :p
We have some specs (read the article), benchmarks (Far Cry 2, Dark Void, and part of the Unigine Heaven DX11 demo), and the rest were rumors that have been proven false (except the wood screws).
 
Guest
They need rather high clock rates to keep up with the 5870. They could probably launch today if they clocked it at 400 MHz, but then they'd lose the performance crown. Nvidia's executives are expecting their engineers to perform miracles; if yields suck anyway, it's going to be really hard to bin these chips well enough to release at a clock speed sufficient to beat the 5870. I just don't see them pulling a rabbit out of their hat on this one. Hence the reason they won't finalize the clock speed.

All of the bleeding-edge features in the world won't save Nvidia if the performance is low and the price is high.
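
To put the binning point above in concrete terms, here's a toy Monte Carlo sketch in C++; every number in it (the clock distribution, the 400/650 MHz targets) is invented for illustration, not real GF100 data:

[code]
// Toy model of speed binning: each die comes out of the fab with some
// maximum stable clock; you can only ship dies that clear your target.
// All numbers below are invented for illustration, not real GF100 data.
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);
    // Hypothetical spread of max stable core clocks across dies (MHz).
    std::normal_distribution<double> maxClock(600.0, 60.0);

    const int dies = 100000;
    int bin400 = 0, bin650 = 0;
    for (int i = 0; i < dies; ++i) {
        double c = maxClock(rng);
        if (c >= 400.0) ++bin400;  // conservative "launch today" clock (assumed)
        if (c >= 650.0) ++bin650;  // clock assumed needed to beat the 5870
    }
    printf("shippable at 400 MHz: %5.1f%% of dies\n", 100.0 * bin400 / dies);
    printf("shippable at 650 MHz: %5.1f%% of dies\n", 100.0 * bin650 / dies);
    return 0;
}
[/code]

With those made-up numbers, nearly every die clears 400 MHz but only about a fifth clear 650 MHz; push the target clock higher and the shippable fraction collapses, which is exactly the bind described above.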
 

kal20mx

Distinguished
Oct 18, 2008
151
0
18,680
But what happens if it's faster than the 5970 and, on top of that, adds 3D? I've been wanting to play in 3D for a very long time now.
 

dsolom3

Distinguished
Apr 10, 2006
45
0
18,530
[citation][nom]anonymous x[/nom]Does your name happen to be Charlie? We have some specs (read the article), benchmarks (Far Cry 2, Dark Void, and part of the Unigine Heaven DX11 demo), and the rest were rumors that have been proven false (except the wood screws).[/citation]

Charlie usually writes in better Inglish.
 

falchard

Distinguished
Jun 13, 2008
2,360
0
19,790
[citation][nom]kal20mx[/nom]But what happens if it's faster than the 5970 and, on top of that, adds 3D? I've been wanting to play in 3D for a very long time now.[/citation]

Stereoscopic display isn't an Nvidia-proprietary concept. They have a proprietary solution, but most companies developing 3D playback have chosen open solutions.

I would not be too worried about whether it can outperform an HD 5870; the problem is that it's arriving so late. It's like a stepping stone to the HD 6870. It has 3 billion transistors, so you know it will outperform the 2-billion-transistor HD 5870. However, ATI has been doubling its transistor counts yearly, and 6 months after the GF100 there will be an ATI part with 4 billion transistors, fabricated at GlobalFoundries on 28 nm.
 

alextheblue

Distinguished
[citation][nom]reynod[/nom]I imagine Nvidia will also be concentrating on ensuring the die is securely attached to the substrate. They won't want to cheese off the OEMs like last time[/citation]They cheesed off more than just OEMs with that freaking debacle.
 
Guest
If Fermi doesn't double the performance of a 5870, then it will be an absolute fail considering the 5800 series' features and power consumption. A 5870 is much more efficient than a GT200-series card, and Fermi requires even more power than GT200 to run; that's ridiculous at 40 nm. It will be the hottest single GPU ever. The only market I see for Fermi is people with deep pockets who don't care about efficiency. Seriously, we need two cards in SLI just to enable 3 monitors??? I am now a happy 5970 owner, but if Fermi gets 2x 5870 performance in games, I will switch of course ;-)
 

JonathanDeane

Distinguished
Mar 28, 2006
1,469
0
19,310
[citation][nom]elie3000[/nom]If Fermi doesn't double the performance of a 5870, then it will be an absolute fail considering the 5800 series' features and power consumption. A 5870 is much more efficient than a GT200-series card, and Fermi requires even more power than GT200 to run; that's ridiculous at 40 nm. It will be the hottest single GPU ever. The only market I see for Fermi is people with deep pockets who don't care about efficiency. Seriously, we need two cards in SLI just to enable 3 monitors???[/citation]

With the delays it has experienced, it has to be much better or it will fail, simply because if it's not tons faster, ATI's next card will be right around the corner to retake the high end. Someone earlier in the thread posted something about the drivers. That could be another issue: with this card being so different from anything out right now, it will probably take a couple of driver versions to get all the performance out of the silicon.
 

invlem

Distinguished
Jan 11, 2008
580
0
18,980
I'm still holding off until the reviewing community gets some demo units out...

I'm still fearing the possible return of the 'dustbuster' heatsink from the GeForce FX 5800 era.
 

Pei-chen

Distinguished
Jul 3, 2007
1,282
6
19,285
[citation][nom]randomizer[/nom]GF100 is entering the ranks of Duke Nukem Forever. We keep seeing little glimpses but the real thing might as well not exist.[/citation]
I expect better from you, randomizer. You are comparing a card that, when released, will be six months later than the competition but on time per Nvidia's development cycle, to a game that's 12 years late, has been "ready for release" four or five times, and comes from a developer with no credibility left.

Nvidia, ATI, Intel, and AMD have all stumbled in the past, but they have always released their products, good or bad. Comparing the GTX 300 series to DNF seems like a scare tactic to drive people waiting for GTX 300s to buy 5800s now, by telling them that whatever they are waiting for doesn't exist.

I believe people wanting to buy a high-end card should wait until the GTX 300's release before making their decision. If Nvidia is faster, they can choose between faster GTX 300s or price-dropped 5800s. If Nvidia is slower, they have a chance to pick up cheaper GTX 300s.
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
[GF100 design is derived from GT200, which itself was derived from the almost-infamous G80/G92]

It's more accurate to say that the design derived from the GT200 was discarded, and that Nvidia rescued an independent design, initially intended only as a GPGPU processor, to take the place of the GF100.
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
I don't care about three-display gaming. I do care about stereoscopic 3D. I'd prefer an ATI card, but they don't offer 3D. How come no one talks about that feature?
 

falchard

Distinguished
Jun 13, 2008
2,360
0
19,790
[citation][nom]hixbot[/nom]I don't care about three-display gaming. I do care about stereoscopic 3D. I'd prefer an ATI card, but they don't offer 3D. How come no one talks about that feature?[/citation]

They can do stereoscopic 3D. They just aren't running up and down the street proclaiming that they've made a closed hardware solution for it. It's up to the developer to include the code needed to implement it. A developer is left with a choice: use an OpenCL/DirectCompute method with two cameras, doubling the number of frames rendered per second, or use a solution that only Nvidia can offer, on cards that mostly don't have the muscle to display in 3D.
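
For anyone wondering what the two-camera approach actually involves, here's a minimal C++ sketch (Mat4, lookAt, and renderScene are made-up stand-ins, not any real engine's API); the point is that the whole scene gets drawn twice per frame, which is where the doubled workload comes from:

[code]
// Brute-force stereoscopy: render the scene twice per frame from two
// horizontally offset eye positions. Mat4/lookAt/renderScene are
// illustrative stubs, not a real engine API.
struct Vec3 { float x, y, z; };
struct Mat4 { float m[16]; };

Mat4 lookAt(Vec3 eye)           { /* build a view matrix (stub) */ return Mat4{}; }
void renderScene(const Mat4& v) { /* draw one full frame (stub) */ }

void renderStereoFrame(Vec3 cam, float interaxial /* eye separation, world units */)
{
    // Left eye: half the interaxial distance to the left of the mono camera.
    renderScene(lookAt(Vec3{cam.x - 0.5f * interaxial, cam.y, cam.z}));
    // Right eye: half the distance to the right; the display hardware then
    // presents each image to the matching eye (shutter glasses, etc.).
    renderScene(lookAt(Vec3{cam.x + 0.5f * interaxial, cam.y, cam.z}));
}

int main()
{
    renderStereoFrame(Vec3{0.0f, 1.7f, 5.0f}, 0.065f); // ~6.5 cm eye spacing
    return 0;
}
[/code]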
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
[citation][nom]falchard[/nom]They can do stereoscopic 3D. They just aren't running up and down the street proclaiming that they've made a closed hardware solution for it. It's up to the developer to include the code needed to implement it. A developer is left with a choice: use an OpenCL/DirectCompute method with two cameras, doubling the number of frames rendered per second, or use a solution that only Nvidia can offer, on cards that mostly don't have the muscle to display in 3D.[/citation]

Nvidia's solution doesn't need developer support, although that causes problems with some titles, in particular with 2D images and 2D masks such as numeric scores, progress bars, aiming sprites, etc.
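
The root of those 2D glitches: the horizontal shift the driver applies to each pixel depends on that pixel's depth, and flat HUD elements carry no meaningful depth. A toy C++ example using the textbook parallax formula (the separation and convergence values here are assumed tuning numbers, not Nvidia's actual driver settings):

[code]
#include <cstdio>

// Textbook stereo parallax for a point at view depth z:
//   parallax = separation * (1 - convergence / z)
// Points at z == convergence sit on the screen plane (zero shift).
float stereoParallax(float z, float separation, float convergence)
{
    return separation * (1.0f - convergence / z);
}

int main()
{
    const float sep = 0.05f, conv = 5.0f;  // assumed tuning values
    printf("geometry at z=20:        %+.3f\n", stereoParallax(20.0f, sep, conv));
    printf("geometry at z=5:         %+.3f\n", stereoParallax(5.0f, sep, conv));
    // A score counter or crosshair is drawn flat, with no real depth, so
    // the driver has to guess a z for it; guess wrong and it floats at the
    // wrong plane relative to the 3D scene behind it.
    printf("HUD sprite, guessed z=1: %+.3f\n", stereoParallax(1.0f, sep, conv));
    return 0;
}
[/code]

A crosshair that should sit on a target at z=20 but gets placed at a guessed depth ends up with the wrong parallax, which is exactly the aiming-sprite problem described above.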
 

aungee

Distinguished
Sep 17, 2008
12
0
18,510
This graphics card war is awesome!! If sources are correct, ATI will be releasing a product to reclaim the performance crown if Nvidia steals it with the GF100.

On another note, I think there is something fishy about the GF100 that Nvidia is trying to hide, and I think it has to do with power efficiency. This baby (the GF100) is gonna suck some serious juice to do all that fancy stuff Nvidia is teasing us with.
 

old_newbie

Distinguished
Feb 6, 2009
87
0
18,630
Was fortunate enough to get into CES and see the GF100 in action. First off, the GF100 was far from being showcased: "3D Gaming" was the topic in bold; "Powered by GF100" was in the fine print. Also, the rocket-sled demo was on a computer tucked away in a corner. You had to really be looking for it; I admittedly almost missed it (I was looking for "Fermi", not "GF100").
I am doubtful that the GF100 will be available in Q1. The rocket demo crashed a few times when the presenter switched on the vector-graphics overlay (to show all the physics being calculated on the rocket). Probably a software glitch, but at this stage in the game (and on an in-house demo) you would think there would be minimal bugs.

On a positive note, the NFS: Shift tri-screen 3D demo ran flawlessly on the GF100. There was no screen tearing like I noticed with Dirt 2 on ATI's tri-screen Eyefinity display.
 