Nvidia’s GF100: Graphics Architecture Previewed, Still No Benchmarks

I really had hoped the GF100 was going to have three video outputs. I was hoping to drive two desktop displays plus a third LCD TV for movies.
 
This GF100 reminds me of the launch of the Radeon HD 2900: a 512-bit memory bus, a huge die for its time and manufacturing process, regarded as the GeForce killer. Two weeks late, and the fans said it was just suspense; two months late, with lots of specs and promises; and finally, many months later, it turned out to be just an expensive blunder. Will Nvidia do the same???
 
Hmm, I'm not crazy about the power consumption. Even with the new 40nm process, Nvidia says it will draw more power than a GTX 285? Sorry, that's a lot of power mixed with its compute ability and its 3 billion transistors. How hot is this thing going to run? How large is it going to be? And don't get me wrong, I don't take sides; my last computer had an Nvidia graphics card. But my 5850 isn't the largest card on the market by any means, yet it's powerful without the size and it stays cool even when overclocked. I wonder if heat is as much of an issue for this card as it sounds like it might be, and if so, how practical it will be to overclock such a card, and how far?

I have to say, I'm curious about the programmable caches, though.
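For the curious: on Fermi, the "programmable cache" refers to the 64 KB of per-SM memory that can be split between L1 cache and shared memory (48/16 or 16/48). The CUDA runtime exposes this as a per-kernel preference via cudaFuncSetCacheConfig. A minimal sketch, assuming the standard CUDA runtime API; the kernel itself is a made-up example:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel that stages data through shared memory, so it would
// benefit from the larger 48 KB shared-memory split. Assumes 256 threads/block.
__global__ void stageAndScale(const float* in, float* out, int n)
{
    __shared__ float tile[256];                       // per-block staging buffer
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    tile[threadIdx.x] = (i < n) ? in[i] * 2.0f : 0.0f;
    __syncthreads();                                  // whole block syncs here
    if (i < n)
        out[i] = tile[threadIdx.x];
}

int main()
{
    // Hint to the runtime: favor shared memory (48 KB shared / 16 KB L1 on
    // Fermi-class parts). cudaFuncCachePreferL1 flips the split the other way.
    cudaFuncSetCacheConfig(stageAndScale, cudaFuncCachePreferShared);
    // ...allocate device buffers and launch stageAndScale<<<blocks, 256>>>(...)...
    printf("cache preference set\n");
    return 0;
}
```

Note it's a hint, not a demand; the driver can fall back if the requested configuration isn't available.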
 
I'm curious about the idle power consumption; I have a feeling it's not going to be good. For most hardcore gamers it won't matter. Heck, they'd SLI four of them together even if it drew 2 kW at the wall (isn't that what a 20 A circuit is for!?), as long as it performed better. But I'm thinking it's not going to be good at idle.
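For what it's worth, that 2 kW joke lands right at the real ceiling. A quick back-of-the-envelope check, assuming a North American 120 V branch circuit and the common 80% continuous-load guideline (both assumptions on my part, not anything from the article):

```cuda
#include <cstdio>

// Rough circuit-headroom check; the rig wattage is the joke figure above.
int main()
{
    const float volts  = 120.0f;
    const float amps   = 20.0f;
    const float usable = volts * amps * 0.8f;   // ~1920 W continuous
    const float rig    = 2000.0f;               // hypothetical quad-SLI draw
    printf("usable %.0f W vs. rig %.0f W -> over by %.0f W\n",
           usable, rig, rig - usable);
    return 0;
}
```

So a true 2 kW rig would already exceed the continuous rating of that 20 A circuit, never mind the monitor plugged in next to it.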
 
I didn't wait 'til things materialized; I just got an HD 5770. My biggest concerns are heat generation and power consumption. I only have one 20 amp breaker for both my office and my wife's. ATI seems to be the best solution for both right now. Maybe I need to factor rewiring the house into my next upgrade.
 
[citation][nom]henrystrawn[/nom]I didn't wait 'til things materialized; I just got an HD 5770. My biggest concerns are heat generation and power consumption. I only have one 20 amp breaker for both my office and my wife's. ATI seems to be the best solution for both right now. Maybe I need to factor rewiring the house into my next upgrade.[/citation]

You shouldn't have to. As die sizes and transistors shrink, a card should use less power while at the same time becoming more powerful. So graphics cards and processors should start using less and less power (nothing like astounding leaps in power saving, but "less" being the key word).
 


Only if they decide to keep performance relatively similar.

If that were true on its own, there wouldn't have been any need for the boost in available power from AGP to PCIe 1.0 and then to PCIe 2.0.

Remember also that smaller transistors don't mean less power if bad side effects like leakage aren't kept in check.
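To make that concrete: dynamic power goes roughly as capacitance × voltage² × frequency, and a shrink only saves power if the freed-up capacitance isn't immediately spent on more transistors and higher clocks while leakage creeps up. A toy model with made-up, normalized numbers (illustrative only, not measured figures):

```cuda
#include <cstdio>

// Illustrative only: why a die shrink alone doesn't guarantee lower power.
// Dynamic power ~ C * V^2 * f; all values below are invented and normalized.
int main()
{
    // Old node: baseline capacitance and clock, higher voltage.
    float dynamic_old = 1.0f * (1.2f * 1.2f) * 1.0f;         // C * V^2 * f = 1.44
    float leakage_old = 0.15f;                               // leakage share
    // New node: ~0.7x capacitance per transistor, but 2x the transistors,
    // lower voltage, slightly higher clock, and a bigger leakage share.
    float dynamic_new = (0.7f * 2.0f) * (1.0f * 1.0f) * 1.15f;  // = 1.61
    float leakage_new = 0.30f;
    printf("old: %.2f  new: %.2f (normalized total power)\n",
           dynamic_old + leakage_old, dynamic_new + leakage_new);
    return 0;
}
```

Same shrink, more performance, higher total power, which matches the "more power than a GTX 285" worry earlier in the thread.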
 
Nvidia Fermi 360, aka GeForce 360, and Fermi 380, aka GeForce 380, release March 2nd-10th, 2010. Fermi 395, aka GeForce 395, release around May 10th, 2010. All schedules are prone to change.
360 vs. 5850 = 360 winner
380 vs. 5870 = 380 winner
ATI 5830 by February 5th, 2010. Specs: 625 MHz GPU core, 800 MHz GDDR5 memory, 1280 shaders, same PCB, cooling, and 256-bit bus as the 5850 (rebinned 5850s).
AMD to release the Phenom II X4 975: 3.6 GHz, 2 MB L2 cache, 6 MB L3 cache, 140 W TDP, in Q1 2010.
AMD Phenom II Thuban X6 1075T: 2.8 GHz, 2 MB L2 cache, 6 MB L3 cache, 125 W TDP, by May 2010.
A new stepping arriving from the Opterons, reducing Deneb and other AMD CPUs by 15 W of TDP ;-D.
All info should be taken with a few grains of salt ;-D.
 