GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!

Page 14
Status
Not open for further replies.

ien2222

Distinguished
[citation][nom]azgard[/nom]This is just a cop out of Nvidia's failure. According to Nvidia they seem to think this is a gaming card and that this is the computing card. Keep your fact's straight and don't try to skew them.[/citation]

azgard: It's hardly a cop-out; you just don't understand what's going on here. Until now, both ATI and Nvidia have designed gaming cards first, then tweaked that architecture to produce their professional cards. With Fermi, Nvidia is starting to flip that around: they are concentrating on the GPGPU side first and then applying it to the gaming side. There is far more money to be had on the professional/scientific side than on the gaming side, and they are going after it. It sucks that it didn't push the gaming envelope, but it's most definitely good for the company.
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
Hype always hurts video card releases; we are promised hardware that "blows away" the previous stuff, only to get a small step in improvement - most of the time.

However, part of this is because of the way performance is measured sometimes, and I think that applies in this case. The "frame rates" number is not the big picture here, but the overall emphasis of both ATI and Nvidia on picture quality - especially Nvidia.

I have to fault Tom's Hardware a bit here for ranking the cards as "better performers" because their no-AA numbers were higher, when the results would have been very different if the ranking had been done with 4xAA turned on. Nvidia has clearly made a bigger commitment in this generation of cards to picture quality, not just in extreme systems but for the average person in regular systems. Some AA is, in my opinion, a must unless you are willing to put up with a lot of jaggies on your LCD, so ranking cards based on their no-AA performance is about as appropriate as ranking them based on their 2D performance.

Well, maybe not, but you get my point :D
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290

I apologize; I didn't fully understand what you were saying regarding the manufacturing tech, although the wording in your post did seem to strongly suggest Global Foundries produced GPUs for ATI...

And I'm sorry you felt my post regarding Northern Islands was pretentious and distasteful, as that was not my intent. The predictions about the HD6000 series are based on facts, and all the information I provided outside the prediction itself is factual. A lot of people in this thread didn't seem aware of some of the details behind this situation, believing that ATI was on a 12-month generation cycle. Based on this false assumption alone, they went on to predict that the HD6000 series would be released in September, or in your case October 2010. What I'm trying to explain is that the situation is a lot more complicated than that, and based on current information, and on details ATI representatives themselves have released, it doesn't look like the Northern Islands GPU will come to market by the end of the year.

There has been a lot of discussion lately about ATI switching its manufacturing from TSMC to Global Foundries, and while there is a lot of validity behind these claims, excellent yields alone don't mean that the HD6000 series will be released any sooner than it would have been on TSMC's manufacturing process. In fact, it means the exact opposite: if ATI does decide to do a complete conversion to Global Foundries, it would more than likely mean a delayed Northern Islands compared to any potential release date using TSMC. Switching from one manufacturing tech to another is not a simple task, as no two manufacturing processes are the same (28nm at TSMC is very different from 28nm at Global Foundries), and it's usually a long, complicated, drawn-out process. It certainly wouldn't help if Northern Islands is planned to be ATI's first GPU put into production at Global Foundries: a high-end, high-transistor-count part. Various companies, including ATI, have historically 'tested' new manufacturing techs on smaller, less complicated parts in limited volumes, in order to simplify and smooth out the transition. The fact that ATI has mentioned a plan to insert graphics processors into Global Foundries over the "coming years" is further evidence of this, and if the HD6000 series is manufactured at Global Foundries, it probably won't be the entire lineup, for the reasons stated above.

Time is the biggest issue here, and I doubt ATI would risk the delays a complete manufacturing conversion of their high-end parts would entail, especially given their new business model. Yes, even if the yields turn out to be higher than TSMC's, that will only be so after the GPUs are up in production. So basically, if ATI does switch to 28nm at Global Foundries, Northern Islands, a.k.a. the HD6000 series, definitely won't be released this year (or at least has a lower chance of doing so than if it were manufactured at TSMC).
 

brisingamen

Distinguished
Feb 3, 2009
201
0
18,680
Good point. It's hard to find a lot of detail on the manufacturing process of chips; I find it very interesting and read whenever I can get my eyes on information. I think I confused the 4870 date with the 1GB version.

I personally feel there is a lot of headroom left in the 5870 design, so I can see why ATI wouldn't be in as much of a hurry to push things with the next series until they are just right.

I guess the questions that really remain are: what do you think Nvidia is going to be able to fix with Fermi over the rest of the year, and how much can ATI squeeze out of the 5870 design?
 
G

Guest

Guest
ATI is to be congratulated on the 5xxx series: it consumes less power, heats up less, and has roughly the same performance as the GTX 480 in some games. The GTX 480 beats the 5870 by only about 5% or 10% in some games, yet it consumes about as much as a 5970, and as a single-GPU card it is well behind the 5970.
 

tpi2007

Distinguished
Dec 11, 2006
475
0
18,810
[citation][nom]cardmonkey[/nom]Could you please verify the idle temps in a multi-monitor setup! According to this review it raises your idle to 90C: http://www.legitreviews.com/article/1258/15/ If that's true those cards shouldn't have been released.[/citation]

Oh my! Thanks for the link. Oh my, oh my - the GTX 480, notwithstanding its already high TDP, goes one better and consumes its full TDP at idle with two monitors!

Someone who has the money to go buy a card like that, probably has an older monitor too from a previous upgrade, and some might actually use it to have e-mail programs/internet browsers/mediaplayers/tv on.

Well, time for those people to go buy an air conditioner!
 

jeff77789

Distinguished
Jun 24, 2009
198
0
18,690
[citation][nom]snqwerty[/nom]I think that all the disappointment is useless. We are talking about new architecture, new API (DirectX 11). If you want performance gain, then you should wait for the updated revisions of this architecture, like the 8800GTS vs 8800GT. I believe that we should stick to what we have for a little bit more. Competition will not let us down in the coming months[/citation]

a.k.a. the 5970
 

mutantmagnet

Distinguished
Jun 12, 2009
14
0
18,510
Now that you have the GTXs in your possession I hope you do "Part 3: Building A Balanced Gaming PC."

I've been waiting forever for that one.
 

Intershield

Distinguished
Jun 7, 2009
8
0
18,510
I will be buying one, maybe not right away - probably in the summer.
The thing is, I have been burned too many times with AMD and ATI products and vowed never to support them again. I gave ATI a third chance not too long ago and it flopped on me again. Nvidia has yet to let me down with any of their products, even if they're not top of the line in current performance benchmarks.
 

seth89

Distinguished
Jan 23, 2010
77
0
18,640
I'm going to wait for the dual-GPU cards to come out on one PCB,
so I think I'll have one by summer?
Can't wait for DX11.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
[citation][nom]qwertymac93[/nom]i know, i just blew your mind right?[/citation]
So THAT'S what that link does. Huh. 'course, I'll admit that last time I'd tried clicking on it, (hoping it'd do exactly what it does now) it refused to believe I was signed in. Well, I guess that makes life much easier. (well, at least, the bit of time out of my life I spend reading/writing comments on Tom's, that is...)

[citation][nom]ripcase[/nom]a question I'd like to see answered is: "What are the chances of unlocking the dormant SM core(s) and thereby the cards' full potential?"[/citation]
Though dragonsqrrl mentions a good point (cardmakers don't want people doing this, as they'd rather sell the more expensive cards), one of the REAL issues is that the whole reason the 480 has 1/16 of its cores disabled is yields; with such a massive GPU, the chance of a flaw landing somewhere on it was much greater. Rumors I've heard put it at a measly 1.7% of all GPUs coming out with all cores functional; hence, by only spec'ing for 15/16 of them, Nvidia increased that percentage.
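The yield logic in that comment can be sketched with a simple independent-defect model. This is purely illustrative: it assumes shader blocks fail independently and treats the rumored 1.7% as the fraction of dies with all 16 blocks working, neither of which is confirmed by NVIDIA.

```python
# Illustrative yield model for a 16-block GPU die (numbers are assumptions,
# based only on the rumored 1.7% fully-functional figure in the comment).
P_ALL_GOOD = 0.017   # rumored fraction of dies with all 16 blocks functional
N_BLOCKS = 16        # shader blocks per die; the GTX 480 ships with 15 enabled

# Per-block probability of being defect-free, assuming independent failures:
# P(all 16 good) = q**16  =>  q = P_ALL_GOOD ** (1/16)
q = P_ALL_GOOD ** (1.0 / N_BLOCKS)

# A die is sellable as a GTX 480 if at least 15 of its 16 blocks work:
# either all 16 are good, or exactly one (any of the 16) is bad.
p_sellable = q**N_BLOCKS + N_BLOCKS * q**(N_BLOCKS - 1) * (1 - q)

print(f"per-block good probability: {q:.3f}")
print(f"sellable dies (>=15/16):    {p_sellable:.3f}")
```

Under these assumptions, tolerating a single bad block lifts the sellable fraction from 1.7% to roughly 9-10%, which is the economic point the comment is making.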

So, while you MIGHT be able to re-enable them through some potentially tricky method (likely a BIOS flash), your results would, more likely than not, be undesirable; at best, you'd likely see significant artifacting from using broken TMUs/CUDA cores, or worse, the card could simply fail to work.

[citation][nom]dragonsqrrl[/nom]Correct me if I'm wrong, but as far as I've seen every high end card since has had unwanted shaders/cores disabled in hardware, making it impossible to modify.[/citation]
Well, there was also the case a few years back with the R480-based Radeon X800GTOs, where many users reported good success rates in using a BIOS flash to convert them into full X850XTs.

Also, I was mildly certain that the difference between the 9800/9800 Pro (at least, non-OEM versions) was clock speeds alone, and the "infamous" case was converting 9500s to 9500 Pros (the 9500 Pro having the full 8 pipes of the 9700 series, but a 128-bit memory interface).

[citation][nom]seth89[/nom]im going to wait for the Dual GPU cards to come out on one PCB so i think by summer time i will have one?cant wait for DX11[/citation]
Don't hold your breath. Given that the 480 appears to have a TDP rivaling the 5970, it's already pushing the limit on what can be crammed onto a single card; I don't honestly think anyone would dare make a card requiring TWO 8-pin PCI-e plugs, since that'd effectively rule out using a pair of the cards. (I don't know of any power supplies that offer more than 2 or 3 such plugs)
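The connector arithmetic behind that 300W ceiling can be laid out explicitly, using the PCI Express power-delivery limits (75W from the x16 slot, 75W per 6-pin connector, 150W per 8-pin connector):

```python
# PCI Express power-delivery limits, per connector type.
SLOT_W = 75        # power available from the x16 slot itself
SIX_PIN_W = 75     # one 6-pin auxiliary connector
EIGHT_PIN_W = 150  # one 8-pin auxiliary connector

# The GTX 480's actual configuration: slot + one 6-pin + one 8-pin.
gtx480_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W

# The hypothetical dual-GPU card the comment describes, with two 8-pin plugs.
dual_8pin_budget = SLOT_W + 2 * EIGHT_PIN_W

print(f"GTX 480 power budget:      {gtx480_budget} W")
print(f"Dual 8-pin card budget:    {dual_8pin_budget} W")
```

So a single-8-pin-plus-6-pin card tops out at 300W, and even a two-8-pin design only buys 375W, which is why a dual-GPU Fermi at the 480's TDP looks so implausible in the comment.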

Making a dual-GPU version will effectively require what happened in previous generations, and what the 7000 series, the 8/9000 series, and the GTX 200 series all had to wait on: a die shrink. G70, G80, and GT200 were too massive for a dual-GPU card. It took the G71, G92, and GT206 revisions, respectively, which were all smaller, cooler, and easier to produce. Since 40nm is pretty much brand-new for GPUs (remember, other fabs aren't as far along as Intel is), it's gonna be a bit of a wait. Six months is the absolute ROSIEST scenario; chances are >50% that it won't be until 2011.

This is what I spoke of in another article's comments about how worrisome it is that the 480 doesn't impress; with little room to push the same GPU further (unless yields somehow DRASTICALLY improve), this is basically nVidia saying "That's all, folks!" for possibly the rest of 2010. Since AMD still has some room left in Cypress, we COULD see a 5890 before the end of the year; in that case, they could probably charge whatever they'd like for it.
 

0z0n3

Distinguished
Feb 18, 2010
2
0
18,510
If you're going to release a card 6 months after your competitor, for some reason I think it should outperform it by something like 100%, not 10%. Why on earth would people cough up $100 for around 10 fps? Whatever happened to Moore's Law? Are we really seeing the end of it?
 

shock17

Distinguished
Nov 29, 2007
1
0
18,510
[citation][nom]eodeo[/nom]gotta love the competition. prices will surely go down and costumer will only benefit. sad that it took nvidia 6 months to get here though..[/citation]
Unfortunately the prices won't go down, because nVidia currently can't deliver strong enough competition. I only hope that the prices of the 58xx ATI cards won't go up, because there are rumors that ATI is raising prices. :(
 

Tomtompiper

Distinguished
Jan 20, 2010
382
0
18,780
[citation][nom]seth89[/nom]im going to wait for the Dual GPU cards to come out on one PCB so i think by summer time i will have one? cant wait for DX11[/citation]


Can't wait for DX11? It's been available for 6 months. If you mean you can't wait for Nvidia DX11, then buy a 470 or a 480. If you insist on waiting for a dual-GPU card from Nvidia, I hope you're not disappointed: to get it in under the 300W limit it will have seriously underclocked 470 cores, and it would be lucky to challenge a 5970.
 

swamprat

Distinguished
Apr 20, 2009
98
0
18,630
[citation][nom]Zijn[/nom]How about a watercooled solution?.. I know its abit early for that.. But at 90+ degrees i can have a constant supply of boiled water for noodles or tea or something![/citation]

Admittedly I've not read the last 16 pages of comments - but I needed to respond at this point.

A graphics card that helps make tea must be worth something, they should build in a timer and a little fridge somewhere else in the case and every half hour or so pop out a nice fresh cup of tea.

Can it run Crysis? Who cares, "can it make me a cup of tea?" that's a real question
 

crazyrigs

Distinguished
Mar 29, 2010
3
0
18,510
67% more transistors than the 5870 with only a small percentage increase in performance...
I bet if ATI had a GPU with the same transistor count, it would beat this piece
of crap hands down.

I think Tom's Hardware, Fudzilla, and some others have been paid by Nvidia to hype
their garbage that runs hot and guzzles power with minimal performance increase... where are the environmentalists... demonstrate or something...
 

pawan_iitr

Distinguished
Mar 29, 2010
1
0
18,510
I would like to see how the GTX 480 fares against 5850 CrossFire. If the price of the 5850 goes back down to its original $260 point, I think that will be the much better option, as 5850 CrossFire performs close to a 5970.
 

Tamz_msc

Distinguished
Also, the performance in Crysis is very important IMO (even though it's 2.5 years old). With the coming of DX11, new features will stress the cards more, but Crysis will remain the ultimate benchmark. Sure, you can add tessellation, newer AA techniques, etc. to stress the cards, but Crysis is still the game to beat, with 1 GB of texture data, 85,000 shaders, and 1 million lines of code. Since the 480/470 fails to keep up with the AMD cards in Crysis, I, at least, will not be considering buying the GTX 480/470. There's still quite some time before we see a card that never falls below 30 fps in Crysis at 1900x1200 with all settings on max and AA enabled.
 
G

Guest

Guest
The nVidia card is much more than just a gaming card, and even as such it definitely has a future. I think they will come out with a tweaked version later on with all 512 SPs functional, and also with lower TDP and heat. It was the same with the old 200 series. I have a 260 card, and when the new series is a little more mature I'll just add another one of those and use my 260 for PhysX.
 
G

Guest

Guest
Minor mistake on this page. In the sentence quoted below, the article says "24 warps of 32 threads" is 1,024 threads. Your calculator is obviously buggy, since 24*32 = 768.

Is the explicitly stated thread count of 1,024 or the implied count of 768 correct?

(Quote from article below)

"At the same time, the number of active threads per multiprocessor has increased compared to the GT200, from 1,024 threads (24 warps of 32 threads) to 1,536 (48 warps of 32 threads)."




 