GF100 (Fermi) previews and discussion

Mainly I was showing the pricing, as there are no real pics yet, but I did want to emphasize the fan, or rather its size.
Also, power is around 240-265 watts.
These are all estimations, and even the pricing isn't exact, but power and price are very close.
 


ATi led, heck even created that market, and nVidia followed. nVidia promoted it better, but don't make it sound like they were the first to go beyond the '3D Stuff', because they weren't. :pfff:

Anyway, I hope ATI and nVidia can keep up their R&D 😛

We'll see, but as pointed out last month, nV's R&D spending has dropped even with the continuing Fermi development. :??:

 


But on the GPGPU side of things, they were first with CUDA (at least, by a few years). ATI always had the lead on video processing (not 3D rendering), which nVidia only caught up on much later with PureVideo. At least, that's the way I remember it xD!

Anyway, yeah; nVidia is actually yelling about GPUs being more than just "renderers", more so than ATi IMO.

Have any history facts up your sleeve, dear TGGA? (not wiki, please xD!)



PR, Marketing and lawyers are expensive, lol.

Cheers! xD!

EDIT: Quote marking.
 


So do I, but I won't ask him about writing the full story if he can find it already written 😛

Cheers! XD!
 
I take time to research my answers; they aren't put there to stir the fanboi pot, they're there to correct the errors. [:grahamlv:3]



ATi was the first on the GPGPU side of things with Brook, Brook+ and CTM. They were cumbersome to use compared to CUDA, but well ahead in practical application as well.
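
Just to make the "cumbersome" comparison concrete, here's a rough sketch I put together myself (not code from either vendor's samples): the classic SAXPY, with the BrookGPU stream-kernel version roughed out in the comment and the CUDA version below it.

```cuda
// BrookGPU expressed SAXPY as a stream kernel, roughly like this:
//
//   kernel void saxpy(float a, float4 x<>, float4 y<>, out float4 result<>) {
//       result = a * x + y;
//   }
//
// with the runtime mapping streams onto textures for you. The CUDA
// equivalent is plain C with an explicit thread index:

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));  // device buffers (left unfilled here)
    cudaMalloc(&y, n * sizeof(float));
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Same math either way; Brook just hid the GPU behind streams, while CUDA exposes the threading model directly.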

ATI always had the lead on video processing (not 3D rendering)

Always is a long time, and considering the age of the two companies, that statement is incorrect.
The other thing to remember is that a lot of GPGPU work is based on the early math required for image processing and how to transform images, not just display 3D. A lot of the early algorithms that were worked on were 2D compression techniques and their heavy math.
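
And to illustrate what that early image math looks like in practice: it's mostly "apply the same formula to every pixel", which is exactly the shape GPUs are good at. A minimal CUDA sketch of my own (hypothetical names, nobody's actual code):

```cuda
// A simple per-pixel brightness/contrast transform. Every output pixel
// depends only on one input pixel, so it maps cleanly onto one thread
// per pixel -- the same embarrassingly parallel shape as the early
// image-processing and compression math.

__global__ void adjust(const float *in, float *out, int w, int h,
                       float gain, float bias) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < w && y < h) {
        int i = y * w + x;             // row-major pixel index
        out[i] = gain * in[i] + bias;  // same math applied to every pixel
    }
}

int main() {
    const int w = 512, h = 512;
    float *in, *out;
    cudaMalloc(&in, w * h * sizeof(float));
    cudaMalloc(&out, w * h * sizeof(float));
    dim3 block(16, 16);
    dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
    adjust<<<grid, block>>>(in, out, w, h, 1.2f, 0.1f);
    cudaDeviceSynchronize();
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```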

nVidia has done better in the workstation market, but even there, the lead is less about hardware than their driver and OpenGL extension work.

Anyway, yeah; nVidia is actually yelling about GPUs being more than just "renderers", more so than ATi IMO.

They are yelling more about it, but they don't have the history exclusively.

Have any history facts up your sleeve, dear TGGA? (not wiki, please xD!)

Sure, will this do, or do you need it Wiki'ed down for you? :kaola:

First commercial implementation of a Brook-based rig, using a Radeon 9800, see the last page;
http://merrimac.stanford.edu/brook/BrookGPU.Merrimac.4-6-04.pdf


The first consumer application would've been Folding @ Home, followed by the various video transcoders as well. Then there were the financial applications, and things like Minitab and RapidMind.

Edit: first mention of Folding @ Home is in the August-updated version of the presentation;
http://graphics.stanford.edu/~hanrahan/talks/gp2/Hanrahan.Brook.GP2.pdf

Plus a good blurb from the Tech Report on F@H in 2006;
http://techreport.com/articles.x/11022

It had been theory for a long time, but getting it to market took a while.
nVidia did a better job of promoting it, but it's not like they invented it; it was around long before either company really started making a go of it.
 
[two attached images of the card]
 


Thank you very much TGGA, very informative as usual 😍

Cheers!

PS: JD, those could be very nice pictures of almost any video card out there from nVidia. xD!

EDIT: The chip mounting looks omfg-big. Bigger than the GT200 😵!
 
Remember, nVidia's first foray was actually the inverse: making it easier for non-graphics people to program graphics, using C language skills, in Cg (C for Graphics). Kinda the opposite direction from GPGPU, and the inverse of CUDA.
 


Yeah, but that picture is of a sandwich card. I'll take the extra inch if it means being able to fiddle!

Edit: Hmm, maybe it's not a sandwich. The first one Jay posted certainly looks like one. I assumed with the hole in the PCB the second was too, but I'm not so sure now, as there is clearly a spot for the GPU and RAM... not sure what else they would need to put on another PCB.
 

Hi Notty22

How'd you like that game, eh?

The great Buddha says we must live in the present moment. For the present ATI rules. NVidioids will have to wait and wait and...
 
Anyone interested in some real pics of the 480, taken at the German CeBIT?

Here are three of them on the Dutch hardware site Tweakers: http://tweakers.net/nieuws/65933/cebit-nvidia-gtx-480-op-de-gevoelige-plaat-gevangen.html

Showing the gory guts on the inside for a change... :pt1cable:

The images are of the A2 revision of the GTX 480.
The GPU itself is 500 mm².
You can also see the heatpipes for NVIDIA’s reference stock cooler.
The card requires an 8-pin plus a 6-pin PCIe connector and comes with 1.5GB of GDDR5 (12 memory modules of 128MB each).

edit: so as not to give away the manufacturer, certain parts have been masked with tape
edit2: typo
 
Still an A2 chip - I thought they'd been making A3 for a while now. Maybe the A3 didn't help matters much on clocks if they're using A2 for demo boards.
 

I saw some pics earlier of a rig being used at CeBIT that has an A3 ES (engineering sample) board in it, and the sticker on the card would suggest that it's still an underclocked version.
 