GT300 specs: GTX 380, 360, GTS 350, 340


jjknoll

Distinguished
This site is generally pretty reliable in regards to new hardware. Looks like Nvidia has some serious firepower coming our way. Hopefully the info is accurate and the prices aren't obscene! Based on just the numbers they give, the GTS 350 and above will theoretically outperform the HD 5870. Of course, these cards aren't available yet and we don't know what real-world performance will actually be..... but still something to look at and drool over!


Main page
http://www.techarp.com/showarticle.aspx?artno=88&pgno=0


Nvidia

http://www.techarp.com/article/Desktop_GPU_Comparison/nvidia_4_big.png



For the AMD guys

http://www.techarp.com/article/Desktop_GPU_Comparison/ati_4_big.png


It's hard to imagine: 280+ GB/s of bandwidth from a single-chip card. It says max board power for the 380 will be 225 W. I wonder if that means they snuck in under the limit using two 6-pin connectors, or whether it will need a 6+8-pin setup for OC headroom. I wish it was out NOW!
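For what it's worth, here's a quick sanity check on those connector numbers (my own arithmetic, not from the article), using the PCI Express power limits of 75 W from the x16 slot, 75 W per 6-pin plug and 150 W per 8-pin plug:

#include <stdio.h>

int main(void)
{
    /* PCI Express power budget: 75 W from the x16 slot, 75 W per 6-pin
       plug, 150 W per 8-pin plug. */
    const int slot = 75, six_pin = 75, eight_pin = 150;

    int two_six_pin    = slot + 2 * six_pin;         /* 75 + 75 + 75  = 225 W */
    int six_plus_eight = slot + six_pin + eight_pin; /* 75 + 75 + 150 = 300 W */

    printf("slot + two 6-pin     : %d W (exactly the quoted 225 W board power)\n", two_six_pin);
    printf("slot + 6-pin + 8-pin : %d W (room left over for overclocking)\n", six_plus_eight);
    return 0;
}

So 225 W is exactly what two 6-pin plugs plus the slot can legally deliver, which is why a 6+8-pin board would be the giveaway that they want OC headroom.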
 
Solution


That's a silly statement to make about the RojakPot speculation.

'Everything else being "mostly" equal' ?

WTF are you talking about? Very little is going to be 'equal', and how do you think 320 shaders = 1600 shaders?

There is no way to directly relate the two, especially since the HD 5870 changes their RBE structure and their cache and buffer arrangement, so whether or not they need or can use more bandwidth is another question. But as the HD 2900 showed, raw bandwidth alone means very little.

And until we know how the shader, texture units and especially the...
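To put some numbers on why raw shader counts can't be lined up across vendors (my own back-of-the-envelope illustration, not part of the quoted answer): peak throughput depends on the shader clock and on how many operations each shader issues per cycle, and both differ per architecture. The figures below are the published HD 5870 and GTX 285 specs.

#include <stdio.h>

/* Peak single-precision GFLOPS = shaders x shader clock (GHz) x ops per shader per clock. */
static double peak_gflops(int shaders, double clock_ghz, int ops_per_clock)
{
    return shaders * clock_ghz * ops_per_clock;
}

int main(void)
{
    /* HD 5870: 1600 stream processors at 850 MHz, 2 ops (multiply-add) per clock. */
    double rv870 = peak_gflops(1600, 0.850, 2);

    /* GTX 285: 240 shaders at a 1476 MHz shader clock, counted as 3 ops
       (MAD + MUL) per clock in NVIDIA's marketing. */
    double gt200 = peak_gflops(240, 1.476, 3);

    printf("HD 5870 peak: %.0f GFLOPS\n", rv870);  /* ~2720 GFLOPS */
    printf("GTX 285 peak: %.0f GFLOPS\n", gt200);  /* ~1063 GFLOPS */
    return 0;
}

Paper numbers like these say nothing about real game performance either, which is exactly the point: the counts only mean something inside one architecture.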

randomizer

Champion
Moderator
I found what I was after. NVIDIA has their own definition of a GPU: "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second."

Since the GeForce 256 was the first consumer "GPU" to have Hardware T&L, it means NVIDIA "invented" it :kaola:
 

jennyh

Splendid
Yes, pretty much. The sad thing is, after all these years have passed, they now actually do believe that they 'invented' the GPU.

These days there really will be a few hundred to a thousand marketers running around Nv HQ who 'truly' believe that they invented the GPU, based on their own propaganda, the actual facts long since forgotten :D
 

jennyh

Splendid
Btw, I owned an Amiga, and I programmed it as well. You could actually program straight to the 'blitter' even back then; in fact, unsurprisingly, it was the most efficient way of making games.
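For anyone who never touched one, this is roughly what 'programming straight to the blitter' looked like in C (a sketch from memory, so treat the exact register values as illustrative rather than gospel): you wait for the blitter to go idle, fill in the custom-chip registers, and the blit starts the moment you write BLTSIZE.

#include <exec/types.h>
#include <hardware/custom.h>

extern volatile struct Custom custom;      /* custom-chip registers at 0xDFF000 */

/* Spin until the blitter is idle: bit 14 of DMACONR (BBUSY) stays set
   while a blit is in progress. */
static void wait_blit(void)
{
    while (custom.dmaconr & (1 << 14))
        ;
}

/* Copy a 16-word-wide, 64-line-high block from src to dst, both in chip RAM.
   Assumes blitter DMA is enabled and nothing else owns the blitter. */
void blit_copy(UWORD *src, UWORD *dst)
{
    wait_blit();

    custom.bltcon0 = 0x09F0;               /* use channels A and D, minterm D = A */
    custom.bltcon1 = 0x0000;               /* ascending mode, no shifts           */
    custom.bltafwm = 0xFFFF;               /* don't mask the first or last word   */
    custom.bltalwm = 0xFFFF;
    custom.bltamod = 0;                    /* rows are contiguous, so no modulo   */
    custom.bltdmod = 0;
    custom.bltapt  = src;
    custom.bltdpt  = dst;
    custom.bltsize = (64 << 6) | 16;       /* height 64, width 16 words; this write starts the blit */

    wait_blit();
}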

We sure haven't made much progress in almost 25 years since....

PS I was young then, just before you get any really wrong ideas about my age!