GTX480 / GTX470 Reviews and Discussion

[XBitLabs charts: die temperatures, power draw diagram, GTX 480 vs HD 5870 comparison]


I love XBitLabs.
 
Also found the Metro 2033 comment interesting:
http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_11.html

"Moreover, I noticed the image to get somewhat blurry on the GeForce when running the test demo and checked this out with screenshots .... It is easy to see that the Radeon produces a better-looking picture. All the textures are sharp, without fuzziness or anything." 😗

Hmmmm... sounds familiar... [:grahamlv:3]

"Perhaps this reduction of quality is the trick the GeForce GTX 480 resorts to in order to deliver a higher speed in Metro 2033?" :evil: :hot: :evil:

Of course the official answer is no, it's simply a rendering issue which is being looked into, but this would've been handy for Randomizer's cheating thread. [:jaydeejohn:5]
 
The images from Metro 2033 DX11 look like DOF is turned off on nVidia cards, or like the image is rendered at a lower resolution. The difference is not negligible; it's extremely eye-poking.

I am keen to see an official driver and some new benches after the cards are released.
 


Eh, the difference is too huge imo. That wouldn't even be remotely subtle. I don't think 4A Games is BSing too much on this one. We'll see, I guess, once nVidia releases its "official" drivers... and its cards *cough cough*

BTW TGGA, how the hell is it 96 where you are? You would be dead if that was in Celsius.... unless you mean Fahrenheit.... in which case that isn't even close to boiling water 😛

But you should definitely buy a fan, a very large fan, damn that's hot.... It's only 20 Celsius here 😛
 


I believe what this represents is true, but it's the scaling on the color/temp that is a little misleading. It's almost exactly what Nvidia does in their bar charts to exaggerate results.
[Nvidia GF100 bar chart slide]

0-1 is, say, 1 inch, then 1-1.8 is represented by 3 inches.

In that spectrograph they chose a temp, I would say 86.7 (the temp of the 5870 core), as the mark where temp is represented by red. Then you see the core of the GTX 480 marked as 97.6, also red. But most of the card is also red, so it's somewhere between 86 and 97. The rest of the 5870 appears 'cool' but may be just one tick below 86.7, and the GTX body might be just one tick over 86.7.
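To put the scaling argument in concrete terms, here is a minimal Python sketch of both effects (my own illustration, not from the review; the 1.8x ratio and the 86.7 C threshold are just the figures mentioned above, everything else is made up):

```python
# Sketch of how a truncated bar-chart axis exaggerates a difference,
# and how a hard "red" threshold makes nearby temperatures look worlds apart.

def bar_length(value, axis_min, axis_max, width_in=4.0):
    """Visual length of a bar when the axis starts at axis_min instead of 0."""
    return (value - axis_min) / (axis_max - axis_min) * width_in

a, b = 1.0, 1.8  # card B is actually 1.8x card A

for axis_start in (0.0, 0.9):
    la, lb = bar_length(a, axis_start, 1.8), bar_length(b, axis_start, 1.8)
    print(f"axis starts at {axis_start}: A = {la:.1f} in, B = {lb:.1f} in "
          f"-> B *looks* {lb / la:.1f}x longer")

# Thermal-image analogy: pick a red threshold (the 86.7 C of the 5870 core)
# and everything at or above it gets painted the same red.
RED_THRESHOLD = 86.7
for temp in (86.0, 87.0, 97.6):
    print(f"{temp:5.1f} C -> {'red' if temp >= RED_THRESHOLD else 'cool'}")
```

With the axis starting at 0.9 the same 1.8x gap looks like a 9x gap, which is basically the bar-chart trick being described; the threshold loop shows why 87 C and 97.6 C can end up the same shade of red while 86 C still reads as 'cool'.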

Edit: It also looks like the 5970 dissipates heat the best, at least the way it was tested. The centers of the GPU dies appear cooler than the surrounding area, which I guess is heat being transferred to the heatsink, so efficiently (?) that the cores stay under the red band.
 


Yeah, I'm 100% certain it's a driver bug, or something where an optimization that works 99.44% of the time isn't working now, or whatever. It just made me laugh after the whole banding issue came up the other day as 'cheating'. You will note that Xbit disables Catalyst AI and renders everything at HQ, like I mentioned in the other thread.

BTW TGGA, how the hell is it 96 where you are? You would be dead if that was in Celsius.... unless you mean Fahrenheit.... in which case that isn't even close to boiling water 😛

Nah man it's a dry heat, up here temperature changes every 15 mins, so it's all good boiling one moment snowing the next. :sol:

Nice thing about 96c boiling point is I can make morning coffee and oatmeal quicker. :miam:

 


I understand what you're saying, but I think it's more a function of what actually shows up on the thermal cam. It's a gradient, not simply blue for one range and red for everything in the next; there are levels in between. The sensitivity is high enough, but the range is so wide you have to have it go from blue at 40 to red at 80, or else it wouldn't show much of anything. Unfortunately it isn't set up for 40-110/120, which would be preferable but might not show as much. You just have extremes in close proximity because of the massive concentration of heat from the GPUs in one area.

The thing is, it looks like the GTX's board acts as more of a conductor as well, since it shares similar temps with the core: you can see the board is 70 or less on one card and above 80 on the other, with the centre points likely being laser temp readings rather than values taken off the gradient alone.

The GTX 480 is hot, but it's not alone, and other cards can get even hotter if you look at their previous tests: they've seen higher numbers on cards tested in the past, even that poor little passive HD3450 cooking away, and the HD4870 in Xfire being stuffed by its bunkmate (which speaks to my single-card concerns earlier):

http://www.geeks3d.com/20090121/graphics-cards-and-thermographic-imaging/

It's not like the AMD cards produce no heat, as one might guess from a quick look at the front HSF pics on that first page without taking the time to understand them; but it's also not like the numbers are couched the way both AMD & nVidia couch their PR slides, as you're suggesting here.

I don't see how you could do it any other way without getting somewhat more sophisticated (and expensive) sensors and changing your range.
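For what it's worth, here is a rough Python sketch of that trade-off (my own illustration, not anything from the geeks3d article; the 40-80 and 40-120 ranges are just the figures from the post above):

```python
# Map temperatures onto a 0.0 (full blue) .. 1.0 (full red) gradient for a
# given camera range, clipping anything outside the range to the endpoints.

def to_gradient(temp_c, range_min, range_max):
    t = (temp_c - range_min) / (range_max - range_min)
    return max(0.0, min(1.0, t))

samples = [45, 70, 85, 97.6]  # illustrative board, VRM, hot spot, core readings

for lo, hi in ((40, 80), (40, 120)):
    mapped = {t: round(to_gradient(t, lo, hi), 2) for t in samples}
    print(f"range {lo}-{hi} C: {mapped}")
```

On the 40-80 range everything above 80 clips to the same full red, so an 85 C hot spot and a 97.6 C core look identical; on a 40-120 range they separate, but the cooler parts of the board compress into the lower end of the gradient and the image "might not show as much", which is exactly the trade-off described above.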
 
As nVidia keeps revealing more and more about the GTX 400 series, we keep coming back to the heat/price/performance issue, which is why for most of us the GTX 470 seems so appealing. But the people defending this card's performance are forgetting that the overclocking headroom in the much cheaper 5850 is huge compared to that of the 470; an overclocked 5850 closes much of the gap compared to a stock 5850. [I have decided to comment and post links on this issue because of the people who think this will all change once nVidia releases mature drivers.]

The only way is watercooling the 470, but:
Reports from forums peg pricing for the GTX 470 version at $500 and the GTX 480 at $650. No word on release date yet.

http://www.tomshardware.com/news/gf100-fermi-gtx-480-gtx-470-FTW,10012.html

And honestly, I don't quite think they can compete at this price.
 
Not sure if this has been posted yet or not:
http://www.xbitlabs.com/articles/video/display/geforce-gtx-480_11.html#sect0

It is easy to see that the Radeon produces a better-looking picture. All the textures are sharp, without fuzziness or anything. To remind you, I selected the same graphics quality settings for both graphics cards: DirectX 11, Very High, 16x AF, AAA, Advanced DOF and Tessellation. Besides, I selected the High Quality texture filtering mode in the Catalyst and GeForce/ION driver (the Quality mode is selected by default). Perhaps this reduction of quality is the trick the GeForce GTX 480 resorts to in order to deliver a higher speed in Metro 2033? The game developer answered our question promptly. Here is what Oles Shishkovtsov, 4A Games Chief Technical Officer, said:

No, the observed difference in quality is not due to the performance of the graphics cards. Indeed, graphics cards from Nvidia and ATI render the scene differently in some antialiasing modes. We are trying to find a way to correct this issue. Again, this has nothing to do with speed.

Ouch...
 
The only way we'll know is when the patch comes out to fix it: is Nvidia running just as fast as it used to? Until we know that for sure, I'd lean towards the side of "it's nothing / a bug."
 
So under one setting, Very High with AAA, there is trouble and you don't give nVidia the benefit of the doubt? It's just as much of a bug as AMD's Catalyst AI interfering with Crysis doing the listed AA. Anyway, it's doubtful it would run as fast if it wasn't rendering correctly, unless the bug tasked the GPU but didn't actually do anything.
 


HardOCP has done a Metro review using the latest drivers, and they have concluded that the AA used by the GeForce is much better than what the 5870 can use. Here's the link:
http://www.hardocp.com/article/2010/04/05/metro_2033_directx_11_gameplay_performance_iq

They did the follow-up:
http://www.hardocp.com/article/2010/04/06/metro_2033_image_quality_followup
where they seem to say that it is an application problem, and only in some situations. It seems not to be a driver problem.
 
From my understanding, quickly reading that review, the 480 can get playable frame rates with 4x MSAA while the 5870 could not, so they bumped it down to AAA.

So 4x MSAA > AAA in terms of performance cost? It's not too clear whether the quality is equal or not.

So nVidia's bug with AAA not working properly would not be that great of a performance increase, would it?
 


Which has nothing to do with the segment you are quoting, because if you increase the resolution and thus take 4xAA out of the equation, then AAA is in play. Both can use 4xAA, but only one of them can actually use AAA without causing corruption. It's not an architectural difference in what they can and can't use; it's a subjective gameplay difference in which mode they chose to use on which cards for their preferred game experience. The difference being that if you run the HD5970 or two HD5850s in Xfire, you can still game at whatever equivalent you want, but adding another GTX480 doesn't fix the AAA issue. :pouah:
It also doesn't fix the 8xAA issue in DiRT either, but those are simply bugs, as we've already discussed, most likely to be fixed with patches and driver updates, etc.

They did the follow-up

Welcome to the previous page where SS already posted that. [:bohleyk:5]
 


They do a comparison on one of the previous pages, shader-based AAA vs 4x MSAA, and the MSAA is superior judging from the screen capture and the comments.

So nVidia's bug with AAA not working properly would not be that great of a performance increase, would it?

When testing 4xAA vs 4xAA it shouldn't matter; with AAA, however, we don't know whether it's running the shader-based AA (and thus taking the additional overhead) and then corrupting, or whether the corruption is the result of something being dropped or bypassed along the way, which would make the load a little lighter for AAA vs no AAA. For [H]'s test options it shouldn't affect the outcome, since the 4xAA was used by the card with the issues; in Xbit's review, however, it might matter, since all cards were running AAA rather than MSAA. You wouldn't be able to tell unless you knew what's going on and also had the option to run AAA and no AAA (which isn't possible at VHQ, since you only have the two options).

It would seem that had they tested at 4xAA on all cards instead of AAA, it would have given us a consistent and accurate view at that level, although of course it would not have exposed this fun little glitch. 😉
 
http://kingpincooling.com/forum/showthread.php?t=663

490MHz above stock on the GPU core, 980MHz higher on the shaders..... on only 1.35 volts no less. I don't even understand how that is possible... this is blowing my mind. That guy must be using some kind of theoretical 1K cooling system..... and he thinks he still has room for more!!!?!?!?!?
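One thing that may explain the exactly-doubled shader number: as far as I know, the GF100 shader ("hot") clock is locked at twice the core clock, so any core offset automatically shows up doubled on the shaders. A quick sketch, assuming reference GTX 480 clocks of roughly 700 MHz core / 1401 MHz shader (my assumption, not a figure from the kingpin post):

```python
# Quick arithmetic sketch; stock clocks and the 2:1 hot-clock ratio are
# assumptions about the reference GTX 480, not figures from the kingpin post.

stock_core, stock_shader = 700, 1401
core_oc_delta = 490

oc_core = stock_core + core_oc_delta
oc_shader = 2 * oc_core  # shader (hot) clock tracks the core clock at 2:1

print(f"core:   {stock_core} -> {oc_core} MHz (+{core_oc_delta})")
print(f"shader: {stock_shader} -> {oc_shader} MHz (+{oc_shader - stock_shader})")
```

So the ~+980 MHz on the shaders isn't a separate feat, it's the same +490 MHz core bump seen through the 2:1 ratio; the part that needs the exotic cooling is getting the core up near 1.2 GHz in the first place.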

And it beats the HD5970 by over 3000 points in Vantage.... holy crap... so much for the 480 not being able to OC well..... looks like nVidia owns the Vantage crown again 😵