GTX480 / GTX470 Reviews and Discussion

Whether we like it or not, these cards haven't even been released yet. We don't know how they will play out once they're hooked up to thousands of computers and consumers get to do their own benchmarks.

We don't know how tessellation, PhysX, efficiency, and everything else that has been debated will change and affect things. It's that simple. These arguments are, to be honest, silly.

We may think we know everything about these cards, but in reality we don't really know all that much. We complained before about only being able to read Nvidia's benchmarks, and it's kind of the same thing now. The average consumer hasn't had a chance to benchmark the 480 or 470. So give it a rest, at least until April --th.

(I forgot the day they're being released 😀. Was it the 6th, then they moved it to the 12th?)
 
Yes I'm getting really frustrated with the fanboys lately. The GTX 470 is a good product, but it seems everyone wants to jump over to the ATI fanboy team and flame all that is nVidia, except for f&^king PhysX which some fools still believe has a purpose...

It's enough to make me actually angry, on a damn forum...
 


Granted, they are that as well, but like I said for me it's

a) gpu crown
b) drivers
c) price
d) heat
e) efficiency

which is why a lot of these arguments are flawed: each user has their own needs. I am a proud enthusiast, not a shopper. I used to be, but now that I have a decent job compared to my old part-time one, I have a bit more cashflow at my disposal. So I put it into a tasteful hobby.

:) To each his own, which is why it isn't fair to pass judgement and speak for what the public needs. There is no such thing as what the public needs; there's only what the user needs.

My main rig needs heavy power, my secondary needs decent power plus a media centre with the XFX 5870, and my dad's needs TV output only, which is a 3450 with HDMI out. Etc.
 


GPU Crown is 5970

drivers are anybody's guess. You say Nvidia will get better drivers; I think Nvidia has had time to work on the current ones. You can't judge and predict the future

price is expensive

heat is plentiful. overclock room? nobody knows

efficiency is nil
 



Read my post: I said GPU crown, not the video card crown. The 5970 is a dual-GPU card.

Look, six months to produce the cards doesn't mean the drivers have been tweaked. ATI had how many years of 5000-series development? Would it be the same case? The delay was in creating the card; they didn't have the card sitting around for six months while they kept delaying it, they physically hadn't built it yet. Remember how many "fake" cards they displayed? Hanging wires, etc.


P.S

These are reference cards. When distributors get the cards, we might see some better temp readings. I dunno about power draw, but depending on the cooler type they choose and the Arctic Silver paste that company prefers, we'll see what's new :)
 
Well, it all depends on people's choices. Some don't care about consumption or heat or OC; they just want the fastest single GPU, since dual-GPU solely depends on driver updates to get more performance. I remember getting awful FPS in DiRT 2 with a 5970 at release; now it's more than four times faster with drivers fixing CrossFire scaling. Some people like efficiency, more OC headroom on air, less heat. My opinion is the 480 should be priced at $449; then it would be a good buy. But I just love overclocking and efficient cards, and everyone has a different opinion.

If I had extra cash right now, sure I would buy one, test it, and sell it if I don't like it or need more cash lmao

Starting a completely new architecture from scratch can really take some time.
 



I know how you feel, man, but take it easy; everyone is free to have their own opinion. If you like the product, that's great: get one and have fun with your games, or enjoy the shorter render times with CUDA. That's why you're going to buy one. Not everyone who has something bad to say about the product is an ATI fanboy. I do love ATI and AMD, but I use whatever I can afford, ATI or Nvidia, AMD or Intel.

I think what you have to remember is that Nvidia's marketing machine has kept many people from buying ATI with the promise of the second coming of Christ in the form of Fermi. They are six months late to the game, and they delivered an incomplete product with the GTX 480; the thing also eats power like a monster, heats like an oven, and has the noise of a vacuum cleaner. I think that's a step back for an industry that is trying to bring efficiency to the table, but that's my opinion.

The GTX470 is a good product alright: it's just 7% slower than an HD 5870, it's cheaper, and in a lot of cases it gives you better minimum frame rates; it's also pretty good in DX11 games and has good OCing room. But the GTX470 is not their flagship product, the GTX480 is, and in my opinion it's not enough. We will have to wait and see how the 512-CUDA-core product turns out after they leave behind all the problems they have with the not-so-new-anymore 40 nm process.

All in all, like I said before, Nvidia is or will be shortly in the market with a competent product and that’s good for all of us who want cheaper prices.
 



If I remember it correctly, the problem with Fermi wasn't that it wasn't built; it was that it was hard to produce in high numbers. The fact that the GTX 480 had to shut down some cores doesn't mean that it wasn't built. The architecture would've remained the same.
 
Here's the 480's DirectCompute and OpenCL performance: impressive! 😱

http://anandtech.com/video/showdoc.aspx?i=3783&p=6

Interesting that Anand links to the source of these OpenCL tests, but doesn't even bother to read the caveats about them (the biggest being that it's adapted from a CUDA model; sure, he says it's not a port, but the new code is equally bad on non-CUDA hardware due to the thinking behind the code) [:thegreatgrapeape:5]


"Unfortunately, for some reason, AMD's OpenCL compiler crashed when compiling my kernel for GPU (it's ok for CPU version though). So right now it doesn't work on AMD's GPU at all, but with AMD Stream SDK 2.0 it's possible to run on CPU devices.
...
About the crash problem: the original kernel I've developed (i.e. the kernel used by clcpu path) doesn't crash the compiler. However, it's extremely slow (> 20 s for 16 queen on my 4850, and > 7s on 9800GT) because it uses a four arrays to simulate a stack for recursion. These arrays are good for CPU version because they reduce the amount of computation, but for GPU they are too hard on registers. So I developed another kernel which uses only one array, but it requires more computation to generate/restore data for each steps. However, this new kernel crashes the compiler (it works when selecting CPU devices, but it's slower)."



Why would you even bother using it as a testbed? It's as biased as running an F@H test. :pfff:

I expect it to be great for GPGPU, but these tests are obviously flawed if the GTX285 is outperforming the HD5870 and staying even that close to the GTX480, which claimed far more than just a doubling of G200 performance.
 


This is different from the forum reaction to the HD5870 launch how?

= Aw, it doesn't beat the GTX295 in everything, it's more expensive than two HD 4870s, it's not globally 60% better than the GTX285...

Same thing with the HD5770 over its bandwidth, pricing versus EOL cards, etc.

Pretty much any new launch gets that attention.

Right now there isn't enough detailed testing, and there aren't enough 'un-aided' reviews out there, to get a full picture yet (like true bandwidth utility, and the benefit of the shader and cache redesign), and it's too early to even discuss pricing, availability and competitive reactions. For now it's pretty much what most people expected recently (not the overblown October expectations), just like the HD5870. All that DX11 hype brought on by the DiRT2 numbers evaporated pretty quickly when the reality came to light as to why the numbers were great, which had nothing to do with DX11 design other than that it wasn't needed for those numbers.

And really, it's still a launch where simply having launched matters more than being spectacular. It's good enough for that, and it shows that if it's possible to improve the yields and the temps (and thus increase clocks), there's good untapped performance waiting for a refresh too.
 


Wait for it ... 😀 . There are so many things to bash about that sooner or later its turn will come. :na:

So I guess Charlie was right....
 
I brought up ST since it was a particular focal point for most of the ATI fanboys back when it was announced, like "good luck with that ST engine, it's gonna suck."

Seeing the end results with first-gen DX11 games, can we say that it at least sufficed? Or is on par with ATI's solution? I want to see some fanboy-vs-fanboy action regarding ST, so to speak.

I'm not here for an argument, because I'm sober and have already decided that these GTX 400s aren't worthwhile. And considering the performance of the flagships, the budget-crippled derivatives would be daunting as well.
 


There is a world outside the USA, and for the rest of us the GTX470 isn't a good buy, hence why it gets talked down. In the UK it's going to retail at a price that makes it more expensive than the HD5870.
 


Then that doesn't pertain to you does it?
 
There is a specific thing mentioned in the Legit Reviews article... don't know whether this has been brought up.
http://www.legitreviews.com/article/1258/15/

Dual LCD monitors at 90°C IDLE.... 😱
 


Yep. C/P from the end:


<<In my personal system (Corsair 800D Chassis) with two monitors the GeForce GTX 480 graphics card would idle at 90C and if it was a sunny day and my office was warmer it would idle at 92. I fired up the new DX11 game title Aliens Versus Predator and with GPU-Z in the background I saw the temperature reach 99C while gaming for around 30 minutes. At this temperature the fan is spinning at 70dB and it honestly was not an enjoyable gaming experience. I asked NVIDIA if the card was built to run at temperatures this high and they claim that the GeForce GTX 400 series was built to operate at high temperatures and reminded me that 105C was the peak temperature for the GeForce GTX 480 video card. While benchmarking the GeForce GTX 480 graphics card on the open test bench I found the outside of the heatsink to reach 50C on the fan side and 59C on the exhaust side, so this card without a doubt will put out some heat. >>


90C Idle
99C Gaming and 70dB Fan noise? 🙁 Maybe a Tri-Slot cooler with 3 Fans?

Noise is tolerable for some, but NOT at 70dB; that's insane. :ouch:
 


However, it's not software-only, as was initially the belief from the early GF100 slides in October; once the polymorph engine was detailed in January, people no longer had that concern.

It's still not the same as having the feature at a per-unit level, and while it looks great when hammered with just one type of function, as is the case with Unigine, it's still very early to be talking about how well or poorly it compares with the competition.

So far there's very little to go on other than the PR material given to the select launch people.

I'll wait for the custom tests to come out later from places like B3D. It could very well be as inefficient as the R600's shader-only AA, but it's working due to simple brute force.

This is not October's Fermi, when tessellation and textures were a big question mark and nV was even talking about the possibility of NVIO. A lot has changed since then.
 


No, the tessellation is done in the polymorph engine they use for all of the geometry on the cards. There is an intimate relationship between the polymorph engine and the shaders, though; I have not yet read a great description of exactly how this works.

Nvidia has dedicated tessellation units, it simply does it very differently from ATI. Mind you, shaders still have to call the functions...
 
So if the polymorph engine does shaders and tessellation, wouldn't that start to suffer? Maybe in a game like L4D3 (when/if it comes out) that uses lots of shader work with perhaps good amounts of tessellation.
 


No, the polymorph engine handles the geometry processing; the shaders are independent. It seems likely, though, that the polymorph engine is much faster at tessellation than the rest of the GPU is at its other tasks, so the benefit is not pronounced due to bottlenecking elsewhere.
 
Yep. C/P from the end:

90C Idle
99C Gaming and 70dB Fan noise? 🙁 Maybe a Tri-Slot cooler with 3 Fans?

Noise is tolerable for some, but NOT at 70dB; that's insane. :ouch:

99 is pretty freakin' hot, I agree with that.

70dB noise isn't even slightly close to insane; that's quieter than the dial tone of your telephone and approximately the same volume as a "normal" conversation from about 3-5' away. In the grand scheme of things, I doubt it would make your computer much louder than it already is if you have air cooling of any kind in it. My water-cooled PC is still at like 48dB or something; 70dB isn't much higher than that.

http://www.gcaudio.com/resources/howtos/loudness.html

So I don't quite understand why everyone screams bloody murder about 70dB... if it were 100dB and you started suffering hearing loss, then I could see people being angry 😛 My surround system could overpower 70dB at like 5% volume 😛
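For what it's worth, dB is a logarithmic scale, so comparing 70dB to 48dB by simple subtraction understates the gap. A quick illustrative Python sketch, using the standard 10·log10 power-ratio definition and the common (rough, subjective) "+10 dB sounds about twice as loud" rule of thumb:

```python
def db_to_power_ratio(db_a, db_b):
    # Decibels are logarithmic: every 10 dB difference is a 10x power ratio.
    return 10 ** ((db_a - db_b) / 10)

def perceived_loudness_ratio(db_a, db_b):
    # Rule of thumb: +10 dB is perceived as roughly twice as loud.
    return 2 ** ((db_a - db_b) / 10)

# Comparing the 70 dB fan against a 48 dB water-cooled PC:
print(db_to_power_ratio(70, 48))        # ~158x the acoustic power
print(perceived_loudness_ratio(70, 48)) # perceived as roughly 4-5x as loud
```

Whether a 4-5x perceived jump counts as "much louder" is, of course, subjective, which is probably why this debate never ends.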