GTX480 / GTX470 Reviews and Discussion



Wish that were viable, but if this Moore's law thing keeps going the way it has since the mid 90's with graphics vs. games (and computers in general), we're always going to have to CF or SLI with multi-monitor setups if we want to play the next Metro or Crysis. Maybe there's a chance that multi-monitor becomes so mainstream that nV and ATI push the performance to meet the requirements it brings. :ouch:

I'd say graphics power has held up very well against games. Since the mid 90's we've increased our resolution several-fold. The real use of Eyefinity is that today's graphics cards do so well that we're more or less arbitrarily giving them more to do (higher resolution = more pixels to render). The cause of Eyefinity isn't that games are outpacing graphics; it's that graphics are outpacing games plus screen manufacturing (largely LCDs, currently). I mean, how many threads have there been along the lines of "I'd upgrade, but there's no reason I need more power"? For some people, GPUs are hitting the sort of wall that CPUs are at: they don't need any more power.
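To put "several-fold" in actual numbers, here's a rough Python sketch (640x480 and 1920x1200 are just resolutions I'm assuming as typical for the mid 90's and today):

# Assumed representative resolutions: mid 90's vs. today
old_pixels = 640 * 480     # 307,200 pixels per frame
new_pixels = 1920 * 1200   # 2,304,000 pixels per frame
print(new_pixels / old_pixels)  # 7.5x the pixels, before any eye-candy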

All that granted, I still think Eyefinity is pretty cool, even though there's no chance I'll be able to use it in the next couple of years (small apartment/desk/student).
 
I can see a benefit to 3-6 monitor setups in an IT environment. I'd love not to have to switch between the 4 or so RDP connections and the email client so often.
 


What?! I don't get what you're saying...

It's been a long, long time now that a single-card solution has been the better choice when we talk about P/P.

Now ATI has done the same with Eyefinity, making one card able to run more than "2" monitors (2 is the keyword), because cards have been able to run dual monitors for a long time, and to span apps across them via solutions like Matrox's.

What ATI has done is release hardware capable of running more than 2 monitors without any software workarounds, and apply this to gaming: no benefit with 2 monitors, but add 1 more and you get a better experience.

So we're already almost at the point where 1 graphics card can run a triple setup at playable fps with enough eye-candy. On P/P, 1 card is the winner and ATI knows it; it's just not viable to have to buy more than 1 GPU to try 3 monitors, especially if one graphics card costs you $400. Give it another year, I say less, and we'll see single-card configs push more than 3 monitors. Hell, if one 5870 can push 3x30" and get 30fps, what it can do with some 23" panels at 1920x1200 isn't even a question. There are plenty of reviews out there about this.
Here: http://www.hardocp.com/article/2010/03/14/first_time_gaming_experience_ati_eyefinity/
Go see for yourself, 'cause I know lots of people don't believe in heaven until they see it!
 
As an RTS person, multi-monitor = not at all nice for me. I want to see everything without bezels, and it brings no benefit of being "immersed". I could see it in 1st- or 3rd-person games, but that's it. Now, 3 monitors in a work environment is very nice.

Frankly, I'm not a huge fan of 3D or multi-monitor gaming; I'd rather they had spent more time on better drivers and architecture!

I'm a believer that if you're going to do something, you do it right, 100%. It's why I don't care for cell phones that are really just tiny crippled computers, let alone what it means to be outside when your form of socializing with people is talking through a cell phone, or worse, texting. I especially hate people who talk and text with other people when they have friends right next to them; what is up with that? But I digress.

It's not unreasonable for Nvidia to assume that if you're springing for 3 monitors, they're going to be high resolution and you'll want an SLI setup.

Nvidia already said (or I'm misquoting an article I read at some random time) that they had the tech in software for quite some time; they just didn't think anything of it.

Nvidia didn't go into their Fermi architecture thinking "hey, let's make it play with 3 monitors," lol. They were probably too concerned with shitty yields to think much about the PCB it was going on.

Now, with one output needing to be DisplayPort, the confusion about adapters, the 5670 being the bottom of the barrel in terms of being able to do this, and (let's assume) a $100 premium on an adapter on top of an already $100 card, you're throwing down a minimum of 200 bucks for a 3-monitor solution, and I'm pretty sure you can't game on a 5670 in Eyefinity.

Now, a 5850 I'll buy it for, so 300 + 100 = 400 bucks. And if you paid any attention to whatever article I read about Nvidia's setup, the older GT200 chips will be added to the 3-monitor gaming support, meaning if you already have two 260's or 285's etc., you can game on 3 monitors whenever the driver gets released. Meaning it will cost you nothing!
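A quick tally of those ballpark figures, as a Python sketch (all prices are the rough numbers thrown around in this thread, not official MSRPs):

# Assumed street prices from this thread, not official MSRPs
adapter = 100  # assumed premium for an active DisplayPort adapter
for card, price in [("5670", 100), ("5850", 300), ("5970", 700)]:
    print(f"{card}: ${price} card + ${adapter} adapter = ${price + adapter}")
# vs. Nvidia's route: $0 extra if you already own two GT200 cards in SLI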

Wait, did I have a point? I forget, lol.

something something something

Just because you can run a game on 3 monitors via 1 card doesn't make it a good idea, especially when you pay a $100 premium on an adapter for your monitor on top of a $300 card, or a $700 card if you're shooting for that 5970. I mean, if anything, I'd rather buy two 5850's and not go the DisplayPort route, although I'm not sure if you can do that.

Nvidia's way is: if you want to game on 3 monitors, you need 2 cards. Not unreasonable if you want high fps in a high-resolution, high-detail game, especially when you can afford 3 monitors. And you forget that Nvidia's driver update that enables this will also work on the GT200 chips along with the GF100 chips, meaning the many people with 260's in SLI could just go to this if they bought 1 or 2 more monitors.

Workspace-wise, you could always just buy a Matrox solution for about the same price, if not cheaper.

Again, I don't think I'm hitting a point ranting like this, lol.

Oh yeah, snake: because Nvidia's solution will support the GT200, those with 260's in SLI could do this and it wouldn't cost them a new card. Yeah, maybe that's the point, lol.

along with spending money on it

Also, lol, after some digging after seeing your video:

http://hardforum.com/showthread.php?s=eb8c00a7c3dea8944cbb027619d1dc1c&t=1502988&page=2

Read the thread that goes with it: he did not use a 5870, he used a 5970. He also says it's a 5970 in the video. That's a $700 card, lol. Hell, you can get two 5850's for that, and if you want scaling you could get two 470's for that price, if you could buy them now, lol.
He barely got enough fps in the handful of games he tried. I don't think a 5870 could push 3x30" at 2560x1600 very well at all.

Rage*, maybe? I dunno, I don't feel angry.

Haha, time to put this mess up there as a post.
 



We weren't arguing nVidia vs. ATI solutions, simply discussing getting more than 2 monitors running on 1 card, because like I said, MOST people see that a 1-card solution is better as far as P/P goes. It also gives you the upgradeability to slap another one in down the road, if needed.

DisplayPort adapter: yes, a problem, and one ATI is working on. They're trying to release something affordable, so that's being taken care of.

Not sure what you're talking about with a 5970 getting low frame rates on 3 monitors. Dude, go back and read the review I posted about a 5870 driving 3x30" monitors.

You're clearly an nVidia fan, so you take opinions and pick a side. Well, I'm not a fan of anything but PCs, and I buy what I can afford, so I'm sorry for preferring ATI over nVidia at this point. By the way, I didn't post a link about a 5970 and 3 monitors, because you won't read it; you're too much of a fan, the kind that even upon seeing would rather not believe. So you choose to believe what you want, and that's fine by me, doesn't bother me. But arguing against the facts of something I said, with articles to prove it, makes no sense.

read the thread that goes with it he did not use a 5870 he used a 5970 he also says it's a 5970 in the video that's a 700 dollar card lol
Mind quoting what he said? I went to read it and didn't see it, and I was too lazy to search the whole thing, so just quote what he said. :)
 
As I listed in the first post:

http://hardforum.com/showthread.php?s=eb8c00a7c3dea8944cbb027619d1dc1c&t=1502988&page=2

It's the 3rd post down, post number 23.

Originally Posted by marzz
Hay Kyle, was wondering what monitors you were using in your setup?
Once again, great vid and thank you.

Mar


Dell U2410

Originally Posted by ChiZZad
What are your full rig spec Kyle?


Core i7 920 at 3.6GHz
12GB Corsair DDR3 at 1440MHz
ASUS P6T6 Workstation Revolution
ATI 5970

The people asking the questions are in purple, as you can't put quotes inside quotes on this forum,
and Kyle's responses are in red.

http://www.dell.com/us/en/dfo/peripherals/monitor-dell-u2410/pd.aspx?refid=monitor-dell-u2410&s=dfo
The monitor is 1920x1200.

If you're too lazy even to check: he says it's a 5970 in the video at about ~2:08.
 
Also, I checked your PC Perspective link.

1. They have one hell of a system:
ASUS P6T6 WS Revolution X58 + nForce 200
Intel Core i7-965 @ 3.33 GHz
3 x 2GB Corsair DDR3-1333 MHz
Intel X25-M G2 160GB SSD

2. I found this little gem:
One question that I am sure will come up is "why didn't you use multiple GPUs in CrossFire mode?" A great question with a sad answer: it doesn't work yet. As of this writing, and probably through all of 2009, Eyefinity displays will NOT support CrossFire acceleration.
I actually think they did fix that, allowing CrossFire now. I do find it funny that Nvidia's solution is always in SLI, while ATI released their new 3-monitor solution without CrossFire support, even though high resolutions are exactly what CrossFire and SLI were made for.

First, we saw that performance scaling was dramatically different depending on the title and likely in each particular section of each game. In both Far Cry 2 and Resident Evil 5, the Eyefinity performance results were 2.5-3.5x SLOWER than our results using a single panel.

Also, besides HAWX and Batman, which was basically on medium settings without the PhysX and other stuff (one can argue PhysX is about as necessary in games as 3-monitor support), there's Far Cry 2, which on medium is unplayable imo; I looked at the graph and its minimum frame rates are too far in the crapper.

Out of 5 titles it only kind of works on 3. I say that because RE5 didn't scale correctly:
Resident Evil 5 didn't visually work well in our AMD Eyefinity testing - the resolution just didn't scale correctly.

This tells us that scaling on Eyefinity is really going to be on a case-by-case basis and that simply assuming a 3x performance drop is both wrong and foolish. As we said, even with just a single Radeon HD 5870, Batman and HAWX were completely playable at 7680x1600 with 4xAA enabled! That is an impressive feat even though we are talking about frame rates in the 30s rather than the 60s that I know a lot of our readers will want. To that point, AMD is still committed to bringing CrossFire to Eyefinity in the future, just not soon enough for me.
Even your review, which isn't really praising Eyefinity so much as liking the idea and being hopeful, says that 3-monitor gaming is pretty much meant for CrossFire or SLI.

The performance drop isn't logarithmic as you add AA or bump up the resolution; it's not that cut and dried. You could have very similar performance between three 2560x1600 monitors and three 1920x1200 monitors. It could depend on the ratio, but more likely it depends on the game. Personally, I'd love to see, game by game, Nvidia's solution vs. ATI's, preferably 5850's vs. 470's.

Imo, when it's this extremely game-dependent, is it worth it?
I believe most people on this forum see a gaming system as being made for all games, not just a select few. Especially when you drop $1k-2k on the computer alone, not counting the monitors, that thing should play all games well, not make you fiddle around and baby a few games while only a subset gets any benefit.

It's back to doing things right, not doing everything just well enough.
 


Yeah they are avoiding that headache and instead giving people the mini-HDMI to HDMI headache. :sarcastic:
They aren't avoiding the dongle question; they don't have the option, so there's no question.
And active adapters work, period. It's only an issue when people want to cut corners and go passive without understanding how things work or bothering to look at the support documentation.

This generation they just needed to get Fermi out the door without any more complications, it seems.

And that's it, more than anything else. There's no concern about dongles, any more than there was about which panels to support for 3D stereo; it's just a case of it not being a priority for getting the part to market.
 

This assumes most 480's and 470's won't come pre-bundled with a mini-HDMI to HDMI adapter in the box. Considering it costs us like 5-10 bucks to get one, I don't see why companies that include DVI-to-HDMI adapters wouldn't bundle a cheap adapter with the cards.

Which is a different story when you talk about a $50-120 active adapter needed for DisplayPort, a still pretty unused port, which I find odd as it's royalty-free, unlike HDMI. (Lol, just realized that emoticon is sarcasm, although it's always hard to tell what's meant when people are being sarcastic.)

I'd put mini-HDMI in the "we'll see how it goes" column, as it's relatively inexpensive to fix; still a minor annoyance.

Along with the part where, from what I've been told, you don't need to use DisplayPort to get 3 monitors.


Anyone care to update me: does ATI now allow this, or does it have to be DisplayPort? Can you avoid it by going CrossFire?

I'll also blame users for fudging it up and getting the wrong adapters, but when it costs that much money, it's somewhat expected that people will try to cut corners.
 


Depends on the manufacturer, and likely on the level of the bundle; not all current cards come with DVI-HDMI adapters, or even DVI-VGA, or much of anything. Some of the early cards on Newegg show they come with them; some don't mention them and seem like spartan bundles.

Which is a different story when you talk about a 50-120 dollar active adapter needed for display port a still pretty unused port...
...I will also blame users of fudging up getting the wrong adapters but when it cost that much money it's to be some what expected people will try to cut corners.

Which would be similar to the $250 Active shutter glasses. Both are a luxury for people who don't need it, and easily obtainable for those who are already going to be shelling out money for the other aspects required.

To me it's like people asking to take shortcuts on anything else: you get what you pay for, and if they aren't willing to spend the time to do proper research on this thing they want to spend $500-1000 on (card included), then they don't get my sympathy, and I doubt either IHV is going to be willing to spend time on them either, especially since the info is already there for the taking.
 



by the way I didn't post a link of 5970 and 3 monitors, because you wont read it, you to much of a fan, that even seeing rather not to believe

maybe you can't read, how about a vid:
http://www.youtube.com/watch?v=3axAyLLlHUc

All I know is you do a hell of a job taking everything out of context! 😗
 
For people who hate even a 1mm bezel, you can always get 3 projectors.

You can't argue this: Eyefinity is a nice bonus, something nVidia wasn't able to provide on a single card, and it keeps card costs down; that's a +1 for ATi. Say you have 2 old monitors: you can at least make use of them. It's crazy how you can have 3 independent desktops, one monitor displaying a movie, another for gaming, and a third for online TV and internet browsing. Big companies will go for 6 monitors for sure, considering how many thousands of $ it will save them. And personally, I watched a movie on 3 monitors, and you really can't see the bezels after a while; you're more into the movie than the bezels. New monitors are coming out with nearly perfect bezels.

Now, if there were a way for each desktop to have its own mouse, that would be crazy: replacing 3 PCs with one.

Btw, the higher the resolution, the closer the gap between the two sides. Which is weird, because nVidia has more memory for very high resolutions, yet it still fails against the 5970.

You can run pretty much anything on 3x24" or 27" monitors at 1920x1200, except Crysis and heavy games of course. Crossing my fingers for the 5970 4GB version, which should make use of its 2GB per GPU at such high resolutions. You can play pretty much 90% of all games.

Btw, do you need the adapter if you have a monitor with DP? I'm seeing lots of monitors coming with DP nowadays from Dell, Samsung...
 

=p And you don't get my point.

The point was that a cheaper card like a 5850 or 5870 won't do 3 monitors well, frankly not well enough to be worth paying the premium.

Frankly, I don't care for any 3-monitor solution, via ATI or Nvidia; I made that clear.

But to that extent, Nvidia requiring you to have SLI is not that bad. And you've only linked me vids of a 5970 doing Eyefinity remotely well enough to be worth the massive frame drop.

Way to not get my point, which, if you still haven't gotten it, is:

The GF100 and GT200 requiring SLI for a 3-monitor gaming solution is hardly an issue, considering the frame drop associated with 3-monitor gaming.

You forget the original part I was responding to was about using a single card and the cost of Eyefinity, which frankly I just lol'd at someone thinking was so cheap. The only reason ATI's would be better cost-benefit-wise is that, at the end of the day, you can run 3 monitors off 1 card; but in reality you don't want to game on 3 monitors via 1 card. Which is not that impressive, considering Matrox does that too (minus any form of gaming, of course, lol).

It's interesting that when my focus is the cost of running it, your response is all about how the 5870 can do Eyefinity in games soo well! You claimed HardOCP's little dive into it used a 5870, denying it and being too lazy even to pay attention to the video you linked, where he says he has a 5970: again, a $700 card, for which you could easily get two 470's, or two 5850's, or a Matrox card that won't game for you but will do 8 monitors (the cheaper version of that card, not the $2k one, lol). And when I point this out, you instantly change your position and claim I'm a fan? When you didn't even read the article you linked, which did not praise 3-monitor gaming on a 5870 as super; it said mostly hopeful things, that one day a single GPU could run it, but for now it's a case-by-case basis, not really worth the performance drop. So when I tear down your 5870 Eyefinity gaming love, you go and link a 5970 and say "no problem, you're an nVidia fan, you don't get it," totally ignoring the first part, which was the cost of running it. Rant rant rant.

Maybe you should look at your avatar, then look in a mirror, and see who is the fan of what.
 
Izzy, it's mentioned in an article that 3 monitors drop performance by 50%. That's at 7680x1600 though, i.e. 3x 2560x1600 monitors.

Now since people will usually do 5760x1080 (3x 1920x1080), performance won't be cut in half.

Also, games like L4D, TF2, CS:S, any MMO/RPG, etc., will be perfectly fine.
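Some quick pixel math on why the lower resolution helps, as a Python sketch (actual scaling is game-dependent, as the PC Perspective numbers quoted earlier show, so treat pixel count as a rough guide only):

def pixels(width, height, panels=1):
    # Total pixels rendered per frame for side-by-side panels.
    return width * height * panels

single_30 = pixels(2560, 1600)        # one 30" panel: ~4.1M pixels
eyefinity_30 = pixels(2560, 1600, 3)  # 7680x1600: ~12.3M pixels
eyefinity_24 = pixels(1920, 1080, 3)  # 5760x1080: ~6.2M pixels

print(eyefinity_30 / single_30)     # 3.0x the pixels of a single 30"
print(eyefinity_24 / eyefinity_30)  # ~0.51, i.e. roughly half the load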
 


No offense, dude, but people don't usually do triple monitor at all; it's a minority thing right now.

And no, they won't be perfectly fine. Show me a 5870 running games at 5760x1080 at FULL DETAIL with non-crappy fps by today's standards, and you can argue that a single 5870 is fine for Eyefinity. It simply isn't. My friend tried that with his 5870, and he got **** fps... all the time, unless he massively lowered the graphics settings. The averages weren't terrible, I suppose, but the minimum frames per second were atrocious.

Try Crysis or Metro 2033 or Far Cry at that resolution with moderate or higher settings and tell me the fps you get with a single 5870... 'cause it'll be like 10 fps or lower.

Maybe I'm just spoiled and used to seeing 120+ frames per second..... but I'll be damned if I ever go below 30fps again just for higher resolution and less detail.

I'd rather have one larger monitor with higher detail and high fps, for hundreds of dollars less and no headache over stupid adapters or special monitors, and I would hazard that the majority of the consumer market would as well, considering how many people use Eyefinity 😛
 
You're not getting the point. Let's see a GTX480 do Crysis/Far Cry/Metro at that high a resolution. Oh wait, it can't.

Also, every game is above acceptable frame rates. Unstressed, we can't see more than 30FPS, but when we hit higher brain frequencies (alpha/gamma), we can see much more.

[attached benchmark chart: Three Monitors.png]


 
That example is fine;
the facts originally stated were exaggerated.
That's the new 2GB 5870, not the standard 1GB 5870.
Those numbers aren't three 30" monitors at 2560x1600, either.
So essentially, Eyefinity 6 cards do 3 monitors at 1920x pretty well.
 


I agree with a few of your points, but you have to take into consideration that not everyone plays Crysis, Metro 2033, or Far Cry, especially in EF, unless they have a dual-GPU "massive setup" that will give them playable frames at high resolutions with eye-candy enabled.

120FPS is useless, because normally it means you have v-sync disabled (unless you have a 120Hz monitor), so you end up getting distorted images/jagged edges/screen tearing, when the whole point of having xAA/xAF enabled is to have none of those issues.

You shouldn't be worried about getting 120+ frames; you should be more focused on minimum frame rate. The human eye cannot detect anything over 55-60FPS, so there is no point.

 
120Hz monitors are the good monitors for games, with pretty much forced top-of-the-line pixel draw. The only problem I have with 120Hz monitors is that they're costly, something like $250+, and so far I haven't seen one that does at least 1920x1200, only 1680x1050.

Imo, I don't look for 55-60; I look for ~70 fps, so that my minimum frame rate pretty much won't drop below 60.
 


What game will run at a constant 70+ frames without any drops? AFAIK you would need lots of $$ for a setup that can achieve such results, unless you're playing a non-GPU-dependent game at low resolutions.

 


I'm not getting the point? Apparently you aren't. I'm not saying the 480 can do that either; I know it can't. I'm saying if you want to play every game at high settings at that resolution with decent frame rates, and more importantly decent frame rate MINIMUMS, you need SLI or CrossFire. Throw in DX11 games, especially with ATI cards, and Xfire is a MUST. STALKER with DX11 gets the <10fps I was talking about 😛

And forget the 5870 2GB, which not nearly as many people have as the 1GB. I'm talking about the 1GB 😛; that thing would get eaten by anything graphically heavy at that resolution. I know because, again, my friend tried it 😛

And seeing as we're talking about minorities here, of the minority that already uses 3+ monitors, let's talk about people who DO run triple 2560x1600 or whatever other high-resolution monitors. Try that on a 5870 with max DX11 eye candy and tell me how it goes, even with the 2GB version. It will get obliterated.

I'm not knocking ATI's cards; they're good cards. I'm just saying multi-monitor isn't really feasible right now without a multi-GPU solution, and that gets very pricey if you already bought 3 monitors. I also don't see that changing for a while, with games like Metro 2033 coming out. So hating on nVidia for not giving their 480 or 470 a third graphics port, and for requiring SLI for triple monitor, is just silly, and shows blatant fanboyism towards ATI. That's all I'm trying to get across.

Oh, and OvrClkr, human eyes don't see in "frame rates", and the 60Hz limit isn't even a limit: it varies between people and is only applicable if you're flashing stationary images at that frequency. Once objects start moving on the screen, the human eye can still see "judder", and much higher framerates do actually start making a difference. So eyesight is not a good reason to ignore higher fps.

The only reason fps higher than 60 may not matter sometimes is that most monitors right now only display up to 60Hz. So unless you actually have a 120Hz monitor, anything higher than 60fps won't really be displayed 😛. That is changing as 120Hz and 240Hz TVs get more popular.
 


Where did you get that from??

I have a 22" 1680x 1050 that caps out at 60FPS but only when v-sync in enabled, meaning if I have it disabled my FPS can go beyond 100 depending on the game.

Oh, and OvrClkr, human eyes don't see in "frame rates", and the 60Hz limit isn't even a limit, it varies between people and is only applicable if you are flashing stationary images at that frequency. Once objects start moving on the screen, the human eye can still see "judder" and much higher framerates do actually start making a difference. So eyesight is not a good reason to ignore higher fps.

When I said that the human eye cannot detect anything above 55-60, I meant in general, so if your game "stutters" at that framerate, it's most likely the game and not your GPU.

So yes, anything higher than 60FPS is a waste unless you're benchmarking 😉.


 


No, I don't mean stutters 😛, I mean judder, aka motion blur. As I said, humans don't see a slide show; we see motion. So when we watch frame-by-frame video and stuff is moving on the screen, it can look jittery to us instead of fluid. The only way to solve this is to go above 60Hz, which is why 120Hz/240Hz TVs and monitors exist that advertise reduced-motion-blur technology (http://www.motionflow.net/).

So above 60fps is never a waste unless you're looking at still frames 24/7 on your monitor 😛, assuming your monitor can display above 60Hz. As I said, the standard for most monitors is usually 60Hz-75Hz, but 120Hz and 240Hz are becoming more popular, especially with the rise of 3D movies, which require double the framerate of a normal TV.
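For what it's worth, the arithmetic behind that last point, as a small sketch (active-shutter 3D alternates a full frame per eye; the 120Hz figure is just the common panel spec):

# Active-shutter 3D shows each eye every other frame,
# so the per-eye rate is the panel refresh divided by two.
panel_hz = 120
per_eye_hz = panel_hz / 2          # 60 Hz per eye, same as a plain 2D monitor
frame_time_ms = 1000 / panel_hz    # ~8.3 ms per displayed frame
print(per_eye_hz, round(frame_time_ms, 1))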
 
Argh, my brain hurts reading you guys debating this, especially with all the mistakes, but most importantly...

Try to come back to the GTX470/480 topic instead of derailing into the utility of a single feature outside of those cards, because the sidelines are turning into simple back-and-forth he-said/he-said stuff. :pfff:

A feature is a feature; it may be a checkbox with no utility, or a lot of utility, for you, but it's still a feature, +/-.

Debating whether it's better to play on Ultra High on a single monitor vs. Very High on 3 monitors is not something you can properly value; it will be more or less important to different people, just like the "AA vs. higher resolution" discussions.
 
Well, the cards aren't here yet, but these 2nd-wave cards have cooling that should lower the GPU's actual temps, though I'd say more heat will be temporarily left in the case. That's what exhaust fans are for! Newegg has one GTX 470 model that had 4/5 on it yesterday, but that's changed now to 4/9. 🙁

[image: Palit_custom_GeForce_GTX_470_480_01.jpg]