Switching from NVidia to ATI


RabidSnail

Honorable
May 4, 2013
105
0
10,680
I recently got a third monitor and have been messing around with NVidia Surround long enough to hate it (about a week). My current cards are slightly outdated (GTX 460) and I have been thinking of upgrading anyway, so I'm making the switch to ATI cards. Unfortunately, I am undereducated on anything to do with ATI.

My long-term goal is to have quad CrossFire running 4 monitors. My thought was one card per monitor, but I have recently been advised that more than 2 cards in CrossFire tend to get clunky and don't receive much of a performance boost. (True?) The monitors run at 1920x1080, and I would be using 3 in Eyefinity and one above them in extended mode. (I might have the terminology wrong, but I think you get the intent of what I want to do, if it's possible.)

My main usage for my computer is architectural programs such as Revit, CAD, Sketchup, and Rhino, but I also do some gaming with titles such as Skyrim, Bioshock Infinite, and Ghost Recon. Due to the quantity of cards I would like, I wanted to keep it under $250 apiece, but if the general consensus is to only run dual CrossFire, I can push that a little. The date of purchase will be mid-June-ish.

I was considering this card, but like I said, I don't know enough to defend why I chose it.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150605

Any advice is appreciated!
 

jdowdy10

Honorable
Apr 12, 2013
82
0
10,640
I agree with nguyenm: two 7950s. My opinion is that the frame-metering argument is pointless because of vsync and the ability to edit config files for most games to lower the frame buffer.

Skyrim falls into the second category. Once you edit the ini file to render frames on the fly and use a frame limiter, you don't even need vsync. It just plays better as well.
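For reference, the vsync side of this edit lives in Skyrim's ini file; the frame-limiter part is done with an external tool (RadeonPro, MSI Afterburner, etc.), so treat this as a sketch of the idea rather than a complete recipe:

```ini
; Skyrim.ini (in My Documents\My Games\Skyrim)
[Display]
; 1 = vsync on (default). Set to 0 to disable vsync and let an
; external frame limiter cap the frame rate instead.
iPresentInterval=0
```

With vsync disabled this way, the frame cap you set in the external limiter does the smoothing, which is what "frame limiter instead of vsync" means here.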

These are my opinions on the subject, definitely not fact, but I only CrossFired my cards for a short time, and with vsync on, performance was not an issue on a 1080p television (2x 7850). All of my games locked at 60 fps, and that is all I care about.
 

Jordan Cayco

Honorable
Apr 15, 2013
34
0
10,530
Just be sure to look for AMD GPUs (any board partner) that have at least three DisplayPort outputs, and buy a Sapphire active DVI-to-Mini-DisplayPort adapter (if it's Mini DP) so you won't get screen tearing on any of the monitors.
 
I was attempting to let the OP know of the pitfalls of CrossFire before making the switch, so he is aware. The point is Novuake was acting like all the articles are untrue, which you all know is BS, as you guys put FPS limiters on. Obviously, limiting your FPS is not ideal, especially in cases like Skyrim, where indoor and outdoor FPS are drastically different, meaning you have to limit yourself to the worst-case scenario.

I have no issue if he still wants to go with CrossFire, only he should not go in blind to the negatives.
 


What!?
 


If you are getting stutters outside at FPS greater than 60, and stutters inside at FPS greater than 90, you'd have to set the FPS limit to 60 FPS. That is what that means.

(The numbers are made up just so you can follow the concept, but it's one I've seen discussed by CrossFire users in that game.)
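The worst-case-cap reasoning above can be sketched in a few lines (the scenario numbers are the illustrative ones from the post, not measurements):

```python
# Choose one global FPS cap from per-scenario stutter-free ceilings.
# The cap must not exceed the lowest ceiling, or the weakest scenario
# (outdoors, in this example) will still stutter.
def pick_fps_cap(ceilings):
    return min(ceilings.values())

# Illustrative numbers from the post: stutter above 60 fps outdoors,
# above 90 fps indoors.
ceilings = {"outdoors": 60, "indoors": 90}
print(pick_fps_cap(ceilings))  # 60
```

In other words, the single cap is dictated by the worst area, which is why a game with a big indoor/outdoor gap leaves performance on the table.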
 

jdowdy10

Honorable
Apr 12, 2013
82
0
10,640


I agree he should be aware; however, if he can keep the "worst case scenario" above 60 fps (or even 30 fps), then everything should work fine.

If the OP intends to build after July, I would wait to see what kind of progress AMD has made and make the decision based on his/her own opinion. There are going to be a lot of flame wars around that time regardless of whether AMD nails it or botches it.

 
If you want dual cards, stick with NVidia. General advice for either AMD or NVidia is to get the single most powerful card you can afford before going dual cards. Average performance benchmarks do not show the glitches of CrossFire setups; look at recent FCAT-filtered benchmarks for a more realistic view of performance.
 

Eugen M

Honorable
Feb 20, 2013
17
0
10,520
Personally, I avoid anything multi-GPU, be it SLI or CrossFire; both are poorly supported and scale worse the more cards you add.
Pick one strong graphics card and all that nuisance is solved. If memory serves, ATI cards still have better AA and AF than NVidia when it comes to displaying them; ATI focuses very hard on improving them almost every generation, while NVidia doesn't give a darn and leaves its implementation outdated for generations and generations.
Some people will believe that the game is the final authority over how the final picture is displayed on the monitor. That would only be true if a GPU's sole job were to output an already-processed picture. So, one piece of advice: don't fall for it. Even with the exact same settings, the picture can look significantly different from one brand to the other (ATI/AMD vs. NVidia).
 


While AMD and Nvidia have different strengths and weaknesses, you are way off on some of that, and it clearly sounds like a complete in-your-face fanboy remark. If you want to claim Nvidia has poorer IQ, at least bring something to back it up. Btw, Nvidia has added FXAA, TXAA, Adaptive VSync, and boost tech in the last couple of years; how is that not giving a damn?
 

Eugen M

Honorable
Feb 20, 2013
17
0
10,520


Hello. It is quite common to confuse image quality with instruction sets or "features"; don't worry about it. I don't know why I should do the search for you, but I'll make a small exception:
http://www.semiaccurate.com/forums/showthread.php?p=80799
This was an older-generation comparison. The reason I posted it was to make you aware that these differences in the final processed output truly exist; it is easily noticeable pretty much everywhere.
It is OK, and not shameful, to ask valid questions; it can benefit the person asking and the community, since this is a public forum. Please keep in mind that random accusations against people you do not know will not make a conversation polite or educational.
 


You claimed Nvidia didn't give a damn and had inferior IQ; the article I showed demonstrated AMD had inferior IQ, though it sparked a fix by AMD. How is that not related?

Your whole topic is BS. Show something to prove your theory, other than random talk.
 

Eugen M

Honorable
Feb 20, 2013
17
0
10,520
Hello. Because I have politely tried to warn you about your unfortunate attitude toward this subject and it has persisted, this will possibly be my last post to you regarding it.

If you do a proper comparison between the bugged drivers and the updated drivers on the same game, you will see no difference in anisotropic filtering quality between the old and new drivers in the same test I showed you above.
According to AMD, the updated drivers simply changed a setting within the software so it could display the textures correctly. The new and old drivers did not change the image quality of the anisotropic filtering in the exact same test; they merely changed a setting so that certain situations where a texture was not displayed correctly would be, as the culprit was software-side, not hardware-side. Furthermore, different cards, even from AMD, running the same driver version still show a considerable amount of difference in AF quality between an older generation of graphics card and a newer one, specifically from the HD 6000 series to the HD 7000 series.
So I have to repeat: you are confusing a software-related bug with the way the hardware processes that feature. According to AMD again, by no means did the software update degrade the quality of the anisotropic filtering to match Nvidia's, nor did it degrade performance, as this was caused not by the hardware but by an overlooked software setting.
 
You never showed a hardware problem with Nvidia, or that Nvidia has never improved theirs. All you showed was a post saying the 6000 series improved on the 5000 series, and from what I read in your link, Nvidia was better than AMD in IQ; none of it pertains to current hardware.

So again, show something useful. I can say random things about anything. It doesn't make it true.

Note: If you post a link to any professional articles on the topic with current hardware, I'll take it seriously.
 


Stop, please please stop. Really...