7900GTX or 1900XTX

Yeah.

We could talk for hours, get drunk, and wake up in the morning not remembering anything :)

I think ATI and Nvidia are the same thing: same boss, same investors, same profit :)

I don't want to drag AMD and Intel :lol: into this :)

So, to the man who started this topic: whatever you choose, it's the same.

You can rest now.

GrTz!
 
It's a foreign user who's having a tough time with English, to my understanding.

At any rate...

The PR numbers might be better, but the true contrast on a CRT is better.
The darks on my Dell Ultrasharp are darker than my CRT, and the brights are smoother, like with HDR. That's to my eyes, at least.

I'll put my money on OLEDs over SEDs. SEDs are very interesting and have some attraction, but they should have been out a decade ago; with OLEDs coming out, you're going to get all the benefits, including energy savings.
My only concern with OLEDs is their lifespan. Also, if the technology should have been out 10 years ago, why didn't you pioneer it? lol.

7600GT faster than my 6800GT? Hrm?
Sure. I built my brother's PC with an N6800GT from Asus. Nice card, but I think the 7600 is (a) faster and (b) more future-proof.

As far as DirectX 10 cards.... June/July. nVidia 8x00 series.
 
7600GT faster than my 6800GT? Hrm?
Sure. I built my brother's PC with an N6800GT from Asus. Nice card, but I think the 7600 is (a) faster and (b) more future-proof.
I honestly doubt that, considering it has 12 pixel pipes... and pipes are everything for Nvidia cards ~.~
 
Flip a coin 10 times, and whichever side wins, go with that.

Actually, there would be a fair chance of a tie: about 25%, since C(10,5)/2^10 = 252/1024 ≈ 0.246.


Exactly.

Performance-wise they are about the same, or within 1-10 FPS of each other.

Keep in mind both of them released new drivers recently which fix problems and improve performance.
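For the curious, here's a quick illustrative sketch (Python, nothing from the thread itself) of that tie probability:

```python
from math import comb

def tie_probability(flips: int) -> float:
    """Chance a fair coin lands heads exactly half the time (even flips only)."""
    if flips % 2:
        return 0.0  # an odd number of flips can never tie
    return comb(flips, flips // 2) / 2 ** flips

print(tie_probability(10))  # 0.24609375, roughly a 1-in-4 chance of a tie
```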
 
And a dongle? I don't know why anyone would consider that a bad thing; you attach it and you're done.
I prefer the SLI bridge inside my case to the external dongle. My only reasoning for this would be in a quad-display situation where you don't want to unplug and replug your monitors when disabling Crossfire/SLI.
 
Both have progressed significantly over the years. I remember when SLi had to be done with 2 cards that had identical BIOSes and clock speeds.

And there were about 12 profiles; everything else you had to try to do yourself, and there was no WS support. But that's old news, just like the single-link TMDS on the X850 CF cards.

That's not bad. However, ATi needs to drop this retarded dongle and master-card crap.

Unless they do finally go to a 3+ card mode, in which case it's still a better solution than the SLi bridge, which can only do 2 cards at once. Technically the dongle could daisy-chain a large number of cards together, but you need the PEG16X slots to accommodate all the cards, thus a board like the Quad by Gigabyte.

But if they don't do 3+ that way, then the dongle was somewhat of a pointless 'stop-gap' venture while waiting for the extra PCIe lanes on the MoBos.
 
I prefer the SLI bridge inside my case to the external dongle.

And I'm the opposite, since I want everything out of my case if possible (one more thing to create air-current eddies). But really, that's as minor an issue as aesthetics.

My only reasoning for this would be in a quad-display situation where you don't want to unplug and replug your monitors when disabling Crossfire/SLI.

Now that's a valid concern. Technically an SLi rig CAN support 4 monitors when SLi is disabled; however, the Xfire setup will only ever support 3 monitors due to the dongle's special 2-way connector on the master card. Not that this would concern me much, as I prefer 3 monitors (like SurroundView) to 4, but anyone wanting to get the most monitors they can from the setup would lose that, which kinda sucks.
 
The only way I could see the SLI bridge blocking airflow is in a situation where some sort of VERY large aftermarket cooler is used. I don't think you realize it's less than 1 inch thick, so I can't see it blocking much air. My guess as to why ATI doesn't use a bridge similar to SLI's has to do with Nvidia getting their dual-GPU setup to market first.
 
Let's not forget the resolution limits of Crossfire! Right now it's limited to something like 2048xsomething, which is crazy for those who have extreme monitors/setups. I remember when it first arrived it was limited to 1600x1200!
 
The only way I could see the SLI bridge blocking airflow is in a situation where some sort of VERY large aftermarket cooler is used. I don't think you realize it's less than 1 inch thick, so I can't see it blocking much air.

No, no, I realize their size (different base depending on the MoBo, actually, and some are 2 inches, not that it matters as much).

My guess as to why ATI doesn't use a bridge similar to SLI's has to do with Nvidia getting their dual-GPU setup to market first.

Well, actually, from the start ATi had been talking about going bridgeless; I think after testing they realized that the PCIe lanes were giving them some latency and bandwidth issues. And from the start they were always mentioning their previous arrays, which involved 32 cards. I think they always intended multiple cards, but found that MoBo issues held them up, as well as, likely, the software side of things.

Needless to say, neither solution, the bridge or the dongle, is much of an issue in and of itself, but their limitations (only 2 cards vs. only 3 monitors) have a far greater impact on the utility of the solutions, IMO.
 
Let's not forget the resolution limits of Crossfire! Right now it's limited to something like 2048xsomething, which is crazy for those who have extreme monitors/setups.

This limit is about the same for SLi, since the dual-link limit (25x20+ @ 60+Hz) is about the same as that of the 400MHz RAMDACs in the cards, and is well beyond the 30" display standards. So show me an extreme solution that goes beyond that. And with the move to dongle-less, you'd have no hard limit, since there's no transmitter/receiver to worry about. And with the way nV's memory utilization is set up, I doubt anything above 25x20 would be usable with them either.

I remember when it first arrived it was limited to 1600x1200!

That was 1920x1200, BTW, and I laughed at it, because how many reviewers ever benchmark above 16x12 anyway? Even now, many reviewers simply benchmark 16x12 at higher AA levels because they don't have monitors that can go higher.
And when SLi first came out, it couldn't do WS for many months.

Both had rough starts, and both currently sit at about the same niche-imperfect status. The only thing that picks the winner, IMO, is how you use them and which peculiarities you prefer.
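To put rough numbers on that dual-link-vs-RAMDAC point, here's a small illustrative Python sketch; the ~12% blanking overhead for reduced-blanking timings is my own assumption, not a figure from this thread:

```python
# Rough pixel-clock arithmetic behind the "25x20 @ 60Hz" claim.
# Dual-link DVI carries 2 x 165 MHz TMDS links; the cards' RAMDACs run at 400 MHz.
DUAL_LINK_DVI_MHZ = 2 * 165
RAMDAC_MHZ = 400
BLANKING_OVERHEAD = 1.12  # assumed ~12% overhead for reduced-blanking timings

def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Approximate pixel clock a display mode needs, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1600, 1200), (1920, 1200), (2560, 1600)]:
    clock = required_pixel_clock_mhz(w, h, 60)
    print(f"{w}x{h}@60Hz: ~{clock:.0f} MHz "
          f"(fits dual-link DVI: {clock <= DUAL_LINK_DVI_MHZ}, "
          f"fits 400MHz RAMDAC: {clock <= RAMDAC_MHZ})")
```

So 2560x1600 @ 60Hz (~275 MHz) sits comfortably inside both the dual-link and RAMDAC ceilings, which is why the two limits come out about the same.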
 
http://www.guru3d.com/article/Videocards/337/
"The last negative which I hope ATI will resolve is the alarmingly bad support for Radeon graphics cards. Crossfire simply will not work at all. And even if you have bought a 500 USD X1900 XT you will still be limited towards 2400 x 600 vertical pixel lines. That's just not acceptable in my book. Imagine you spend 3x 250 USD on the monitors and 500 on the card, that's 1250 USD alone to see a 600 line vertical pixels limitation (three times 800x600). No sorry, if you own an ATI card then stay away from this solution until ATI actually have support for it."
 
http://www.guru3d.com/article/Videocards/337/

That link is to the wrong page; try this one:
http://www.guru3d.com/article/Videocards/337/2/

Second, the reviewer doesn't know what he's talking about. He's citing the DVI limit of the dual-link cards, but forgets they still have 400MHz RAMDACs and thus support the same full range as all other VGA cards. His list is particularly ignorant based on the following statement:

"you will still be limited towards 2400 x 600 vertical pixel lines. That's just not acceptable in my book.

It's unacceptable in my book too, and of course completely incorrect.
The 3D-window-in-a-2D-desktop figure may be correct; however, based on the 2D and 3D information, I'd question that too, as it seems more like a cut-and-paste, since obviously the actual support is far higher: people have gamed and benchmarked above that level.

There seems to be a driver-support issue for windowed 3D applications, but that would be different from what you're stating, because it's definitely not a hardware limit of the parts, Xfire or no, and it would be the same for the nV cards. In fact, read further:

"In short, ATI cards of course can easily support it yet at the moment of writing ATI refuses to cooperate to get this fixed for unnamed reasons."

Sounds like a driver issue, and sounds like a "we don't want to help the competition who hasn't let us use their SurroundGaming solution".

Also, the author makes it seem like it could be solved tomorrow if ATi really wanted to, and it also sounds like something I, or anyone, could solve with a few tweaks from the right people on Rage3D or 3DCenter.

As much as I love Matrox, 3rd-party apps (as in appliances/applications) don't dictate the limitations of Xfire itself, just the support for their own brand of whatever. It's a completely different issue from what you were first referring to, since the 19x12 limit was hardware, not software.

PS: the author also seems to ignore the hardware limits of DVI when he writes about a possible DVI version of the TripleHead:

"it would probably allow me to go triple 1920x1200 = 5760x1200 :)"

Where does he think this 5760x1200 source is going to come from? Definitely not nV's or ATi's mass-market single-link cards (which would barely drive one of those panels, never mind 3). Sounds more like some confusion on his part.
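To see why 5760x1200 is out of reach, the same pixel-clock arithmetic as in the earlier sketch (again with my assumed ~12% blanking overhead):

```python
# Single-link DVI is one 165 MHz TMDS link; dual-link is 330 MHz.
SINGLE_LINK_DVI_MHZ = 165
BLANKING_OVERHEAD = 1.12  # same assumption as the earlier sketch

needed = 5760 * 1200 * 60 * BLANKING_OVERHEAD / 1e6
print(f"5760x1200@60Hz needs ~{needed:.0f} MHz, vs {SINGLE_LINK_DVI_MHZ} MHz single-link")
# ~464 MHz: beyond even dual-link (330 MHz), let alone a single link,
# and 1920x1200@60 alone (~155 MHz) already nearly saturates one link.
```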
 
I never said it was a hardware problem; I just think it's stupid that ATi fails to cooperate and support it.
 
I never said it was a hardware problem; I just think it's stupid that ATi fails to cooperate and support it.

I agree with you there, it's very stupid (it's not like they have a lock on sales the way nV did during the early GF7 era).
But that being said, it's got little to do with the benefits of the X1K versus the GF7 for all but less than 1% of 1% of consumers. IMO it would be like comparing R2VB or SmartShader: not enough to sway anyone, and not even a tie-breaker for the majority of the buying population.
 
The only way I could see the SLI bridge blocking airflow is in a situation where some sort of VERY large aftermarket cooler is used. I don't think you realize it's less than 1 inch thick, so I can't see it blocking much air.

No, no, I realize their size (different base depending on the MoBo, actually, and some are 2 inches, not that it matters as much).

OK, now you're confusing me. It sounded like you were preferring XF's outside wires to SLI's inside bridge because of eddies? Now what are you talking about regarding "different base"? I mean, really, the SLI bridge eddy is not going to be a significant factor in case airflow.
 
I agree with you there, it's very stupid (it's not like they have a lock on sales the way nV did during the early GF7 era).
But that being said, it's got little to do with the benefits of the X1K versus the GF7 for all but less than 1% of 1% of consumers. IMO it would be like comparing R2VB or SmartShader: not enough to sway anyone, and not even a tie-breaker for the majority of the buying population.
When you pay $1000+ for two graphics cards, you damn well better be able to do high resolution if you please.
 
OK, now you're confusing me. It sounded like you were preferring XF's outside wires to SLI's inside bridge because of eddies?

Yes, airflow, period. I like less stuff in my case: my desktops were cable-tied off, and I spent extra money on ATA cables of just the length I need instead of longer ones; everything is about airflow, and an inch-thick, 2-inch-long piece of additional hardware creates eddies. Is it enough to care much about? Not more than a jumble of cables at the back of a PC, and that's my point. BOTH are negligible issues for the form factor itself; it's the OTHER concerns that are more of a true limitation to the design than the BS aesthetics of dongle versus bridge.

Now what are you talking about regarding "different base"?

Not every SLi solution is the same; some require longer bridges than others. That's why the universal ones are the flexible ones.

I mean, really, the SLI bridge eddy is not going to be a significant factor in case airflow.

And really, what does a dongle matter once it's tucked in the back with the 7 other cables I have? That's my point exactly. However, you'd be surprised at what can and can't mess with your airflow; not that it likely matters to the true enthusiasts, since they should be watercooling anyways, and then once again have a bunch of other cables/pipes to worry about.
 
Do you guys really own a 7900 GTX or 1900 XTX....? I have an eVGA 7800 KO and I can make it crawl in Oblivion... these cards can slow down to unlikeable levels too. I'm fed up with paying 500 bones for a card only to have a game come out a month later that taxes the shit out of it. I liked playing games at 100 FPS or more... but when ya turn up the eye candy on the newest games, it's just sad that 500 bucks ain't shit.
 
When you pay $1000+ for two graphics cards, you damn well better be able to do high resolution if you please.

You can, but if you buy a $200 add-on part, it's up to the add-on people to get support for their feature. ATi can support the exact same gaming resolutions in fullscreen 3D, despite what that Guru3D review says; what they're limited by is the driver support for that competitor's part. The same argument could be made for "I bought $1000 worth of nV/ATi graphics cards, why can't I SLi/XFire on a VIA mobo?"

It's still not a limitation of the primary solution; it's a limitation, in the case you give, of the 3rd-party product not working as advertised for its $200 cost.

If you want to change your FOV and your WS resolution, you can do that on its own; if you want to use Matrox's part, then you'll need to wait for Matrox and ATi to play nice. It's such a minor issue, just like SLi currently being unable to use different-class cards (no mixing a GS with a GT); it doesn't need to be the same BIOS anymore, but it must be the same pipes. But for anyone wanting that solution: if you must buy today, maybe buy nV, and if you can wait 'til there's a tweak for Xfire, then revisit it. Once again, using THAT 3rd-party issue is going to the extreme left field, to say the least.
 
