Patch Makes Nvidia Play Nice With ATI


B-Unit

Distinguished
Oct 13, 2006
1,837
1
19,810
[citation][nom]captaincharisma[/nom]I don't know how long you people have been into computer building, but it's common sense not to use two different products when you're trying to do something like SLI/Crossfire. It's like buying a set of speakers for your stereo and getting a Kenwood for the left channel and a Sony for the right channel: they're not going to have the same quality and may not sync up correctly. So it is RARE, because only people with no common sense, or people who desperately want a good system but have no money, blindly do this.[/citation]
One tard mentions being able to mix-n-match for multi-GPU and you want to target everyone.

What most of us are talking about is owning G80/G92 cards and upgrading to 48xx or 58xx cards and now not being able to use the old card as a PhysX accelerator.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
Of course, the setup of having both Nvidia and ATI GPUs in one system is exceptionally rare

Acecombat up above hits it right on the head with this issue. The same can be said for those who had, say, an ATI X1950 and bought a GTX 285 as an upgrade. If it was ATI that had snatched up PhysX and the tables were turned, it would be the same scenario. People switch sides all the time for their next upgrade. In this example, that X1950 would still be plenty powerful for some things, and selling used cards really doesn't put much money back in one's pocket, so it makes better sense to put it toward a beneficial use. What it really boils down to is that if you show your loyalty to the green team, you will be officially rewarded and allowed to continue using their older cards. I totally understand Nvidia's move, but you, the consumer, forked over the money to buy their card, so you should be able to do what you want with it.

Many thanks to GenL of the NGOHQ Forums; that is, until the big N and its lawyer team make contact with the guy.
 

atomdrift

Distinguished
Oct 29, 2008
9
0
18,510
[citation][nom]warezme[/nom]Oh please, all you ATI guys can put down your BURNING Dirext 11 crosses. It's not a holy war Mr. Jahad. Because it is so important to the 1% of ATI owners who can actually afford the 5870 to play the less than 1% of games available with DX11.Somebody wake me up when DX11 becomes relevant.[/citation]

First of all, it's "Jihad." Secondly, what makes you think DX11 is the only reason anyone would want a new ATI card?

Why don't you pull some more fake statistics out of thin air to help lend credence to your perceived point? Anyone can make up statistics to serve their own ends. Fourfty percent of all people know that.
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
Is it true that 8xxx series NVidia cards work for PhysX? I was under the impression it was only newer cards, but I have to admit I never really paid much attention. I have an 8800GTS 320MB card sitting in a box literally doing nothing. If this patch works and I can use that card for PhysX, that might be exactly the reason I've been looking for to get Windows 7.
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
[citation][nom]atomdrift[/nom]@purplerat: Yes, all GeForce series 8 and newer card can take advantage of PhysX.[/citation]
Thanks. Right after I wrote that, I went to check and found out it would work. Now it's just a matter of seeing whether this patch works (I have a 4870) and whether it's actually worth the hassle, plus the possible need for a new PSU.
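
For what it's worth, this is the back-of-the-envelope math I'd run on the PSU question first. It's a Python sketch, and every wattage in it is an assumed round number for illustration, not an official spec:

[code]
# Rough PSU headroom estimate for adding a dedicated PhysX card.
# All wattages below are assumptions for illustration, not official specs.

SYSTEM_BASE_W = 200    # assumed CPU, motherboard, drives, fans under load
HD4870_LOAD_W = 160    # assumed load draw of the primary (rendering) card
GTS_320_LOAD_W = 145   # assumed load draw of the 8800 GTS 320MB used for PhysX
PSU_RATED_W = 550      # whatever the existing supply is rated for
SAFETY_MARGIN = 0.80   # don't plan on running a PSU near 100% of its rating

total_draw = SYSTEM_BASE_W + HD4870_LOAD_W + GTS_320_LOAD_W
usable_capacity = PSU_RATED_W * SAFETY_MARGIN

print(f"Estimated load: {total_draw} W, usable capacity: {usable_capacity:.0f} W")
print("New PSU probably needed" if total_draw > usable_capacity else "Probably fine")
[/code]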
 

atomdrift

Distinguished
Oct 29, 2008
9
0
18,510
[citation][nom]purplerat[/nom]Now it's just a matter of seeing if this patch works (I have a 4870) and if it's actually worth the hassle + the possible need for a new PSU.[/citation]

Interesting you should say that. I honestly don't know what the big deal is about PhysX. I suppose, from a gamer's point of view, having a game do its physics calculations on the GPU's hardware, instead of in software such as Havok, might allow for more in-depth physics and faster performance. But is it really noticeable? I suppose in some cases it might be, but I'm just not convinced it's worth all the hype it gets.

Maybe the only time it's really worth the fuss is with some industrial design applications written for CUDA? Should gamers really give a rat's ass?
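
To make it concrete, the work being argued over is basically this per-object update, repeated every frame for every rigid body in the scene. This is a minimal generic sketch (semi-implicit Euler integration in Python), not PhysX's actual code; the whole pitch is that a GPU can run thousands of these updates in parallel instead of the CPU looping through them one by one:

[code]
# Minimal sketch of the per-object work a physics engine repeats every frame.
# Generic semi-implicit Euler integration; nothing here is PhysX-specific.

def step_object(pos, vel, force, mass, dt=1.0 / 60.0):
    """Advance one rigid body's velocity and position by one frame."""
    ax, ay, az = (f / mass for f in force)                        # a = F / m
    vel = (vel[0] + ax * dt, vel[1] + ay * dt, vel[2] + az * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt, pos[2] + vel[2] * dt)
    return pos, vel

# A CPU engine loops over every object in turn; a GPU can do them all at once.
gravity = (0.0, -9.81, 0.0)   # with mass 1.0, this doubles as the net force
objects = [((0.0, 10.0, 0.0), (0.0, 0.0, 0.0))] * 10_000
objects = [step_object(p, v, gravity, mass=1.0) for p, v in objects]
[/code]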
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
That's what I'm trying to figure out: is it even worth it or not? But everywhere I look, all I tend to find is a pissing match between AMD and NVidia fanboys.
 

Spanky Deluxe

Distinguished
Mar 24, 2009
515
7
18,985
[citation]the setup of having both Nvidia and ATI GPUs in one system is exceptionally rare[/citation]

What about people using nVidia motherboards with integrated graphics chips? A motherboard with a 9400m built in and a 5870 for graphics could end up being a fairly common setup: the 9400 could handle the PhysX calculations and the ATI card the graphics. Looked at this way, it would make more sense for nVidia to keep it enabled, so that people chasing the fastest GPU configurations might still be swayed to at least buy an nVidia chipset-based motherboard.
 

iocedmyself

Distinguished
Jul 28, 2006
83
0
18,630
[citation][nom]warezme[/nom]Oh please, all you ATI guys can put down your BURNING Dirext 11 crosses. It's not a holy war Mr. Jahad. Because it is so important to the 1% of ATI owners who can actually afford the 5870 to play the less than 1% of games available with DX11.Somebody wake me up when DX11 becomes relevant.[/citation]

Oh, like the Nvidia fanboys were doing with the 8800 and DX10 three years ago? You don't even have to consider DX11 to see that Nvidia is royally screwed right now.
2.8 teraflops of computing power on a single GPU. A 16% performance boost compared to a dual-GPU 4870X2, while also dropping load power consumption below that of a single 4870 card and cutting idle power by something like 70%.
Six 2560x1600 displays off a SINGLE CARD (7680x3200).
As of today that means a possible 10.4 teraflops of processing power in a four-card setup, and a possible 24-monitor desktop array in a 6x4 config would put it at 15,360 x 6,400.
The 5870X2 will be coming out in a couple of weeks, putting a single card at 5.2 teraflops.
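
The display math at least checks out, for anyone who wants to verify it. Plain arithmetic, with the panel grid layouts (3x2 for six screens, 6x4 for twenty-four) being the only assumption:

[code]
# Sanity check on the multi-monitor resolutions quoted above.
PANEL_W, PANEL_H = 2560, 1600

# Six panels off one card, assumed arranged 3 wide by 2 high.
print(f"{PANEL_W * 3} x {PANEL_H * 2}")   # 7680 x 3200

# Hypothetical 24-panel array, assumed arranged 6 wide by 4 high.
print(f"{PANEL_W * 6} x {PANEL_H * 4}")   # 15360 x 6400
[/code]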

Nvidia tried to pass off a card that didn't even have the molex power connectors soldered to the board while giving a "demo" of the GTX 3**, because the damn cards are having nothing but problem after problem in production and design. They are six months, minimum, from getting anything on the shelves, and either it will be so horribly overpriced that nobody buys it, or they will try to drop the price to compete with ATI, in which case they will just go bankrupt.

But yes, cling to the idea that DX11 is all ATI has over Nvidia.


PhysX is just another one of those features reserved for bragging rights in synthetic benchmarks. Very few games support it, and I haven't seen much software being developed other than what Nvidia spouts about the "potential." It could be great, but it wasn't before they pulled this crap, and even less so now.

This came about more because of Intel's motherboards, since, as someone linked, the Lucid chipset lets you mix and match. They get to screw both ATI and Intel in one move, while screwing themselves even more than they already have with their products.

I used to love Nvidia GPUs, but then the X850XTX came out... at which point they were putting a lot more effort into chipsets, which were awesome for a while... and then the CrossFire 3200 chipset came out and they didn't look quite so good anymore. It's not always a matter of playing favorites but of being realistic: they don't have anything going for them that is unequivocally good.
 

tester24

Distinguished
Jan 22, 2009
415
0
18,780
[citation][nom]climber[/nom]As far as suing Nvidia because one of their core technologies is not compatible with a competitor, well... too bad. Nvidia should be able to do whatever it wants with its intellectual property. PhysX is just another feature of Nvidia cards nowadays. There could come a point where Intel disables your ability to have PhysX engines working on their hardware because they support and own Havok, since those two technologies compete and we know Intel likes to drive the competition into the dust.[/citation]

So you're saying it's OK for a company to deliberately stop you from using their product to the fullest if you are also using one of their competitors' products? That's just dumb, especially because if I were a lawyer I would see this as hindering competition: making you have to buy a newer and most likely more expensive Nvidia card over the ATI brand because you want PhysX.

And it's not that they were always incompatible; Nvidia did this on purpose!

I thought this whole business of one technology not working nicely with another was done back in the day, when you had limited choices on which video cards played nicely with which sound cards.
 

kronos_cornelius

Distinguished
Nov 4, 2009
365
1
18,780
[citation][nom]Renegade_Warrior[/nom]Strike 1 for the Good Guys! Back to the Time out bench for nVidia[/citation]
I too want to keep my 8800 GTS for PhysX; I guess the author forgot the audience for this website. I will stick with Nvidia though, and just enjoy the lower prices when they show up. I do the same for CPUs: I get AMDs. Sticking with AMD is also out of consumer loyalty; I still remember the years when it was beating Intel's MHz with a better design. I like Intel too, though; I have some computers with its chips.
 