Radeon vs. GeForce



I think you're missing the whole point. What if ATI finishes its very own Havok-based, GPU-accelerated physics engine (which they are actually working on)? Wouldn't that leave PhysX and nVidia stuck in the past? There's also this thing called OpenCL, which would probably standardize physics acceleration across both camps.

PhysX is good, but making it the most crucial factor in buying a graphics card is a little pointless.
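Just to illustrate the OpenCL point: the same host code sees whatever GPUs are installed, Radeon or GeForce alike, so a physics kernel written against it wouldn't care which camp made the card. A minimal sketch in C (assuming either vendor's OpenCL driver and SDK are installed; these are standard OpenCL 1.0 calls, error checking mostly omitted, link with -lOpenCL):

#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[4];
    cl_uint nplat = 0;
    /* One platform per installed vendor driver (AMD, nVidia, ...). */
    clGetPlatformIDs(4, platforms, &nplat);
    for (cl_uint i = 0; i < nplat; ++i) {
        char pname[128];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof pname, pname, NULL);
        cl_device_id devs[4];
        cl_uint ndev = 0;
        /* The same physics kernel could be queued on any device found here. */
        if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 4, devs, &ndev) != CL_SUCCESS)
            continue;  /* no GPU on this platform */
        for (cl_uint j = 0; j < ndev; ++j) {
            char dname[128];
            clGetDeviceInfo(devs[j], CL_DEVICE_NAME, sizeof dname, dname, NULL);
            printf("%s: %s\n", pname, dname);
        }
    }
    return 0;
}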
 


 
No, but stereo 3D is generally a breaking point for me. I couldn't do without it in single-player horror games like Doom 3 or BioShock, where not a lot of action occurs but it's all about atmosphere.
 
I do have another question. I saw this in another forum: "Wait until Cat 9.1 to see if they work first, before you buy an ATI card." Did the 4870 have issues with its drivers, then? I'm a bit baffled, because I haven't heard of anyone saying it did.



This is another quote, from someone saying he had no problems with the drivers: "The HD4870 BEATS THE STUFFING OUT OF THE GTX260, any model of it.

PROOF >>> http://www.anandtech.com/video/showdoc.aspx?i=3415&p=3

It even pulls ahead of the GTX280. Why do people seem to act like it's not true?

I had the HD4870 512MB when it first came out and the drivers were fine for me. I guess a lot of people have no luck."

I didn't mean to make this a bloodbath over cards; I was just wanting advice. It was stuck in my head that ATI sucked, and I never looked at any specs of the cards or even benchmarks because of it, but I'm glad you guys pointed me to them. Now it's more about what I can get in performance for the money, and I'm thinking that's the 4870. As for all the other stuff on stereo 3D, I don't care, because I know I can't afford it for a while to come, if it even ramps up. And even then I may still not get it.
Thanks, Joe
 


Actually, no, nVidia doesn't have Havok too. And AMD does use something for hardware acceleration in Havok; it's the same thing nVidia can't use... a CPU.

Intel and AMD make CPUs and use Havok. nVidia makes GPUs and a weak mobile processor, and only uses PhysX. Both AMD and Intel could use PhysX should nVidia decide to license it, but nVidia has no x86-compatible processor that could run Havok.

But does it matter to a rig that is so system-limited that any kind of physics would cripple it?

sites that pitted a stock 4850 against a stock GTX+...

Got something that isn't from the summer at the HD4K's launch with day-old drivers, and similarly priced?
 



Nah, it's about the performance boost the Cat 9.1s provide, just like nVidia's driver refreshes boost the GTX's performance.

It usually happens about 6 months after launch when the companies have enough feedback from users and testing to be able to optimize their drivers for games and their new design.

Here's a look at the Cat 9.1s people are talking about (the site's in German). The boost is limited to a few DX10 games, like Call of Juarez, Crysis, and Far Cry 2;

http://www.computerbase.de/artikel/hardware/grafikkarten/2009/bericht_ati_catalyst_9x_beta/7/#abschnitt_call_of_juarez

The thing is, don't count on driver boosts, but be aware of them. Only once they are shipping and most reviews see performance gains should they be included in your decision process.
 
OK, stupid me, I should have checked. This is probably a really dumb question for some of you, but can I use a PCIe 2.0 card in my PCIe x16 slot? The manual says nothing about 2.0, though.
If not, then it's back to the drawing board on everything. Awck!
 
Generally yes: PCIe 2.0 cards are backward-compatible with PCIe 1.0 x16 slots. The only issue might be with the second slot. Check with MSI and make sure it's not one of those slots that defaults to the PCIe 1.0 spec; that would be an issue. Some early dual-PEG boards from ASUS and MSI had issues with the new cards, but there should be sufficient info on their site about it if yours is one of them.

Here's the ASUS FAQ about it;
http://support.asus.com/faq/faq.aspx?no=DE05A01F-3A3B-A859-177B-6A668A2D6982&SLanguage=en-us

Dunno which MSI boards are affected, but I do remember the majority of issues being with ASUS and MSI.

The chipset is listed as one that is PCIe 1.0a, so it might be an issue.
NVIDIA MCP55 = PCIe 1.0a Motherboard
 
I know I have two PCIe x16 slots and some PCIe x1 slots; why they include the x1s, I dunno.
So I should be OK?
This is what my specs list:
2 x PCI Express x16 slots
2 x PCI Express x1 slots
3 x PCI slots
 


I was actually referring to when you're actually playing a video game that supports Havok. If you have an nVidia platform, you can run the game, can you not?

In my understanding, if you run graphics on the CPU, you can't call that hardware acceleration.

And if you run the PhysX drivers without PhysX hardware, you can't call that hardware accelerated.

Now, if you're saying Havok on the CPU is AMD/ATI's way of hardware physics acceleration... that's a little messed up.






Uhh, dude, what are you starting here? Please read back through my previous posts; you're making me sound like an nVidia fanboy, lol.

Like I've said, my tally post was actually useless, since turbo was referring to a GTX, not a GTX+ (performance-wise). My mistake that I didn't notice the "+" there.
 
Regarding stereo 3D, it's actually been around with nVidia for some years now. It isn't really that new, and ATI never adopted it. The problem with it is that LCDs in general still don't have the refresh rate to make it truly immersive, and there hasn't really been a marketing campaign for it, mostly due to the cost of good head-mounted displays. It's not at all unlike the very beginnings of personal computers, really: in the late '70s to early '80s only enthusiasts had their own PCs, much like stereo 3D with head-mounted displays or shutter glasses now.
 


Which PhysX game can you not play on an Intel or AMD platform?

"In my understanding, if you run graphics on the CPU, you can't call that hardware acceleration."

Sure you can; it's not GPU- or PPU-accelerated/assisted, but it's most certainly hardware accelerated. The question is how it scales: if it supports multi-core/multi-threading, then adding more cores adds to the power; move from a dual-core to a quad-core and the physics budget grows with it.
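To make the scaling point concrete, here's a toy sketch in C with pthreads (nothing like the real PhysX or Havok internals, just the general shape of CPU "acceleration"): partition the particle update so each extra core advances its own slice of the world every frame. Build with -pthread.

#include <pthread.h>
#include <stdio.h>

#define N       100000   /* particles */
#define THREADS 4        /* scale this with your core count */

static float pos[N], vel[N];
static const float DT = 1.0f / 60.0f;   /* one 60 fps frame */

/* Each thread integrates its own slice of the particle array. */
static void *step_slice(void *arg) {
    long t  = (long)arg;
    long lo = t * (N / THREADS);
    long hi = lo + (N / THREADS);
    for (long i = lo; i < hi; ++i) {
        vel[i] += -9.81f * DT;   /* gravity */
        pos[i] += vel[i] * DT;
    }
    return NULL;
}

int main(void) {
    pthread_t tid[THREADS];
    /* More cores means more slices advancing in parallel each frame. */
    for (long t = 0; t < THREADS; ++t)
        pthread_create(&tid[t], NULL, step_slice, (void *)t);
    for (long t = 0; t < THREADS; ++t)
        pthread_join(tid[t], NULL);
    printf("pos[0] after one frame: %f\n", pos[0]);
    return 0;
}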

"And if you run the PhysX drivers without PhysX hardware, you can't call that hardware accelerated."

Why would PhysX be the determinant of it? That would be like saying if you don't run an Intel GPU you can't call it graphics.
Hardware acceleration is neither a PhysX invention nor a PhysX exclusive. What about PhysX on the PS3, does that count?

"Now, if you're saying Havok on the CPU is AMD/ATI's way of hardware physics acceleration... that's a little messed up."

Actually, that's Intel's way of hardware acceleration right now, not AMD's, since Intel owns it. And how is that any more messed up than your definition of hardware acceleration being limited to PhysX?

The main issue is, does it result in a better experience?

So far neither has provided much of anything to really be proud of. Shiny physics in games like Mirror's Edge is like making transparent chess pieces: does it make the game of chess better, or is it just fluff? Mirror's Edge just added a bunch of paper, plastic and glass to go 'ooooh, physics'. Meh, just like adding bad HDR to a game.
Gimme game physics that matter. I don't care if it's provided by an 80-core CPU or a 1000-'core' GPU; I want the game to be influenced by it, not just the visuals. Essentially the only thing done so far is like adding a 'physics' texture or bump map to games, not actually integrating it into the game.
 
"Sofar neither has provided much of anything to really be proud of. Shiny physics in games like Mirror's Edge is like making transparent chess pieces, does it make the game of chess better or is it just fluff? Mirror's edge just added a bunch of paper, plastic and glass to go, 'ooooh physics'. M'eh, just like adding bad HDR to a game. "

So why improve graphics at all? It's all just fluff. I mean, the FPS genre hasn't really advanced gameplay-wise since the '90s outside of graphics, IMO. Look at Blizzard: they make Diablo and StarCraft games look almost the same but try changing gameplay. Why haven't they done this with the FPS genre and most other genres?
 


Uhh, did you even try playing a PhysX game on PhysX-accelerated hardware? Notice the difference? Notice the "acceleration" that I'm talking about?

Which is better?




Nobody said hardware acceleration was invented by PhysX. Apparently hardware PhysX acceleration is different from hardware physics acceleration.

As long as there's no separate peripheral intended to do the physics/graphics/sound processing, we coin the term "software" in there, don't we?




Am I not getting through? Fire up a game that has Havok in it, regardless of the CPU/GPU. Will you be able to run the game? Yep.

Now fire up a game that specifically uses PhysX, on any CPU but on an AMD card. Yep, it'll run, but without PhysX acceleration. Now, if you're still going to argue that non-accelerated PhysX (software mode) is the same as accelerated PhysX (hardware), be my guest.

Now, if by chance you weren't impressed by how PhysX works in Mirror's Edge, well, that's typical AMD/ATI fanboy stuff. In my 9 years of PC gaming I haven't seen anything like it; it's like seeing the Doom 3 video with the 9700 Pro on TechTV all over again.

The reason why PhysX doesn't affect gameplay yet (just my hunch) is because not everybody has a GeForce card. DICE isn't dumb enough to focus on just a single set of customers.

You just can't help but be excited about how physics as a whole will mature, especially with DX11 (hopefully) or the OpenCL initiative, even if some people won't admit it.
 


Gameplay is one thing, but remember, interactive physics is what Ageia promised, and still hasn't delivered with PhysX. All graphics makers (from AMD to S3) ever promised was shinier graphics, higher resolutions, and faster speeds.
They delivered that. They didn't promise a change in the game; that's not their area.

Now, both physics and graphics can help the immersiveness of a game, but that's different from what was put up on offer, and what's being offered currently sure isn't worth getting all worked up about as if it's the same level of game-changer that was promised.

The advances in other GPGPU areas are impressive and live up to their promises and most people's expectations, if not people's fantasies. That to me is the difference: PhysX is still early. It's not bad, but it's not the be-all and end-all the PR machine is trying to sell it as.
 
edited to close quote.


Yes, I've played on both, from the early PPU days, and even the PPU-emulated hack running on a CPU. And that's the point: if it's just the improved performance, regardless of whether it's due to an add-in card or to additional CPU cores, it's accelerated. Which is better depends on your idea of better. Some games do get a nice benefit from it, others not so much. Like I said, it's like HDR: some games look good with it, some look unnatural, where you're essentially playing in a debris storm with randomly placed 'make it look physics-y' items. To me the question is whether or not the implementation so far is really a full order of magnitude better than implementations like HL2 and Crysis. And so far, to me, they are not, especially when its biggest games use other engines to drive the game physics (GRAW = Havok, UT3 = Epic's own engine).

"Now fire up a game that specifically uses PhysX, on any CPU but on an AMD card. Yep, it'll run, but without PhysX acceleration. Now, if you're still going to argue that non-accelerated PhysX (software mode) is the same as accelerated PhysX (hardware), be my guest."

It'll run with PhysX acceleration too. Running PhysX through the CPU worked best before Ageia and nV hampered it, and it showed that a more powerful CPU could be of more help. Even now PhysX is still highly CPU-dependent, so how much of it is 'software based' like you say;
http://www.fudzilla.com/index.php?option=com_content&task=view&id=8862&Itemid=34

"Now, if by chance you weren't impressed by how PhysX works in Mirror's Edge, well, that's typical AMD/ATI fanboy stuff."

Sure, that's it. Side by side, you're telling me a little plastic sheeting and some additional bullet particle sprays are that impressive? This is nothing like D3, which made a huge difference in the use of light, shadow and reflection. This is far, far from similarly groundbreaking; heck, GRAW & GRAW2 were much more impressive.

"The reason why PhysX doesn't affect gameplay yet (just my hunch) is because not everybody has a GeForce card. DICE isn't dumb enough to focus on just a single set of customers."

Now who's being the fanboi? You're telling me it's too tough to implement, but that it's so game-changing? C'mon, you can't have your cake and eat it too.

"You just can't help but be excited about how physics as a whole will mature, especially with DX11 (hopefully) or the OpenCL initiative, even if some people won't admit it."

I'm more excited about the future, because that's where it will become worthwhile, whatever implementation becomes the predominant one. But for now it's not there yet, and that's what you seem to have trouble admitting, as if criticizing the current implementation were the same as criticizing the idea and the future of physics. I criticized it when it was Ageia, I criticize it now that it's nVidia, and I'll criticize it if it's something else, as long as it remains the fluff it is right now.

Gimme what Ageia promised 4 years ago, and claimed was their major advantage over the competition when they launched 3 years ago; THEN I might start singing its praises. Until then it's just a 'good idea' without a good implementation.
 


Uhh, before you go ballistic on me, Mr. Mod, wouldn't it be pointless for game developers to develop a GAME ENGINE "right now" that uses that kind of hardware? Considering how much an 8-threaded machine from Intel costs right now, that's one of the silliest posts I've read in a while.

And I think everybody would care about it too, unless you're the Paris Hilton of PC hardware and can afford something like that. It doesn't even exist yet.



Uhh, no, I'm just being realistic. That's the same reason why you're still not seeing "DirectX 10 exclusive" titles. From a business standpoint, alienating one customer subgroup would be a little dumb, wouldn't it? Or do you have another plan? If you happen to have a degree in marketing and know something I don't, I'd be more than happy to learn from you.

You can cry to ATI about why they don't have PhysX on their cards (but I'd be more than willing to wait and see what they can do with their own method).

You sort of remind me of someone here who can never think outside of the box, lol. Look at physics as a whole, I said, not PhysX alone.

And if you carefully read the ending part of my previous post, notice the words "will mature"; in grade-school terms, that's future tense.

Hope this time it gets through.
 


First, what does being a mod have to do with it? Don't bring it into this unless you want me to use it to warrant that. :heink:
Second, it's no more pointless than developing software for any other limited implementation; it's not like we're moving to fewer cores from here.
The point I'm making is I don't care which one brings it to me: Intel's Havok CPU-based solution, nV's GPU-based PhysX solution, Intel's yet-to-be-implemented Havok FX GPU solution, or a third party's own engine (like we hoped for in Crysis, and like they originally said they would bring). As long as the end result is more than shiny physics, I don't care how it's done, as long as it's game physics, which is the area where we're lacking and need realistic physics. The debris physics could easily be pre-determined, and people wouldn't notice the difference between a realistic arc and a pre-calculated arc.
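A toy example of that last point (mine, not from any shipping engine): stepping a falling object frame by frame, the way a physics engine would, versus reading off a precomputed kinematic formula lands within a few centimetres over a full second of flight; nobody watching debris fly would see the difference.

#include <stdio.h>

int main(void) {
    const float g = 9.81f, v0 = 20.0f, dt = 1.0f / 100.0f;
    float y_sim = 0.0f, v = v0;
    for (int frame = 1; frame <= 100; ++frame) {
        /* Per-frame semi-implicit Euler, the way an engine steps debris. */
        v     -= g * dt;
        y_sim += v * dt;
        float t = frame * dt;
        /* Closed-form kinematics: the whole arc known before it happens. */
        float y_pre = v0 * t - 0.5f * g * t * t;
        if (frame % 25 == 0)
            printf("t=%.2fs  simulated=%.3fm  precomputed=%.3fm\n",
                   t, y_sim, y_pre);
    }
    return 0;
}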

"Uhh, no, I'm just being realistic. That's the same reason why you're still not seeing 'DirectX 10 exclusive' titles. From a business standpoint, alienating one customer subgroup would be a little dumb, wouldn't it? Or do you have another plan? If you happen to have a degree in marketing and know something I don't, I'd be more than happy to learn from you."

It's not realistic to say it's so important that it's a requirement, and that no one else can do it, and then at the same time say it's not important enough to limit your customer base over. It's not like making a game DX10-exclusive; it's like not adding DX10 until everyone has it. Implementing the debris physics is still a limitation on play in UT3 as well, so your argument is moot; it doesn't work that way either, so it's not like one is easier to restrict than the other.

"You can cry to ATI about why they don't have PhysX on their cards (but I'd be more than willing to wait and see what they can do with their own method)."

I don't want it on ATI or S3 or Intel if all it has to offer is what's currently on offer. I want nVidia to improve the implementation, stop wasting money on the PR and advertising of this, and spend it on the R&D and implementation of it.

"You sort of remind me of someone here who can never think outside of the box, lol. Look at physics as a whole, I said, not PhysX alone."

"And if you carefully read the ending part of my previous post, notice the words 'will mature'; in grade-school terms, that's future tense."

"Hope this time it gets through."

Nah, you're the one saying there's some special advantage to it that makes it a must-have now, which is the point Pershing was trying to make and 4745454b was countering, and then you comment as if PhysX isn't also the underlying engine; when he mentions that Havok is more popular, you just focus on the hardware acceleration aspect. It has nothing to do with maturing. They both are maturing and promise better futures, but their current implementations are equally bland, since NEITHER offers game-physics GPU acceleration in major titles: PhysX is GPU debris addition, and Havok is CPU game physics without a shipping GPU solution. You want me to consider PhysX maturing, yet you want to ignore that Havok is maturing too and might bring GPU physics as well. You want to talk about future tense? Then I'll go with "in the future game physics will matter; right now it doesn't."

So, for your grade-school quip: a pile of garbage debris is more than nothing, but it isn't necessarily better than nothing for everyone.
 


I have to agree. I've been playing Mirror's Edge and tried it with PhysX enabled. The effects were pretty cool, TBH, and I stopped a couple of times to admire them. Though at a certain point my FPS dropped to almost 10 (I guess the "fog" effect hammers the CPU implementation of PhysX), so I ended up turning it off. It wasn't a real loss enjoyment-wise, and I gained some performance as well.

PhysX was cool, but it didn't really add anything to the gameplay or atmosphere. I would imagine in some of the later levels (where you're generally under more pressure) you probably wouldn't even notice the effects, because you're too busy playing the game. In contrast, the "realistic" lighting in Doom 3 was essential to the atmosphere of the game.

The main problem I have with PhysX is the support, or lack thereof. Nvidia is just trying to push it so that their cards will have an advantage, so it seems more gimmicky than anything. I wouldn't feel any different if ATI started pushing DX10.1 onto developers just so they could say "look what we can do" at the cost of compatibility with Nvidia cards. Havok, on the other hand, is compatible with both Intel and AMD, so nobody is missing out or has to choose between companies based on certain features rather than price/performance.
 


Just to annoy you 😀. Is that bad?



The limitation being that PhysX is owned by nVidia; next, the hardware isn't there yet. Didn't that occur to you? I think it didn't. Like in your previous post:



So if you're, like, developing games, that would've been hard, wouldn't it? Like I've said, if you happen to have a workaround regarding that matter, I'd be more than happy to learn some marketing strategy from you.



1. Typical AMD/ATI fanboys see PhysX as dust/debris.
2. Typical GeForce fanboys see it as the savior of the GeForce.

Well, I see it as a step forward, like I've said. Physics becomes more interesting and exciting as it matures (compare Max Payne 2 physics to anything as of late).





I think it failed to get through.

Again, NO ONE ELSE CAN DO IT, because PhysX is owned by nVidia. Now, if I'm coding the game to entirely depend on PhysX, so that it requires a GeForce as a defining factor in terms of gameplay, that'll be crap. I'd be alienating the (fan)boys with ATI and therefore losing potential customers. And if you can't relate the DX10 point to that, I can't expound further.




Nope, your inner fanboy wants it to be ATI.



There's a difference between running PhysX on a PhysX-enabled GPU versus on non-PhysX-enabled hardware, is there not? Is there? That's the point I'm making, which failed to get through to you.



????, so if I'm maturing I have nothing to do with maturing, unless I'm maturing and promise a better future, in which case maturing is synonymous with having a better future, because if I don't have a better future I'm immature, or I should become ATI to be mature.



I'm sorry, who's ignoring whom?

"Physics as a whole, Physics as a whole.." eh? did i said PhysX alone? apparently when im referring to Physics im referring to havok and physX and any other physics implementation out there (errr notice the word whole?) . i dont know why you're so stressed about it.
well let me quote myself on that which you might've missed.



Happy? You're not a rock, are you? Notice how I separate physics and PhysX in my posts. A grade-schooler would most probably notice the difference in spelling alone.




Who said it isn't?



If ATI had its name on it, you might've drooled at the mere sight of it 😀. And besides, you can opt to not buy a GeForce anyway; nobody's taking that freedom away from you.