Radeon vs GeForce



If you happen to have a more up-to-date review, then post it.

Those are the only ones I found that compared them at stock. All the review sites that came up with the newer version of CCC happened to test a factory-OC'd 4850 against a 9800GTX+ at stock, or vice versa (Tom's review had a slightly OC'd 4850). If you're assuming Nvidia's newer driver releases made the 9800GTX+ weaker, you might have a point, but as far as games with fully updated drivers are concerned, games that favor GeForce still favor GeForce, and games that favor the 4800s still favor the 4800s (CoD4, GRID, HL2, FC2 DX10). (Didn't Tom's do an article about the 8.6 vs 8.8 drivers?) So why are you upset, Mr. Mod? You could've posted a more up-to-date article, if you can find one, instead of being sarcastic.

If you have the audacity to come up with another conspiracy theory and postulate about CPU-based physics again, feel free to post.

So which is more accurate: comparing two cards at stock, or posting a review where one is OC'd and the other isn't?



Since when? 2007? It's not like Intel rebuilt the whole Havok engine from the ground up when they acquired it, did they? And why should I care if Intel owns Havok when it runs on an AMD machine? What's the point?

Like I've said, typical ATI fanboys will just see it as shiny dust. If those effects were really doable without hardware acceleration, we should have seen something like them in FO3, and I wonder why you're not cursing at that game for having mediocre Havok effects. Now if the Havok in FO3 happened to give you a WOW EFFECT and MEdge didn't, well, that's typical AMD fanboy stuff.

If you've seen anything comparable to the PhysX effects in MEdge before, name that game. Trust me, you can't.




How many CPU cores does it take to achieve the same thing at playable frame rates?
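(A rough illustration of the question above, not from anyone in this thread: a minimal C++ sketch that spreads a simple debris-particle update across a configurable number of CPU threads and times one frame's worth of work. The particle count, time step, and the absence of inter-particle collisions are all illustrative assumptions; a collision-heavy effect like the MEdge glass and cloth debris would scale far less cleanly across cores than this embarrassingly parallel loop does.)

```cpp
// Minimal sketch (illustrative assumptions only): naive CPU debris update split
// across worker threads, timed against a ~16 ms frame budget.
#include <chrono>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Advance one slice of particles by a single time step (gravity + crude ground bounce).
static void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;             // gravity
        p[i].px += p[i].vx * dt;
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
        if (p[i].py < 0.0f) {              // bounce off the ground, losing half the energy
            p[i].py = 0.0f;
            p[i].vy = -0.5f * p[i].vy;
        }
    }
}

int main() {
    const std::size_t kParticles = 200000;   // assumed debris count, purely illustrative
    const float dt = 1.0f / 60.0f;           // one 60 fps frame
    std::vector<Particle> debris(kParticles, Particle{0.0f, 10.0f, 0.0f, 1.0f, 0.0f, 1.0f});

    unsigned maxThreads = std::thread::hardware_concurrency();
    if (maxThreads == 0) maxThreads = 4;     // hardware_concurrency() may report 0

    for (unsigned threads = 1; threads <= maxThreads; threads *= 2) {
        auto start = std::chrono::steady_clock::now();
        std::vector<std::thread> workers;
        std::size_t chunk = kParticles / threads;
        for (unsigned t = 0; t < threads; ++t) {
            std::size_t begin = t * chunk;
            std::size_t end = (t + 1 == threads) ? kParticles : begin + chunk;
            workers.emplace_back(integrate, std::ref(debris), begin, end, dt);
        }
        for (auto& w : workers) w.join();
        auto ms = std::chrono::duration<double, std::milli>(
                      std::chrono::steady_clock::now() - start).count();
        std::printf("%u thread(s): %.2f ms per frame step\n", threads, ms);
    }
}
```

Because each particle here is independent, the timing scales almost linearly with thread count; the real answer to "how many cores" depends on how much of the work is collision resolution, which does not parallelize nearly as neatly.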



And why would I be bothered if PhysX/Havok dies once game physics becomes standardized with DX11/OpenCL? Did I say PhysX will be the standard for it all? I've mentioned this before: PhysX will be another Glide scenario if this continues.



So you're assuming I'm a PhysX fanboy just because I liked the hardware-assisted PhysX effects in MEdge? Since you mentioned God in there, divine intervention is needed to put something in that shell you call a head. Now if this proves too trivial for you, you know what I'll call you, bone---- 😀.




Really? Wow. And the results are actually the same.



Nope, that's what boneheaded ATI fanboys consider blasphemous idolatry. Like I've said, I never singled out nor ignored anything (which you keep putting in my mouth that I did, lol) when I mentioned that physics as a whole will mature.

@rangers:

Oooh, the red "rangers" lives on. Apparently the name I branded you with suits you well. You still complain about the caviar while you're munching pork and beans?
 
If I need to go out of my way spouting the greatness of DX10.1, I'll also have to know its shortcomings. Yes, it's been seen in a couple of games where fps go up using it, which would allow for more... PhysX, but guess what? nVidia doesn't have true DX10.1 and can't get the full benefit from it, so the lag from using PhysX will be as bad as before. But, again, who wants DX10.1?
I think PhysX is a step in the right direction, but it'll most likely become a moot point soon, just like DX10.1, when DX11 comes out. Should I be compelled to go out and buy the two games that use full DX10.1? Not really. Same with ME, unless I like the game. It's nice to have a little more fps or eye candy, but it's not a deal maker or breaker, regardless of which card I'm using.
 


Are you talking from a developer's POV or from a consumer's? AFAIK, developers need a license to use Havok, so I won't call that free. I don't see any GPL license tagged on Havok.

From Wikipedia:




Now if you're suggesting Nvidia charges extra for PhysX, I don't remember paying anything to download the PhysX driver from Nvidia (in the pre-GPU-PhysX days, the software PhysX I liked in MOH: Airborne). I don't recall putting in my credit card number anywhere.

And I don't recall paying anything to download the ForceWare drivers with PhysX.

I don't recall the prices of the GeForce 8/9 series going up after they officially supported GPU PhysX.

It's a bonus for GeForce 8 and above owners, that's it.
 


Nvidia DOES charge a licensing fee to developers to use the PhysX API, and those developers WILL pass that charge along to the consumer when they sell the games.

 
Here's the thing: charging for something per user where there's no real competition is one thing, while charging for a license is something different, and Badaboom they charge per user. It's a fully functioning usage but IMO shouldn't be charged for. If nVidia really wants to shine and expand, they wouldn't do this. However, with LRB just around the corner, they're sticking it to anyone who wants it until LRB comes out, as they've seen the writing on the wall. I've already said I commend nVidia for their efforts, but this too is a no-win scenario for them, and it'll be somewhat back to step one once the standards are set. But again, it's no deal breaker.
 


Which is fine when it's a GTX+ vs HD4850 instead of GTX vs HD4850, or do you think only other people should play fair?
Seriously, it was a simple request without sarcasm until you added it: got something more recent than the summer that pits virgin HD4K drivers against an EOL product with optimized drivers?




For the task, probably the one that applies to the cards being tested; for what people would do when buying, probably the one that pits equally priced cards (well, even the factory-OC'ed HD4850s are usually cheaper) against each other in a current situation, which would be the OC'ed HD4850 versus the GTX+, both with the latest drivers and games.
That seems pretty fair and pretty straightforward, and there are a bunch of those. Sound fair? :heink:



The point is rather simple, which is why you miss it and try to turn it into another discussion, like your other ones. You say I'm against PhysX because I'm an ATi fanboi pumping Havok, but Havok is owned by intel, and was previously the darling of nV and ATi, with nV actually being the launch partner for HavokFX with their SLI physics. That's the point: it's not about being an AMD/ATi fanboi just because you're an nVidia fanboi, it's about not wanting shiny physics before, during, or after the PhysX era, regardless of who brings it to me. If I were promoting Havok for the sake of Havok, that would make me an intel fanboi, not an AMD/ATi one, despite your charges to that effect. That's the point. Pershing following that up with the claim that people only used Havok because they were forced to with ATi cards was similar to that reaction of yours; it shouldn't be about people using one or the other because they have to, it should be a moot point because everyone would want to. So far, we're nowhere near there. And the fact that MEdge's implementation lets you run it via the CPU means it's all available to everyone, unlike what you said.
Game physics, not shiny physics, that's the other simple, straightforward point, and as I pointed out, I'm far from the only one who feels that way. I've also never changed that position regardless of the implementation or whoever owned it. I've always been about the bullet drop, not the bits of crap flying through the air or more ragdolls. :pfff:




Why would I curse a game that doesn't sell itself as the be-all and end-all of physics implementations? I definitely haven't been pimping the physics in Fallout 3, or in Oblivion, on which it was based. I had high hopes for it based on the pre-release PR, just like Crysis, but the reality was disappointing, and just before launch both made it clear they had fallen short of their expectations and didn't include the level of physics they had hoped for. I don't think you've seen me praise, or would see me praise, the physics in FO3; I know that, like Oblivion's, they aren't impressive, and they're only a step up from a game like Morrowind, but not necessarily from a game like HL2. So you're essentially saying I should be more appreciative of the debris in MEdge because it's a better kind of disappointing? The only game that has impressed me so far was HL2, and that's because next to no one else was at that level, and I didn't care whether it was Havok or Novodex behind it; the only thing that knowledge did was unrealistically boost my hopes for Oblivion, because it was Havok too and Bethesda said they were improving their physics over Morrowind. Which just goes to show you, both can disappoint.



Why would I want something comparable? I don't, that's the point; I don't want the 'bad HDR' / 'shiny physics'. You seem to think it's about that; you go out of your way to steer everything toward physics in general, but take exception to the fact that we focus on PhysX being the poor implementation, not being able to split the two.
It's like saying shadows = good, self-shadows = good, Oblivion self-shadowing = bad; therefore you draw the conclusion that everyone else is saying shadows are bad and they must be Fable or WoW fans. Nah, I've always been in the physics is good, gameplay physics is good, and the future will be nice camp, but I've also remained in the shiny physics is bad camp, and the ownership change of PhysX hasn't changed where I stand on that, nor has it changed the implementation, other than that you no longer need to buy a PPU which has no other task; that takes away one fault, but doesn't address the others, while adding CPU overhead as a drawback. It still doesn't speak to 4745454b's statement that a larger number still use Havok, so saying that nV is the future is just an empty statement, especially as we get to the next point...



Because that reinforces the idea that it doesn't matter, and it's NOT the future, unlike what Pershing said. You do realize that was the impetus, not whether someone does or doesn't like Mirror's Edge as a game, or is against physics in general. It's specific to PhysX, so OCL and DX11 pretty much run counter to that statement, especially since nV isn't abandoning CUDA for them, nor directly changing it for them. You'd have to run a wrapper or change the code again to make it work, thus removing the benefits of OCL and compute shaders. Since it's a standard C/C++ library, the best way would be to go straight through without a second layer, meaning it would be available to all at that point. That's why it matters that the future is more agnostic than the hardware-specific CUDA+PhysX implementation, and also why it runs counter to that original post by Pershing.
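(To make the "straight through the agnostic standard, no second layer" point above concrete, here is a minimal, hedged OpenCL sketch of the same kind of debris integration: the kernel is plain OpenCL C that any conforming driver, AMD or nVidia, can build at runtime, with no CUDA-specific layer in between. The kernel name, particle count, and data layout are illustrative assumptions, error handling is trimmed, and you would link against the OpenCL library to build it.)

```cpp
// Minimal sketch of a vendor-agnostic GPU physics step through the Khronos OpenCL C API.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kKernelSrc = R"CLC(
__kernel void integrate(__global float4* pos, __global float4* vel, float dt) {
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;          // gravity
    pos[i]   += vel[i] * dt;         // Euler step
    if (pos[i].y < 0.0f) {           // crude ground bounce
        pos[i].y = 0.0f;
        vel[i].y = -0.5f * vel[i].y;
    }
}
)CLC";

int main() {
    const size_t kParticles = 200000;    // assumed debris count, purely illustrative
    std::vector<cl_float4> pos(kParticles, cl_float4{{0.0f, 10.0f, 0.0f, 0.0f}});
    std::vector<cl_float4> vel(kParticles, cl_float4{{1.0f, 0.0f, 1.0f, 0.0f}});
    cl_int err = CL_SUCCESS;

    // Pick the first GPU the runtime reports, whichever vendor it comes from.
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    // The driver compiles the portable kernel source at runtime.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kKernelSrc, nullptr, &err);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "integrate", &err);

    size_t bytes = kParticles * sizeof(cl_float4);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, pos.data(), &err);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, vel.data(), &err);
    cl_float dt = 1.0f / 60.0f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(kernel, 2, sizeof(cl_float), &dt);

    // One particle per work-item, then read the result back.
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &kParticles, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, dPos, CL_TRUE, 0, bytes, pos.data(), 0, nullptr, nullptr);
    std::printf("first particle y after one step: %f\n", pos[0].s[1]);

    clReleaseMemObject(dPos); clReleaseMemObject(dVel);
    clReleaseKernel(kernel); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
}
```

The host side talks only to the standard C API, which is the practical meaning of "available to all": the same code path works on either vendor's hardware, whereas a CUDA-PhysX path would need a wrapper or a second implementation on top of it.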



Same reason you call other people fanbois and think that, since we don't like the shiny physics implementation currently and previously used in PhysX, we're against all physics. If your position were forward-looking about all physics like you pretend, then instead of talking only about maturing PhysX you would talk about maturing Havok and how they both would be replaced, but that's not what you did, was it? If it were about physics as a whole, then the comments about shiny physics shouldn't have elicited the fanboi comments on your part, because then it would be about physics or no physics, not AMD/ATi/Havok/intel/etc vs nV, or is that me putting words in your mouth?

Seriously, perhaps if you focused less on the fanboi and name-calling crap and more on the physics, you wouldn't have missed that part. [:thegreatgrapeape:6]
 
Seriously, this whole debate is useless. Yes, PhysX might look pretty; yes, it might hold promise, but only if all sides have access to it!

IMO physics will have its place in next-gen games, but it will NOT be implemented with nVidia PhysX. Seeing as how OpenCL provides the same (or better) implementation of GPU acceleration, and it can work on both sides of the fence, IT will become the standard.

I like Havok; I loved picking a pumpkin off the table and throwing it at a stack of plates in Oblivion, just to see the pretty physics effects. But Havok is simply not as powerful as GPU-accelerated physics is/will be.

This petty bickering/name-calling is doing nothing to further the debate and is getting nowhere near the point. The whole thread is a fanboi trap anyway. "Radeon vs GeForce", seriously.

End of thread as far as I'm concerned...
 


Uh, didn't I admit that I made a mistake not noticing the + in tflame's post?



Apparently the question was about performance alone (which is fastest, not which is more reasonably priced), which is already moot since my reply was pretty pointless (tflame's post was about a GTX). Apparently you didn't read.

And no particular brand/version was mentioned; in that case, stock cards are the best comparison, which is why I posted the stock reviews. In typical fanboy fashion, you brought up the "driver" issue.



I'm sorry? How did I become an Nvidia fanboy? Because I liked the PhysX in MEdge again? Like I've said, quote one completely Nvidia-fanboy comment I made.



Uhhh, no. Because right now Havok is AMD/ATI's only answer to PhysX. Isn't that why they're (sort of) working on bringing Havok to the GPU? And when I mentioned that Nvidia can use Havok too, you went ballistic, didn't you (you postulated some CPU hardware physics crap)? You failed to realize I was referring to a platform.



Seriously, DICE and Nvidia did this? Did they actually claim MEdge would be the end of all physics implementations?



Ohhh, so in the future those PhysX effects can never be recreated on a standardized physics engine. Good call. Absolutely profound. Remember how good an alternative D3D and OpenGL were to Glide?

Now I hope this resolves the notion that I'm only focusing on PhysX, the way you want me to be, lol, so that you can make me look like a fanboy.



Is this somehow related again to me liking the PhysX in MEdge? Can you also define "shiny physics"? Apparently I haven't seen anything like the shiny physics in MEdge. Have you (name one game, please)? And that's why I'm calling it a step forward, which really upsets you. Like I've said, your argument against PhysX in MEdge is typical ATI fanboy stuff.

And again, you want PhysX to be part of the gameplay, which no sane developer would do as of today. Too bad you're under the impression that the PhysX in MEdge is the final stage for PhysX.



Nope, you're the type that's actually against PhysX alone, so you're special. You said those hardware-accelerated effects that PhysX does can be done on the CPU, right? Like people are being ripped off for running PhysX on the GPU.
 


I never said they didn't.

But as far as the hardware is concerned, that's added value. Same with DX10.1 on ATI: you can live without it, but it's good to have.
 


But that's the point, you don't read and you make up your own comparison and you call people fanbois when they don't agree.
Price was mentioned by jberry three times, even before you replied, so apparently it's you who didn't read on both counts: the original comparison, and that he was talking about price ranges. So if the factory-OC'ed HD4850 is faster than the GTX+ and less money, why wouldn't you include it? Hmmm, I wonder? 😗



Really, you are naive if you think drivers play no role, especially with a new architecture. It helped the G80, the HD2/3K, the GTX 2xx series and the HD4Ks. That you would stick to old reviews from the summer, eschewing the latest reviews with the latest games, shows you have to go out of your way to support your theory and point to reviews that are out of date and no longer relevant.



Any phrase you had that started with 'typical...', especially equating not liking the PhysX implementation in MEdge with being a fanboi of the 'other camp', is typical fanboi, "you're either with us or against us" prattle.
Also, you think that saying someone doesn't care whether it's CPU or GPU, they just want game physics, is 'going ballistic', when it's you resorting to the bolding and capitalization in your responses. You sure do project a lot, don't you?



Guess you have a problem with the English language, which may explain a lot of your confusion; 'the be-all and end-all' means it's the quintessential, latest and greatest, etc. Check their PR/marketing, and it's definitely marketed as such. What did you think the split screen was for in the video, and why do you think it only showed the GPU-PhysX-on and -off modes, not the other options?




Now what's your point here, other than acting ignorant? No one said the effect can't be recreated, but going forward into the non-IHV-specific OCL/DX11 compute shader future, it offers no advantage, and actually a disadvantage, to also make it for CUDA-PhysX versus making it for the agnostic standard. You can do it, but why bother, since it doubles the workload by splitting the implementations?



No, I leave that to you; your replies do the job for me with your strawman redirections and your constant return to using the term fanboi as a reply rather than anything of substance. You take it in that direction, not anyone else, as if a personal attack or name-calling added any weight to your argument and didn't make you look like that which you accuse others of being.
 
Guess you have a problem with the English language, which may explain a lot of your confusion; 'the be-all and end-all' means it's the quintessential, latest and greatest, etc. Check their PR/marketing, and it's definitely marketed as such. What did you think the split screen was for in the video, and why do you think it only showed the GPU-PhysX-on and -off modes, not the other options?


^ OUCH!!!!! LOL 😛

Sorry, but that made me laugh.

Does anyone have the "Test Your Might" song from Mortal Kombat that we can play in the background of this thread? 😀
 
This petty bickering/name-calling is doing nothing to further the debate and is getting nowhere near the point. The whole thread is a fanboi trap anyway. "Radeon vs GeForce", seriously.

A trap! Umm, sorry dude, but no, it was not a fanboi trap. It was a question, as stated way, way back, because I didn't know ATI was any good at all. That was coming from a fan of Nvidia (me). As for the rest, well, I don't care. I just care about benchmarks and specs and wanted to know what would run just about any newer game I plugged into the PC.
Thanks, everyone.
 
Well, the title is a fanboy breeding ground... It's like shouting "SEX" and then asking questions about types of condoms, you know?

You should have made a more general, undirected post if you wanted straight answers from a lot of people: "What card will best suit me?" or something like that... that's a question.

This thread could easily be a statement :)
 
I know I'm REALLY late to this thread... but in the more stressful games (Crysis, GTA 4, Fallout 3) the GTX 260 performs just a hair better...

IMO, if I were you, I'd stick with Nvidia.

Sorry for being late XD
 


Yeah, I'm sure OP is so torn over this decision he waited 4 months just for your input.
 


Beware! ATI is just a shell for AMD.

CrossFire is ATI's solution to the dual-card problem, and it's better than SLI. You can expect 100% improvements in frame rate with CrossFire. SLI is less than 100%, and it does not give you twice the performance you'd expect with two cards. CrossFire is not new, so it's a well-tried method and yields the better results. SLI... is a waste of money!

Having said that, AMD has very poor customer-care policies. They seem to have the attitude of "the customer will get what we give them!" They will eventually outstrip Nvidia by sheer production volume, which means "cheaper cards but just as good," but they don't care if they make mistakes along the way, and we the customers will have to take it or leave it.
 


On what planet does CrossFire scale 100%? Put simply... you are a moron and have no idea what you are blabbering on about at any point in that 'post'... Damn necromancers...
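(Since the last two posts disagree about "100% improvement", here is a small sketch, with made-up frame rates, of how multi-GPU scaling is normally computed from single- and dual-card benchmark results. 100% scaling would mean the dual-card fps is exactly double the single-card fps; published CrossFire and SLI numbers land below that, which is the point the reply above is making.)

```cpp
// Small sketch with made-up numbers: computing multi-GPU scaling from benchmark fps.
#include <cstdio>

int main() {
    struct Result { const char* game; double singleFps; double dualFps; };
    // Illustrative figures only, not actual benchmark data.
    const Result results[] = {
        {"Game A", 42.0, 78.0},
        {"Game B", 60.0, 95.0},
        {"Game C", 31.0, 61.0},
    };
    for (const Result& r : results) {
        double gainPct    = (r.dualFps / r.singleFps - 1.0) * 100.0; // 100% would be a perfect doubling
        double efficiency = r.dualFps / (2.0 * r.singleFps) * 100.0; // share of the ideal 2x ceiling
        std::printf("%s: +%.0f%% over one card (%.0f%% of ideal 2x scaling)\n",
                    r.game, gainPct, efficiency);
    }
}
```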