Radeon vs GeForce

Never compare shader counts, clock speeds, or memory speeds between cards that aren't from the same generation.


Didn't the two cards come out around the same time to compete with one another?
I've seen three benchmark tests on the two cards I listed above, and they are about equal depending on the game: in Unreal Tournament III the GTX+ won, while in Call of Duty 4 the 4850 won. But those were the updated (factory-overclocked) versions of the cards; the actual vanilla 4850 was left trailing at the low end of the benchmarks compared to the two cards I listed.
I don't know, I'll keep checking and comparing. I have a little more time before I pull the trigger and buy one.
 
Most of the reviews on the net came out around the time the 4850 launched. At that time a GTX (not the GTX+; it cost around $220) ran about the same price as a 4870 (around $200+), and the 4850 came in at $180+.

Most people just look at the conclusion/final-words section. Some of them only look at the graphs.

The 4800 series' success isn't about the cards completely dominating the GeForce in performance; they didn't. It's more about the price the 4800s are being offered at. Some people forget that.

 
Yeah, I did notice that I can get a 4870 at $200, so I may do that instead. I've read a lot of reviews, and it's a good card.
The IceQ4 was over $200, the GTX+ was under $200 at like $174, and the 4870 was $199, and it's supposed to be better than the other two I listed.
Can you overclock the 4870 and get actual improvements out of it?
 
At your target resolution, overclocking isn't really necessary, but of course an overclock will always yield some improvement over a stock configuration. The only caveat is that overclocking is not an exact science; your experience may vary from other users'.
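If you want a ballpark for what an overclock can buy you, here's a quick back-of-the-envelope sketch (the clocks below are example numbers I picked, not a promise for any particular card):

# Rough upper bound on overclocking gains: fps scales at best with the
# core-clock ratio; real games usually gain less because memory bandwidth
# and the CPU also limit performance.
stock_core_mhz = 750  # HD 4870 reference core clock
oc_core_mhz = 800     # example overclock, not guaranteed on every card

best_case_gain = oc_core_mhz / stock_core_mhz - 1
print(f"best-case fps gain: {best_case_gain:.1%}")  # ~6.7%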
 
So in general the 4870 is the best out there for $200?
All I'll be running is a 22-inch HD monitor on it. As of now I have no plans for CrossFire (can't anyway until I update my motherboard; mine is SLI-ready), so the best bang for my buck is an HD 4870?
 
I'm quite happy with my HD 4870. I play everything at 1920x1200. I can turn everything up in Far Cry 2 DX10 mode with 2xAA. 4xAA gets me under 30 frames, though...

Hardware is pretty expensive so I don't play with overclocks. I'd just buy a factory overclocked one.
 


That guy is just a fanboy; if you look at benchmarks, the HD4850 is almost always on top.

I had someone argue with me that the 9800GTX was faster because it was clocked at 675MHz while the HD4850 was only clocked at 625MHz. He didn't even know what a benchmark was...
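To see why comparing clocks across architectures is meaningless, here's a rough theoretical-throughput sketch using the published shader counts and shader clocks (the FLOPs-per-clock factors are the usual simplified marketing math, so treat the results as ballpark only):

# Theoretical shader throughput in GFLOPS = shaders * clock * FLOPs/clock.
# Note: the 9800GTX's shaders run at 1688MHz (675MHz is its core clock).
def gflops(shaders, clock_mhz, flops_per_clock):
    return shaders * clock_mhz * flops_per_clock / 1000

print(f"HD 4850:  {gflops(800, 625, 2):.0f} GFLOPS")   # ~1000
print(f"9800 GTX: {gflops(128, 1688, 3):.0f} GFLOPS")  # ~648

On paper the 4850 is over 50% ahead, yet in real games the two trade blows, which is exactly why you run benchmarks instead of reading spec sheets.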



lol, what a load of crap. PhysX is only used in a handful of games, and that 3D game thing is just a concept demonstration with no real support, not to mention expensive. That's no different from an ATI fanboy saying ATI is better just because of DX10.1 and the tessellation unit in the GPU. Sure, the features are great, but nobody uses them, so it doesn't matter.
 
If $200 is your budget, then I would definitely go with the 9800GTX over the 4870, just because of the price/performance ratio. But if you can squeeze out at least a few more dollars, then I would say get the GTX 260 192-shader model.


Now that is a true fan. I'm leaning towards the Sapphire Radeon HD 4870. It blew the GTX away on all the benchmarks I've seen, and for the little bit of extra money it's worth it to me. Also, the GTX 260 wasn't that much higher on the benchmarks than the 4870; the 4870 won a few and lost a few against that card, but the price difference between those two isn't worth it to me.

Here is the link if anyone wants to see all the benchmarks:
http://www.hardwaresecrets.com/article/589/1

What do you guys think?
 
Seriously... you can't compare a 9800 GTX+ with an HD 4870... the Radeon card has GDDR5, after all.
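For the curious, here's the back-of-the-envelope bandwidth math using the reference memory specs (GDDR5 moves four data transfers per clock versus GDDR3's two; board partner cards may clock differently):

# Memory bandwidth in GB/s = bus width in bytes * effective transfer rate.
def bandwidth_gbs(bus_bits, mega_transfers_per_s):
    return (bus_bits / 8) * mega_transfers_per_s / 1000

print(f"HD 4870  (256-bit GDDR5 @ 3600 MT/s): {bandwidth_gbs(256, 3600):.1f} GB/s")  # 115.2
print(f"9800GTX+ (256-bit GDDR3 @ 2200 MT/s): {bandwidth_gbs(256, 2200):.1f} GB/s")  # 70.4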

A friend of mine has the Sapphire HD 4870. The cooling solution is much better than on my Asus card (about 50 degrees vs. 80, and quieter as well).
It can also be overclocked higher than mine (850 vs. 790 MHz on the GPU and 1200 vs. 1100 MHz on the memory), but I don't care about that.

You have a 22" LCD, so a 1GB 4870 isn't really necessary.

The GTX 260 (216 shaders) is too expensive.

So yeah... the 4870 would be the best buy.

 
Yeah, I was leaning more and more toward the Sapphire Radeon HD 4870. I like what I see.
As for overclocking, it's just an option if I want it, like what I did to my 8600GT just to get it performing better until I bought a new card. And hell, for $200 it sounds like a good buy.
Thanks, everyone, for everything.
 


Dude, Havok isn't more popular; no games really use it unless they have to because the user only has an ATI card. It's a lot slower because it depends on the CPU, not the GPU. Look at the second level of Mirror's Edge, for instance: everyone with an ATI card gets less than 15 fps (some get 1-2), while those with an Nvidia card get the regular 30-60. The same thing holds true for Cryostasis. Notice that both of these games are new; more and more new games utilize it. Stereo is the future, and Nvidia holds the key. It is so immersive and impressive it will literally make you feel like you are inside the game. I use it on an LCD with cardboard glasses and already feel like that, not to mention what it would be like on a 120 Hz LCD with the $100 glasses. I mean, you are looking at a world that isn't flat, and gameplay is just that much better.
 


I wouldn't use the words "almost always" here.

You count.

Sites that pitted a stock 4850 against a stock GTX+:

http://www.guru3d.com/article/radeon-4850-and-4870-review-force3d/ = the GTX+ is faster here

Mass Effect: GTX+
CoD4: 4850
Frontlines: GTX+
Crysis: GTX+
WiC: GTX+
Vantage: GTX+
STALKER: GTX+
FEAR: 4850
GRID: 4850
GRAW2: GTX+ (not counted; this game is so Nvidia-biased that the GTX+ even beats a 4870)


http://www.anandtech.com/video/showdoc.aspx?i=3341 = the GTX+ won a split decision, by a hair.

Crysis: GTX+
CoD4: 4850
Quake Wars: 4850
Assassin's Creed: GTX+
The Witcher: GTX+
BioShock: 4850
Oblivion: GTX+


http://www.bit-tech.net/hardware/2008/07/11/summer-2008-graphics-performance-roundup/1

Crysis DX10 (4xAA / 8xAF): 4850
CoD4: 4850
WiC: GTX+
HL2: tie (the GTX+ has a lower min fps while the 4850 has a higher max fps; the reviewer failed to notice that and gave it to the GTX+)
Assassin's Creed: GTX+
GRID: 4850


GTX+ = 12
4850 = 9
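If anyone wants to double-check my count, here's the tally as a quick script (results in the same order as the lists above, with GRAW2 left out as noted):

from collections import Counter

# Per-site winners, in the same order as the lists above (GRAW2 excluded).
results = {
    "guru3d":    ["gtx+", "4850", "gtx+", "gtx+", "gtx+", "gtx+", "gtx+", "4850", "4850"],
    "anandtech": ["gtx+", "4850", "4850", "gtx+", "gtx+", "4850", "gtx+"],
    "bit-tech":  ["4850", "4850", "gtx+", "tie", "gtx+", "4850"],
}

totals = Counter(winner for site in results.values() for winner in site)
print(totals)  # Counter({'gtx+': 12, '4850': 9, 'tie': 1})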


 


Notice how I was talking about the regular 9800GTX, without the plus.

The 9800GTX+ is much closer in performance, but it's also generally more expensive.
 


You forgot to confuse him even further by mentioning that there are sometimes 15 varieties of a single card, like the 9600 GSO, which comes in 384MB, 512MB, and 1GB versions, on top of some of them having wider memory buses than others for bandwidth differences.
 
"
what'dya mean easier to overclock? both camps has already OC support from their drivers. both drivers will let you use a slider to increase or to decrease mem/core clock. i think i didnt even burn 5 calories doing that."

I mena its less risky for nvidia, better throdding so you wont create lasting artifacts.
 


Minor problem with that.

The 3 top-rated 4850s on Newegg right now, all with aftermarket coolers, are $159, $179, and $159 respectively. The 3 top-rated 9800GTX+ cards are all over $170. If you find them at the same price, they are fairly even, but the 4850 is consistently cheaper, making it the better choice.
 


That's wrong. You make it sound like Havok is a desperate countermeasure for ATI, which is typical fanboy spat. Well, it isn't. Didn't you know that Havok has been around for, like, forever?

Uh, duh, GPU-accelerated PhysX only went primetime last year.
 


What? You mean throttling?

My 4850 sits at a 500MHz core clock at idle.

Once your overclocking goes bad and a transistor blows up, that's the only time you'll see lasting artifacts. That goes for Nvidia and ATI alike.

As long as none of the chips are damaged, your lasting-artifacts boohoo won't appear.

I hope you're not directly comparing the 738MHz core of the GTX+ with the 625MHz core of a 4850, because if that's what you're doing, this conversation is over.

 


lol, Havok was around a while before PhysX on the GPU. PhysX calculations also cause a performance hit, since they take GPU power away from working on the graphics. Also, there are still quite a few people with earlier Nvidia cards that don't support PhysX. Not everyone upgrades every 1-2 years.

Man, you really are clueless.



I haven't heard any complaints from people with ATI cards, other than about the HD4870X2, which apparently has issues. There are plenty of new games coming out that don't use PhysX at all.



Not happening anytime soon.



Uh, right.

Sure is a desperate fanboy around here.
 



Yep, I know that. Let me quote myself for you 😀 ...





Problem is, my tally post was useless. Turbo was talking about the normal GTX; I was talking about the GTX+. I didn't notice that itsy-bitsy "+".

 
You just proved my point. PhysX is newer, hence the newer technology of the future, while ATI is stuck in the past; stereo 3D is the future as well. You may say it's niche and no one uses it, but that's only because people don't know about it; it hasn't really been advertised yet. Sony is already working on it and advertising it for the PS3, doing part of Nvidia's job, so it'll take off real soon. And sometimes the best things are niche anyway when it comes to technology. And ATI sometimes lets you overclock too far before warning you, unlike Nvidia, which is a lot safer. That's what I meant: it's safer in general.