AMD Ships Radeon 4850 Graphics Card

ATI's graphics cards are only good on paper; in practice they perform pretty badly in games. TBH, I am not a fanboy and I own both ATI and Nvidia cards. The problem is that making a video game is not cheap, and game developers need to get as much help as they can. Nvidia certainly donates a good amount of money to them, so they optimize their games for Nvidia in return. Some popular titles are also based on OpenGL, which Nvidia has championed. That is why most games play best on an Nvidia card.

ATI's lack of single-card performance also hurts its market share. TBH, most people don't have a dual-card setup; maybe 5% of the total gaming population does. That is why the majority of games can't take full advantage of dual cards. BTW, lots of games nowadays are really made for game consoles, so optimizing for a single-card solution is the priority for game developers. They won't waste time and money on 5% of the market.
 
nVidia does a good job of donating technology and providing better support to developers than AMD does; this is very true, and I can speak to it from personal knowledge. If you're arguing the role of game consoles, then consider the fact that it's an ATI chip in the Xbox 360, arguably the most popular console. So, logically, X360 games that translate to the PC should play better on ATI... right? No. You're partially right on the nVidia front: generally, the product better supported by the developer is the one likely to run better, and ATI support is generally treated as an afterthought, leaving it up to the driver developers to make sure everything works.

ATI's technology is fine, but its driver support is often what really strangles their performance, as evidenced by the fact that they release a driver only every few months; yet despite that, they remain competitive. To say they "perform pretty bad in games" is way off base. It's not even in the same state, let alone city, as the ballpark. Do they perform worse? Certainly. But in a gameplay situation, all things considered, when the bullets are flying, I'm not always noticing a considerable difference between my ancient X1950 XTX and my friend's 8800 GT, even though he's playing on a somewhat larger monitor with more eye candy turned on. I'm definitely not champing at the bit to make a new purchase now, now, now! (If offered one for free, however... :)
 
What everyone continues to ignore when comparing ATI and Nvidia graphics performance is that ATI GPUs have been designed for DX10 games running on 64-bit OSes going back to the 2900 XT, which, despite arriving 9 months behind the 8800, was still the first truly DX10-compatible card. Nvidia is still basing designs around DX9, with added DX10 support optimized for 32-bit OSes. Assassin's Creed, as mentioned earlier, gets a 40 FPS minimum and an average of 55+ at 1680x1050 in DX10 with all settings maxed and 16x AA on my socket 939 Toledo-core 4400+ X2 clocked to 3 GHz, with my 2900 XT at stock speeds. And I'm running 64-bit Vista Ultimate. If I drop the AA down to 8x, I get the same performance at 1920x1200. Take into account that the 3870 performs within about 10-20% of that, draws a fraction of the power, and only costs $150-170; nothing to really complain about.

There are two kinds of people who think ATI just looks good on paper: those still running a 32-bit OS with DX9 games, and those running 64-bit OSes with Intel CPUs. Yes, Intel is the CPU performance king, if you don't mind paying an extra $1000 for a 20-30% performance gain. So long as you keep that cutting-edge hardware running WinXP... because Intel falls short in 64-bit performance, as well as memory performance and multi-core-aware software; you know, just the supposed improvements that 64-bit dual/quad-core CPUs offer.

My old 939 rig runs both CPU cores equally in full-screen apps like games under 64-bit Vista. And yes, while 32-bit Vista does offer DX10 compatibility, when it has a system memory limit of 2 GB and the OS eats up 1-1.5 GB of that, it doesn't matter how fast your GPU is, especially if those spiffy 1-3 extra CPU cores aren't being utilised by the most demanding application and there are only 400-500 MB of RAM free.

Crysis, another Nvidia-backed game, runs perfectly fine on Ultra settings in DX10 at 1440x900: 30 FPS minimum, average of 45-50 on my system. Of course, it runs both CPU cores at 100% (with the game eating up at least 80% of that) as well as 2.4 GB of RAM.

ATI's main driver problem was back with the 2900, which had a lot to do with the fact that there was still almost NO 64-bit Vista driver support for most hardware at the time the card was released, and it wasn't intended nor marketed for XP compatibility and use. Nvidia, however, was plagued with driver problems when the 8800s came out; oh, and physical design flaws as well. I have yet to read a review of an Nvidia card that doesn't have problems running DX10 games on 64-bit OSes, and there is almost a guarantee of SLI not functioning on new cards with the release drivers in one OS config or another.

ATI does more than adequately, especially for the price, and more so when run on its intended platforms. And again, it doesn't matter how good your GPU is if your CPU is bottlenecking the system, or your RAM performance is pathetic, or there just isn't any RAM available to access.
 
[citation][nom]Barzenak[/nom]I have been a fan of ATI for a while now. Even if they don't always have the fastest fps I have never had one fail and I have enjoyed excellent picture quality. I used to be a devoted GeForce user but I think after 2 dead cards I am done with them no matter the fps race.[/citation]


I had a lot of GeForces as well. Not one ever failed on me; still, I would never go back to Nvidia in a thousand years. After laying eyes on the screen for the first time after my HD 3850 purchase, I knew what image quality is. I've had this confirmed by other people as well; they say ATI is all the way up there with Matrox. Sorry, Nvidia: high prices, low image quality, and having to buy an 800W PSU aren't going to get you a customer here.
 
[citation][nom]iocedmyself[/nom]I have yet to read a review of an nvidia card that doesn't have problems running DX10 games in 64bit OS's[/citation]

I don't think you've read many reviews, in that case. I'm running 64-bit Vista and I've played almost every DX10 game available with zero problems. Try again.
 
[citation][nom]semper-fios[/nom]I don't think you've read many reviews, in that case. I'm running 64-bit Vista and I've played almost every DX10 game available with zero problems. Try again.[/citation]
I am also running 64-bit Vista and I have played almost every popular DX10 game. I never had any problems with my GeForce 8800 GTS. However, I did have a few issues with my ATI 3870 X2.
 
[citation][nom]stoneeh[/nom]after laying my sight on the screen for the first time after my hd3850 purchase, i knew what image quality is.[/citation]
I have a very hard time imagining what that means. How was the quality better? Isn't the display the most critical component as far as image quality goes?
 
My best friend bought a spanking-hot Alienware dual-8800GTX hyperbeast: the fastest Intel quad-core at the time (3+ GHz), the PhysX card, and all the trimmings.

He downgraded the OS to WinXP and used DX9, because Crysis etc. wouldn't run at the max settings he was (rightfully) wanting on Vista and DX10. (It did in XP with DX9, and it looked better than DX10.)

Just to let you all know, there may be some truth in what iocedmyself said a few posts above about Intel and 64-bit, and about Nvidia and DX9.
 
Image quality has no relation to the speed of the GPU. I believe Matrox is still the king of image quality, but sadly, Nvidia has changed the "game" by convincing people that "faster is better".
 
Just some info: I work for a PC magazine, and I've got an ATI Radeon 4850 for review. The tests were 3DMark and 3DMark Vantage, and I also ran Crysis and World in Conflict. I have two machines for testing: one is a Sapphire Pure AMD FX790 motherboard with an AMD Athlon 6000+ X2 and 4 GB of Geil Ultra Plus RAM; the other is a Gigabyte EP35-DS3 motherboard with an Intel quad-core Q9300 @ 2.50 GHz and 4 GB as well. The results were a pretty big disappointment: Vantage put out P6076 as the final result on the Intel platform, and in the first two tests, where the GPU is benchmarked, I didn't get over 27 FPS. My own XFX NVIDIA 8800 GTS scored P5872 on the Intel platform. Tests were performed on Vista 64-bit, because unfortunately that's the only software where I can get the maximum out of the hardware.
The AMD platform was a whole different story. Vantage put out P6892, while NVIDIA showed only a small improvement over the Intel platform, with P5977 as the result.
Crysis is not a game optimized for anything, just like Far Cry wasn't, if you remember that. Crytek's engine doesn't scale on its own, because it's not based on software instructions through DirectX; as a matter of fact, it uses hardware resources directly, and that's why you can't get over 75-80 FPS in benchmark tests.

Crysis ran on the ATI at between 37 and 61 FPS, with an average of 51-52, at 1280x1024 with 8x AA and 16x AF; the same on both Intel and AMD. Nvidia was a true contender here because, like I said, the game isn't optimized at all, but it is built around NVIDIA. Nvidia put out an average of 58 on the Intel platform.

World in Conflict crashed on the ATI Radeon 4850, because I only had pre-release drivers from AMD-ATI, and they were really buggy.

The temperature is 79 degrees Celsius, and after overclocking the card by about 30%, the temperature goes up only 3 degrees 😀 It was a pretty nice experience for me, as a gamer and long-time hardware tester.

Stories about 55nm technology meaning less heat and better performance are a bit of crap, because the cooling on the Sapphire ATI 4850 is the same as on the Sapphire ATI 2600XT HDMI with GDDR4.
The card shipped with 512 MB of GDDR3, a core clock of 625 MHz, a shader clock of 825 MHz, and a memory clock of 1143 MHz... I'm expecting AMD to follow up with a 4870 with GDDR5, which I think will be a real competitor to NVIDIA's upcoming GTX 280...

Retail price will be $199, and it's coming to market in a week.
 