Radeon HD 4850 Vs. GeForce GTS 250: Non-Reference Battle


spearhead

Distinguished
Apr 22, 2008
ASUS and Gigabyte both have great non-reference coolers. However, some players such as Sapphire have actually chosen a cooler that's cheaper than the reference design, along with cheaper MOSFETs and chokes, to save costs, and that is why their 4850 is much louder than any card with the reference design cooler. Sure, with the basic reference cooler the 4850 gets quite hot, but it is silent and it cools enough.
 

10e

Distinguished
Jul 5, 2006
I think this would have been a better comparison if the 4850 were also a 1GB edition. Comparing a 512MB to a 1GB is not very fair and puts the 512MB card at a disadvantage in certain games.
 

razzb3d

Distinguished
Apr 2, 2009
I don't agree with your results in World in Conflict using the Radeon 4850.

My Media Center PC has a Sapphire 4830 1GB installed, and it runs World in Conflict at Very High @ 1680x1050 with a 32 fps average.

How can a 4850 score less? Something is wrong here.

My media center:
- Intel C2D E8200 G0 @ 3.2 GHz OC,
- 4GB of DDR3-1600 CL9 Kingmax Mars,
- Sapphire 4830 1GB...
- Asus G35 Micro ATX board
 

cleeve

Illustrious
[citation][nom]10e[/nom]I think this would have been a better comparison if the 4850 were also a 1GB edition. Comparing a 512MB to a 1GB is not very fair and puts the 512MB card at a disadvantage in certain games.[/citation]

Well, you can compare anything on a price/performance basis, and these cards have similar MSRPs of ~$150 or so. So it's fair game.

The sad part is, the 4850 Matrix had a $30 mail-in rebate before it was pulled from Newegg. ASUS implied they might re-introduce the card at a lower MSRP but gave us nothing concrete.
 

cleeve

Illustrious
[citation][nom]razzb3d[/nom]I don't agree with your results in World in Conflict using the Radeon 4850.[/citation]

You can't 'disagree' with a fact. That's like saying you disagree with the gravitational constant. It's not up for debate.

You can ask questions and hypothesize about why there's a difference, though. If you look at the WiC numbers, it's obvious it's a very platform-limited bench, as resolution (and therefore the video card) has very little effect on the results. It also looks like the game favors the Nvidia card - perhaps because of its 1GB of RAM vs. the 4850's 512MB?
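
To illustrate what I mean by "platform-limited," here's a quick Python sketch - with made-up FPS figures, not our actual data - of the sanity check: if the average barely moves as resolution climbs, the video card isn't the bottleneck.

[code]
# Heuristic: if average FPS barely changes as resolution increases,
# the GPU isn't the bottleneck - the platform (CPU/memory) is.
# The numbers below are hypothetical placeholders, not the review's data.

def is_platform_limited(fps_by_resolution, tolerance=0.10):
    """True if FPS spreads less than `tolerance` (as a fraction) across resolutions."""
    values = list(fps_by_resolution.values())
    return (max(values) - min(values)) / max(values) < tolerance

wic_results = {
    "1280x1024": 34.0,
    "1680x1050": 33.0,
    "1920x1200": 32.0,
}

print(is_platform_limited(wic_results))  # True -> resolution barely matters, so the CPU dominates
[/code]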

Now, in a platform-limited bench, CPU clock speed comes in handy. Your CPU is running a full 500 MHz faster than our test bed's. Add to that, your 4830 has a full gig of RAM, compared to the 4850's 512MB in our test bed.

These factors might easily explain the 5 fps or so difference. Nothing to disagree with here, just the facts. :)
 
Guest
Sorry, I made a mistake. The author says "with a 40 FPS average frame rate .., the game is barely playable..."
Really? I guess regular TV and movies, which come in at 24-30 fps depending on the format, are unwatchable for the author.
 

cleeve

Illustrious
[citation][nom]ken22222[/nom]Sorry, I made a mistake. The author says "with a 40 FPS average frame rate .., the game is barely playable..." Really? I guess regular TV and movies, which come in at 24-30 fps depending on the format, are unwatchable for the author.[/citation]


No need to apologize Ken, you just need a little information to make things clear:

Television is shown at about 30 frames per second (actually 60 fields per second), and movies are shown at 24 frames per second. But there are two important differences between these media and games: 1. these frame rates never change, and 2. movies and television contain real-life motion blur, which smooths things out.

Games don't have realistic motion blur (although there is some simulated blurring going on in newer titles), but this isn't the real issue.

The real, important difference is that game frame rates FLUCTUATE. Unlike TV and movies, they are not locked to a set, concrete frame rate.

So a game that's giving you 40 frames per second on average is likely dropping to 15 frames per second when a lot of stuff is going on, like when a ton of enemies come out and start shooting, or when the AI has to think about things.
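
To put some made-up numbers on it (a sketch, not data from our bench), here's how a 40 fps average can hide those drops:

[code]
# Hypothetical per-second frame rates for a ten-second stretch of a shooter.
# Quiet scenes run fast; the firefight in the middle tanks the frame rate.
fps_samples = [55, 52, 50, 48, 16, 15, 17, 45, 50, 52]

average_fps = sum(fps_samples) / len(fps_samples)
minimum_fps = min(fps_samples)

print(f"Average: {average_fps:.0f} fps")  # 40 fps - looks fine on paper
print(f"Minimum: {minimum_fps} fps")      # 15 fps - where your aim falls apart
[/code]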

15 fps is pretty much unplayable in a 'twitch' game like a first person shooter, because that framerate drop is going to screw with your aim.

Now, some games out there - like RTS games or RPGs - don't require the user to target that accurately. So these games are generally more playable at lower framerates.

I hope that clears things up!


 
Guest
I just wanted to add to an earlier point:
Nvidia used to claim (I don't know if they still do) that a person would get an extra 10% performance, give or take, by using nForce + GeForce products together. They claimed this was due to some optimizations in the chipset. For the sake of knowing, it would be nice to see the comparisons done on both platforms, just to know IF they optimize their own platforms that way.

As to the processor, you did mention that on one test in particular the processor was bottlenecking. Also, I think the 9-series C2Qs are more popular these days; I use a Q9450, for example, OC'd to 3.66 GHz.

The OS I did not see as being too limiting, but it would still be nice to use 64-bit, as the cards leave different amounts of system RAM available under a 32-bit OS due to their different frame buffer sizes... The ATI card had a nice little advantage there.
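
For a rough sense of that (the reserved sizes here are assumptions, not measured values): a 32-bit OS has a 4GB address space, and the card's frame buffer gets mapped into it, so the bigger the frame buffer, the less of your 4GB of system RAM is actually usable.

[code]
# Back-of-the-envelope math for usable system RAM under a 32-bit OS.
# The 4GB ceiling is the 32-bit address-space limit; the other reserved
# amount (chipset, PCI devices, firmware) is an assumption for illustration.
ADDRESS_SPACE_GB = 4.0
OTHER_RESERVED_GB = 0.5  # assumed

def usable_ram_gb(installed_gb, frame_buffer_gb):
    """Roughly how much installed RAM a 32-bit OS can still address."""
    addressable = ADDRESS_SPACE_GB - OTHER_RESERVED_GB - frame_buffer_gb
    return min(installed_gb, addressable)

print(usable_ram_gb(4.0, 0.5))  # 512MB card -> about 3.0 GB usable
print(usable_ram_gb(4.0, 1.0))  # 1GB card   -> about 2.5 GB usable
[/code]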

Anyhow, great article all things considered.
 