PCI Express & CrossFire: Scaling Explored

Status
Not open for further replies.
The last page should have included 1920x1200 results, since there are a lot of people out there with screens capable of that resolution. But anyway, all in all a very good and informative article. I'm going to settle for a complete makeover when Core i7 becomes more available!
 
[citation][nom]V3NOM[/nom]Yeah, kinda interesting to see how things have changed with new mobos, but it doesn't really have any practical value, tbh.[/citation]

It's all about answering the question "Will a second card do the job?"

Lots of guys have midrange or better ATI graphics cards, and the question of "upgrade or replace" is constantly being asked.
 
@ arkadi
Yes, the X58 is out.
However, since it cannot be paired with a Core 2 CPU and runs DDR3 exclusively, you cannot directly compare the results.
In general, I would expect CrossFire on the X58 to scale similarly to the X38/X48, as they share the same PCIe configuration.
 
[citation][nom]outlw6669[/nom]Thanks for finally getting this review out![/citation]

It was planned for September but kept getting delayed due to tight deadlines on other articles. But when the economy finally went from a slow decline to a nosedive in November, we knew this article had to come out right away. More people are putting new systems on hold and looking for ways to keep their old ones up to current performance standards, and we care about upgraders just as much as system builders.
 
Good work!

Although I have an Athlon X2 system and will probably upgrade to an i7 920, it would have been better to include a cheap i7 as a reference.
 
This article shows that even under the best conditions, X48 vs. P45 is at most a 5% difference. Price-wise, this confirms my observation that the lower-priced P45 boards offer much better performance per dollar than their premium X48 counterparts.
 
I understand it means more testing, and you already had several months of delays, but it would have been nice to see 1920x1200 numbers. 24" monitors are now in the mainstream affordability range, with prices from $249 to $349.
 
I might be missing something, but it kinda looks like a Phenom 9950 paired with the 790FX SB750 would be comparable to the X48. But really, what am I missing? I can't find a direct comparison anywhere.
 
Sorry, a bit of an oversight on my part. The CPU charts cover it, of course, though the AMD board there uses the older SB600; the performance difference shouldn't be much.
 
[citation][nom]Roland00[/nom]I understand it is more testing, and you already had several months of delays but it would have been nice to see 1920x1200 numbers. 24" monitors are now in the mainstream affordability range with prices ranging from $249 to $349[/citation]


You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts.

Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.
 
I'm trying to figure something out after reading this article; maybe someone could help me understand. It seems that a SINGLE Radeon HD 4870 still has enough bandwidth in a PCIe 1.1 slot, and the differences in performance compared to PCIe 2.0 come from the chipset (P35 vs. P45 in a single-card configuration). Am I wrong?
 
It appears that the cards work fine with PCIe 1.1 at x16 width. When you reduce the width to x8, PCIe 1.1 doesn't appear to have enough bandwidth for some games. When you drop the width to x4, things get much worse.
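The per-lane math explains why x16 PCIe 1.1 holds up while x8 and x4 fall off. A quick sketch, using the public spec figures (PCIe 1.x runs at 2.5 GT/s per lane and PCIe 2.0 at 5.0 GT/s, both with 8b/10b encoding; these numbers come from the PCIe specs, not the article):

```python
# Back-of-the-envelope PCIe bandwidth: per-lane rate times lane count,
# minus the 8b/10b encoding overhead (8 data bits per 10 line bits).
def pcie_bandwidth_mb_s(gen, lanes):
    gt_per_s = {1: 2.5, 2: 5.0}[gen]           # giga-transfers/s per lane
    bytes_per_s = gt_per_s * 1e9 * 8 / 10 / 8  # 8b/10b, then bits -> bytes
    return bytes_per_s * lanes / 1e6           # MB/s, one direction

for gen, lanes in [(1, 16), (1, 8), (1, 4), (2, 16), (2, 8)]:
    print(f"PCIe {gen}.x x{lanes}: {pcie_bandwidth_mb_s(gen, lanes):.0f} MB/s")
```

Note that PCIe 2.0 at x8 lands on the same 4000 MB/s as PCIe 1.1 at x16, which matches the observation that the cards are fine at 1.1 x16 but start to starve at 1.1 x8.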
 
Crashman said:
You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts.

Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.
I know it's far more work doing another resolution on top of the three you are already doing. That's 33% more testing, which would translate into dozens more hours of work (and thus dozens more hours of delay on this article, as well as others, for an article you were planning for September). I understand and sympathize.

It is just my belief that 24" monitors are becoming more of a "sweet spot" in the market, with prices going down due to technology and overproduction at a time when demand isn't so hot (I mean, it's crazy that you can find good 19-inch monitors for $99 to $129 now, and Fry's had a 22-inch Samsung for $179 last week). It is now possible to get a 1200p or 1080p monitor for the mid $200s to $300s, when last year you were talking $500 to $600 for the same monitor.

And yes, I can guess where 1920x1200 will end up, but the problem is that we are seeing geometric growth after 1680x1050:
2.24x the pixels going from 1024x768 to 1680x1050.
2.32x the pixels going from 1680x1050 to 2560x1600.
1.31x the pixels going from 1680x1050 to 1920x1200.
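Those ratios are just the total pixel counts of each pair of resolutions divided against each other, which a few lines can verify:

```python
# Ratio of total pixels between pairs of tested resolutions.
def pixels(w, h):
    return w * h

steps = [((1024, 768), (1680, 1050)),
         ((1680, 1050), (2560, 1600)),
         ((1680, 1050), (1920, 1200))]
for (w1, h1), (w2, h2) in steps:
    ratio = pixels(w2, h2) / pixels(w1, h1)
    print(f"{w1}x{h1} -> {w2}x{h2}: {ratio:.2f}x the pixels")
```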

Yet this is the performance increase we see from adding a second CrossFire card:
3% High 1024x768 p45
19% High 1680x1050 p45
93% High 2560x1600 p45

Jumping from 19% to 93% is a big jump, yet you don't get that jump from 1024 to 1680.

----

If it is only a marginal 25 to 30% jump in speed, it may be better just to save up for a next-generation card instead of buying a second one. The 3850-to-4850 jump was much bigger than 30%, as was 8600 to 9600 (though the 8800 GT to GTX 260 wasn't that big of a jump).

----

In full disclosure, I don't even game at 1920x1200; I game at 1920x1080, using an HDTV as my monitor. Yet people like my younger brother are considering building a computer, and the decision between a 4850 512 MB and a 4850 1 GB makes a difference: while the two perform similarly in normal games, there is a real gap between them in CrossFire, because the 1 GB card has twice the memory capacity, and each card has to store its own full copy of the data since the memory buffers are not shared.

----

Thank you for the article though, I just had a small complaint but overall it was very helpful.
 
[citation][nom]Crashman[/nom]You're right! The problem is trying to test a whole bunch of different resolutions. 1920x1200 is almost right in the middle between 1680x1050 and 2560x1600, so hopefully most people can figure out "about" where that resolution would fall on the charts.Is it time to get rid of 1024x768? I'm in favor of ditching that resolution and picking a different one.[/citation]

Yep, it's easy to see where the 1920x1200 results would be. Thank you very much for a very useful article!
I am not so sure that dumping 1024x768 is a good idea. I myself have not used that resolution for years, but it's the standard VGA resolution, and many gamers with 4:3 screens use it when their system is not fast enough to run the game at higher settings.
But because we are moving to widescreen 16:10 and now even 16:9, it will soon be time to move entirely to widescreen resolutions. Then it would even be reasonable to drop other 4:3 resolutions like 1600x1200 and smaller.
The reason is that it's relatively easy to extrapolate 4:3 results from widescreen results.
I think that maybe even next year you could use only 16:10 or 16:9 results and give guidelines for those who cannot work out which "old" resolution is nearest to each widescreen resolution tested. The total pixel count is what counts anyway!
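The "nearest by total pixels" guideline described above is easy to mechanize. A minimal sketch; the resolution lists here are illustrative examples, not the article's actual test set:

```python
# Map each legacy 4:3/5:4 resolution to the widescreen resolution
# whose total pixel count is closest.
widescreen = [(1280, 800), (1680, 1050), (1920, 1200), (2560, 1600)]
legacy = [(1024, 768), (1280, 1024), (1600, 1200)]

def nearest(res, candidates):
    w, h = res
    return min(candidates, key=lambda c: abs(c[0] * c[1] - w * h))

for res in legacy:
    w2 = nearest(res, widescreen)
    print(f"{res[0]}x{res[1]} is closest to {w2[0]}x{w2[1]}")
```

So a reader with a 1600x1200 screen, for example, would read the 1680x1050 bars as the closest proxy.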
 