Part 1: Building A Balanced Gaming PC

Great review, but why in hell did you use the built-in benchmark for Grand Theft Auto IV?! That benchmark shows me 50 FPS, but while playing I get an average of 30 FPS. I'm not the only one.
 
Can I do one of those Win7 commercials, but for Tom's Hardware?

This is exactly the sort of thing I've been wanting to see and asking for in comments.
 
A little surprised that you tested the ATI Radeon HD 4850 with 512 MB instead of the 4850 with 1 GB... Since all the other cards had 1 GB or more, the 4850 never had a prayer...
 
Definitely a great article here. Straight graphics card reviews are nice, but they really don't help users see the real picture, because 90% of the time the test system is not the same as the user's actual system.

This article really makes it easy to see where my system would fit in the bigger picture. From the looks of it, I'll be able to max out my computer with a 4890 (which is good to know because those cards have gotten cheap! Should save me a good chunk of cash; I was originally planning on a GTX 285 or HD 5850).
 
Great article, and more to come!

One area I'd recommend for examination: WHY do the dual-GPU cards prosper when given more CPU (i.e., the quad cores) than the single-GPU cards do?

- Is it when the game uses more than 2 cores?
- Is it the management of (internal) xfire/SLI?
- If not those, what?

Further on this, I'd like to see the same xfire/SLI solutions Tom's recommends in lieu of cards like the 260 tested in the same way, with the same CPUs. Do those also only prosper when given max CPU? If so, or if not, why?
 
Best article in ages. Congrats! While the CPU/GPU charts are nice and informative (and keep us up to date on performance numbers), this is far better.

What I got from this:

1. The era of the dual core is over. Get over it. So when "experts" tell you it's better to get a dual core for gaming, kindly point them to this article.

2. The GTX 295 becomes a monster when paired up with an i7. The 4870 X2 isn't too bad either.

3. The 5850/5870 should really be here. The Phenom II X4 and Athlon II X4 should really be here too (you can skip the X2s and X3s, as they are junk). Fingers crossed for an update.

4. A personal favourite of mine would be the 250 (ex-9800, ex-8800), just to see how bad a rebadged solution really is. I think the 4850 will have a little trouble keeping up with it.

Overclocking might change a few things though. And not the way we expect it to.
 
So the chart seems to mostly confirm the assertion that a modern (Intel) dual-core processor running above 3 GHz will generally max out gaming performance (except for ports, which makes sense). An $80 E6300 running at 3.5 GHz (FSB 1333) should mostly match a $280 i7 920 at stock.

So the money I shelled out for an i5 should have been spent on a 4890 then.

Sad to see the i5 not on the list, but then I imagine the performance would fall between the two quad cores.
 
This is the best review I've seen in a long time, because it takes a real-world scenario into account.

Most reviews only use the best hardware given to them (e.g. the fastest i7 CPU), which most people can't afford or aren't willing to pay the premium for (after all, for less than half the price, you can get a decent 360 or PS3 that has more games).

The review shows clearly that a cheap CPU can be a significant bottleneck. With the exception of the crappy 4850, ATI cards look to scale better on low-end CPUs. NV cards are best run with a top-end CPU, while ATI appears to run out of steam when given a very fast CPU.
 
Remarkably well done. The graphs were among the clearest I've ever seen here. That said, I'm not yet sure where to fit in some conflicting thoughts. It's a great point of reference, no question about it, but I think some conclusions must be made cautiously. I'm really looking forward to the AMD pairings, because it will really show more of the boundaries of the strategy "less on CPU, more on GPU" that is often taken when building a gaming rig.
I was a little disheartened to see how poorly my HD4850 does with these games, but since I have other information in my "index," I can offer a couple of pinches of salt: first, if you're willing to lower settings even a little, Tom's own articles on mainstream gaming show that even a HD4670 is competent for playing many current titles. Second, if you're still playing a favorite game that is a year or three old, lesser hardware does just fine. For example, the sorry showing of the HD4850 here did not suddenly drop my frame rates on Guild Wars at 1680x1050 below the monitor-pegged 60FPS. For what I play TODAY, an upgrade is not warranted.
Still, if a new game out there (please test Dragon Age: Origins!) needs a beefier card to play with high settings, this article would very clearly help me convince my wife that I need to spend $300 on a GPU to enjoy my game(s).
Finally, I realize it will be a much busier graph, but if possible it would be very useful to see one that has both AMD and Intel CPUs on it. The E6300 CPU and HD 4850 GPU appear to be [mostly] losers, and I suspect the AMD X2 will be the same, so those might be left off if it adds clarity.
And, like everyone else, I am looking forward to the update that includes the HD5000 series in it.
Suggestion for the GPU OC graphs: instead of a line, consider a ribbon, with the lower border the performance at stock, and the upper border the performance when overclocked.
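Something like this matplotlib sketch would do it (all FPS values below are made-up placeholders, not measured results):

```python
# Ribbon chart sketch: shade the band between stock and overclocked FPS.
# All numbers are hypothetical examples, not real benchmark data.
import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt

resolutions = ["1280x1024", "1680x1050", "1920x1200", "2560x1600"]
stock_fps = [92, 74, 61, 38]   # hypothetical stock results (lower border)
oc_fps = [101, 83, 70, 44]     # hypothetical overclocked results (upper border)

x = range(len(resolutions))
plt.fill_between(x, stock_fps, oc_fps, alpha=0.3, label="OC headroom")
plt.plot(x, stock_fps, marker="o", label="Stock")
plt.plot(x, oc_fps, marker="o", label="Overclocked")
plt.xticks(list(x), resolutions)
plt.ylabel("Average FPS")
plt.legend()
plt.savefig("ribbon.png")
```

The ribbon makes the overclocking headroom visible at a glance instead of needing a second line per card.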
Thanks again.
 
Great article. Really like the presentation of the data. Graphs make perfect sense. Looking forward to more like this - or more of these when you have time to update with the latest CPUs and GPUs.
 
Bravo on the graphic format for displaying results. This is the most effective method yet for displaying data about graphics card and CPU combinations. Tom's Hardware started out as the first site ever to give useful information about computer hardware performance.
These new charts show that the trick now is to fit the hardware to the resolution of the monitor. The default high-res monitor is now 1920x1080. There is no sense in getting hardware that delivers acceptable frame rates at 2560x1600 unless you have a monitor that supports it. The problem with these large monitors is that, except for the Gateway 30" model, none of them have built-in scaling, so the graphics card is forced to do the scaling, and this reduces performance.
Once again, Tom, I have followed your site for years, and this new graphical data representation is your best yet. Kudos. Dr John
 
[citation][nom]juliom[/nom]Why are factory overclocked Nvidia card being compared to regular ATI cards? Doesn't make sense.[/citation]

Why are fanboys voting me down? I just asked a question about something that's true. These Nvidia cards are factory overclocked.
 
[citation][nom]Article[/nom]But let’s be realistic, very few people spend $350 or more on a graphics card to game on a 19” LCD [/citation]

My parents' PC:
i7-920, P6T Deluxe, 12 GB OCZ DDR3-1600, 146 GB Cheetah (15K RPM), 1 TB storage drive, a 4890, and a puny 19" LCD; that type of configuration does exist :)
 
This ladies and gentlemen was what Tom's has been missing for quite some time. An article that really brings it. This article hit the sweet spot for me, well thought out with excellent presentation.

Keep up the good work Paul, looking forward to the next part.
 

Yes, the same graphics cards were tested in part 2.


They weren’t. Our test samples are factory overclocked, but as stated in parts 1 and 2, we downclocked them to reference clocks.


More and more true since part 1 was completed. I updated the pricing chart for part 2 and agree with you on the 4870 X2 for future parts. HD 4850 and Q9550 costs have gone up too.


Thanks Jay. The charts do get bunched with too many CPUs, especially the x-axis labels. Every CPU added is like adding 6 more GPUs to the bench time, which is already insane in a story like this. Unfortunately, we need to draw a line somewhere, balancing a wide snapshot of hardware against a manageable time-frame.


It's been the method used for testing this game in other Tom's reviews. I couldn't test them all, but as stated on the GTA IV page, I didn't see nearly that level of discrepancy when comparing the benchmark results to in-game play on various platforms. What are your system specs? Also, Fraps results from one run to the next weren't super consistent, and averaging multiple runs of this game would have been a nightmare given the number of combinations tested. I didn't test COD: WaW for a similar reason.
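The run-to-run inconsistency mentioned here is easy to quantify. A minimal sketch (all FPS numbers below are hypothetical, not from the article's data):

```python
# Quantify run-to-run variance of a benchmark: average several Fraps-style
# runs and report the spread. All FPS numbers are hypothetical examples.
from statistics import mean, stdev

runs = [31.2, 34.8, 29.5, 33.1, 30.9]  # avg FPS from five repeated runs

avg = mean(runs)
spread = stdev(runs)
rel = spread / avg * 100  # coefficient of variation, in percent

print(f"mean {avg:.1f} fps, stdev {spread:.1f} ({rel:.1f}% run-to-run)")
# If the relative spread is large (say, over 5%), a single run is not
# representative and several runs should be averaged.
```

With dozens of CPU/GPU combinations, multiplying every data point by three or five runs is exactly the bench-time explosion described in the reply above.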
 

Thank you, (and all the others), for the kind remarks! This first one took a huge amount of time, and it's encouraging for us to see the data was so appreciated.

I can't respond to all comments, but rest assured we are reading/re-reading each of them and will take your feedback into consideration as the series progresses!
 
THG is now the first review site I know of that has formally addressed what I have called "the equilibrium bottleneck theory". Way to go, guys! Special thanks to Mr Henningsen for writing this article. This is the kind of stuff I've been wanting to see for 8 years! I really, really hope to see more of this in the future. Although feeding a CPU an overkill graphics card in a CPU review, and feeding a graphics card an overkill CPU in a graphics card review, makes sense for the theoretical purposes of product reviews, a benchmark conducted in this manner is a much more practical guide for system builders, who don't have such skewed systems.
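The "equilibrium bottleneck" idea can be sketched as a toy model, assuming (as a simplification, not the article's own method) that the delivered frame rate is capped by whichever of the CPU or GPU is slower:

```python
# Toy "equilibrium bottleneck" model: the delivered frame rate is roughly
# capped by the slower of the CPU and GPU. All caps below are hypothetical.

def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Frame rate when the CPU can prepare cpu_cap frames/s and the GPU
    can render gpu_cap frames/s; the slower side sets the pace."""
    return min(cpu_cap, gpu_cap)

# A fast GPU is wasted on a slow CPU...
print(delivered_fps(cpu_cap=45, gpu_cap=120))  # -> 45, CPU-bound
# ...and vice versa: a fast CPU can't help a weak GPU.
print(delivered_fps(cpu_cap=160, gpu_cap=50))  # -> 50, GPU-bound
# A balanced pair wastes the least money on unused headroom.
print(delivered_fps(cpu_cap=75, gpu_cap=70))   # -> 70
```

The real charts are more nuanced (multi-GPU scaling, threading, resolution), but this is the basic reason a balanced build beats a skewed one at the same total price.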
 
I would like to see Quake 3 / Quake Live tests.

And everything from lower resolutions and "pro" configs (almost all settings at minimum) up to the unplayable raytraced version: http://graphics.cs.uni-sb.de/~sidapohl/egoshooter/.

Why?
1) Because it is the best FPS ever made and the most played, and now that it is free as Quake Live, even more so.
2) Because it is playable on everything from old Nvidia TNTs to modern computers, and hardcore gamers still play at 640x480 at 120 Hz on CRTs, capped at 120 FPS in the config for stable play (ask any pro cyber-athlete), with ultra-low details and special configs for playing better, even in fast network mode.
3) Because it is very sensitive to CPU and GPU advances.
4) It is multiplatform and can be tested on Linux, Mac OS X, and of course MS Windows (OpenGL).
5) It is played, and will be played, by every FPS gamer in the world, even children not yet in their parents' minds; it is a classic, and everybody knows it, at least as basic knowledge of FPS games.
6) It is a real standard that everybody can test and compare on their own computers.
7) You can use ioquake3, which is open source, to make a special test batch, and even an interface for uploading results, to save time making benchmarks.
8) It runs the gamut, from low resolutions to high resolutions and raytracing.
9) And it will last, because it is fun, it is a classic, and it is not yet possible to play the raytraced version on current computers.
 
Interesting how two people can see the same article and get two completely different perspectives from it. Someone else said this proved to them that the era of the dual core is over. I see exactly the opposite. My preferred E8400 + HIS 4850 (I have the IceQ4 version) compares quite nicely to these more expensive high-end CPUs and GPUs. And I know from experience that it will run Need for Speed: Shift at the highest settings at all but the highest resolutions on my 24" BenQ (at 1920x1200, some settings must be taken down to medium). I do have to qualify my opinion with the fact that I don't play Crysis.

I expect that when the overclocking results are compared, the dual core will close the gap even more and still prove it's a worthy consideration when budget is a factor. Bang for buck, the dual core is still the best option, but that may change when the new i5/i7 CPUs are added to the list (I suspect the 860 will topple the 920 for overall performance, making it better value).

And I'm tickled pink that TG finally used an E8400 rather than the more expensive E8500 (they perform identically). One of TG's better articles IMO. Can't wait to see the rest. 🙂
 
I was surprised at how the HD 4850 performed. I didn't know it was that weak compared to a GTX 260. I thought a 4850 was about 15% or more slower than a 4870, and a 4870 equal to or a little slower than the vanilla GTX 260.
 