Two processors?

bikermicefrmars (Oct 13, 2009)
E6300 45nm and E5300 45nm

Now let's say both processors achieve 3.5 GHz (overclocked):

The E5300's FSB will be 269 x 4 = 1076 MHz

The E6300's FSB will be 333 x 4 = 1332 MHz

As both will be at 3.5 GHz, one has a higher FSB while the other has a higher multiplier. How much gaming performance difference will there be between the two? Which is a better buy if one intends to OC to 3.5 GHz and play games at 1280x720, coupled with a nice and stable overall rig including a GTS 250 and 4 GB of RAM?
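(For reference, here's a minimal sketch of how those FSB figures fall out. I'm assuming stock multipliers of 13x for the E5300 and 10.5x for the E6300, which is what the numbers above imply.)

```python
# Core clock = FSB x multiplier, so the FSB needed for a target clock is
# target / multiplier. The "effective" FSB is quad-pumped, i.e. FSB x 4.
TARGET_MHZ = 3500

cpus = {
    "E5300": 13.0,   # assumed stock multiplier
    "E6300": 10.5,   # assumed stock multiplier
}

for name, multiplier in cpus.items():
    fsb = round(TARGET_MHZ / multiplier)   # 269 and 333
    print(f"{name}: FSB {fsb} MHz (effective {fsb * 4} MHz)")
```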
 
Well, the E5300 has 2MB of L2 cache, while the E8400 has 6MB of L2 cache, so there is a bit of a difference in performance. This is probably the best example I can show you, since it uses the same resolution as yours as well as similar specs.

http://www.tomshardware.com/reviews/cache-size-matter,1709-5.html

As you can see, the 4MB cache does a much better job than the 2MB cache. The difference isn't huge, but it is still noticeable. However, the 2MB L2 cache E5300 is approx. $67, while the 6MB L2 cache E8400 is $168. If you think a 10-15% performance difference is worth that $100, then I would say go for the E8400.

Although, if you're gonna pay $168 for such an old CPU, I would tell you to get the new AMD Phenom II X2 555 for $100 and a nice unlocking/OCing board for about $70-$80, which is almost the same price for much newer/better tech. If you're lucky, like 70% of the people who try to unlock, you'll get a fully working tri/quad-core that can OC anywhere from 3.6 GHz to 4.0 GHz+. Even though this CPU only has 2MB of L2 cache, it has 6MB of L3 cache, so it evens out in cache performance. BUT that's only if you were thinking about spending that much.

I would still recommend the E5300 and just put that $100 toward a 9800GT or an HD4850.
 

Exactly. It was Intel's decision to set the i7's stock clock lower; it doesn't mean AMD overclocks worse. If they all reach the same clock when OC'ed, it doesn't matter by how much they're OC'ed.
 
Honestly, I can't really tell the difference anywhere past 30 FPS. Movies run smoothly at 24 FPS, so that's probably the minimum I would go for in gaming. Maxing out Crysis left me at 20-30 FPS, so most of the time the gameplay was smooth. There were times when there were too many explosions and my FPS would drop to like 5-10, but it would only last for a second or two.

Again, at your resolution, I expect any of the GPUs mentioned before to max out games at anywhere from 60-100+ FPS.
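(One way to see why the same FPS gap matters less at the high end is to look at per-frame time instead of the rate; a quick sketch:)

```python
# Frame time is the inverse of frame rate: 1000 / FPS milliseconds per frame.
# Going 30 -> 60 FPS shaves ~16.7 ms off every frame, while 100 -> 200 FPS
# shaves only ~5 ms, which is why high-end differences are harder to feel.
for fps in (24, 30, 60, 100, 200):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
```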
 
But the HD4650 is a low-end card; it won't be able to do above 30 FPS even at 1024x768. I think Tom's Hardware may have misinterpreted the results when they wrote that it delivers great gameplay at 1280x1024 and 1680x1050 with lowered detail.

What say you? Can Tom's Hardware misinterpret?
 
Well, if you don't feel like it's up to the task, a stronger GPU is always the better buy. Though I'd have to say that Tom's recommendations are on par. My GF's 9800GT 512MB does max most games at 1680x1050, and an HD4650 should have about half of that power. Your resolution is fairly low, so I still expect the 4650 to max out almost all games for you. It would probably be best if you make a thread in the GPU or ATI sections and ask about this.
 


I don't think you understood what I meant...

I was talking about OC headroom and how stock clocks matter (to the average user, not us).

When people look up benchmarks for products, they are looking at the stock clocks. This means that for most people, their 'perceived' headroom is the amount of performance they can gain over the stock clocks (i.e., over those benches they are looking at).

TBH I'm 100% with you that CPUs should be tested clock for clock (or even better, with the clocks matched so they both have the same OC headroom) and that OC headroom should be judged simply off the clock rate. However, marketing schemes make this impossible for the average benchmark/review.

You don't need to educate me on CPU and GPU performance.

The CPU is very much important for gaming (or was), but CPUs today are so powerful (overkill) compared to GPUs that they make little difference, which you pointed out. However, once you get a stronger GPU in there, you'll need a faster CPU to keep up.

Yes, you gain no FPS between 2 GHz and 4 GHz with a 4890. This is because the 4890 only needs 2 GHz of your CPU to run at its full capacity (obviously). Now throw a 5870 or a second 4890 in there. Do you still believe you will only need 2 GHz to run your GPU(s) at full capacity? The stronger the GPUs, the more CPU power you'll need to back them up. Is this not obvious?
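(A crude way to picture this; the FPS caps below are made-up numbers purely for illustration:)

```python
# Toy bottleneck model: the frame rate you see is capped by whichever of the
# CPU or GPU finishes its share of the per-frame work last.
def fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

# One 4890: the GPU is the wall, so a faster CPU changes nothing.
print(fps(cpu_fps_cap=120, gpu_fps_cap=80))    # 80, GPU-bound
# Add a second GPU: now the same CPU becomes the bottleneck.
print(fps(cpu_fps_cap=120, gpu_fps_cap=160))   # 120, CPU-bound
```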

Yes, the i7 and i5 are overkill for nearly all GPU setups; they should be considered more as 'future-proofing' for your upcoming GPU setups (or for if you need an extreme GPU setup). And the power/heat saved from being able to clock lower is a plus as well.

I like AMD and I like Intel. I have owned the best of both worlds (from an Athlon X2 4400+ 1MB socket 939 to my current Yorkfield Q9550; yeah, neither is the best anymore, but they were among the best of their time).

Btw, Tom's only suggests the i5 because the i7's HT is no better (for gaming, yet).

AMD isn't just as good, it's good enough. There is a BIG difference.
 
I can tell the difference all the way up into the hundreds of FPS. Of course, the difference has to be at least 10% or more.

Not all games will show much of a difference, but a good example is CS:S. I play that game at 240-300 (max) FPS. I went on my friend's computer, which had it running around 150, and I could definitely tell the difference. This is mostly due to the fact that I've played CS for 6 years (and over 6 months on my current setup), so I am very used to the 'feel' of my FPS.
 

I just can't tell the difference regardless of what game I play. I've played CS 1.6 for 8 years now and it still doesn't look any different at 60 FPS (vertical sync) vs 100 (max) FPS. Same for any of the Source games (L4D1/2, CS:S, etc.): 60 FPS vs 100-200 FPS shows no difference. The only game where I can really tell the difference is Crysis, but that's a different matter.

I do get what you're saying about the multi-GPU setup, but that's just irrelevant to the OP's situation. If you already have a single GPU giving you 60+ FPS, there would be no noticeable difference with a second GPU, except for benchmark scores, especially at 1024x768. So in the end, the CPU still has only a small impact in the OP's case, and he could just get any of the CPUs he listed.

All I know is that even if the CPU is the bottleneck, as long as I'm getting 60+ FPS there would be no need to upgrade it until I'm seeing 30-ish FPS performance. If you have money to throw around, then upgrade to your heart's desire, but it looks like the OP doesn't even have high-end stuff, so I figured it'd be better to save money and improve his overall system performance instead of pairing a killer CPU with a weak GPU.

I also don't get that 'feel' of your FPS, but it might be the same thing as when people say AMD+ATI gives them a 'smoother' experience than Intel+Nvidia. My Ph II 955 (4.0 GHz) + 4890 Toxic (close to a GTX285) looks the same as my GF's Q6600 (3.6 GHz) + 9800GT (stock), even though my specs are technically higher. To each his own, eh?
 



What are you smoking?


First off, this is dependent upon who has the best hardware during that time frame...

And right now it's more like Intel with ATI that takes the win on best experience, considering that Intel has a giant margin over AMD in performance. And ATI has the world's fastest graphics card at a fraction of the cost of Nvidia's best card. You need to keep up with the times, man... Not saying AMD is bad or anything (considering I have a Phenom II in my computer), but Intel just has better processors, even though you pay more for them...
 

Yep, but honestly, once you get into the $150-200+ CPUs, they're all gonna look the same if you test them using the same GPU, as long as the GPU is pretty good. I have tried both my 4890 and my GF's 9800GT in my rig, and games look just about the same. I'd have to be standing still to notice texture differences, and even then it's very minimal.

The best combo is whatever makes you happy. 😀
 



I agree... it's just that no one can say this company with that company will always be best, as you know things change. Times change, hardware becomes more advanced, and the two major companies, Intel and AMD, are in a giant competition. You do have to go up into the $300+ range to see a difference between Intel and AMD processors. But for the money, nobody can deny that ATI has the GPU market cornered for the time being!
 

i7s don't show any big improvement except in heavily threaded games, which are few in number. That's not to say that future games won't make use of its capabilities, but right now it's just way too powerful and under-utilized for 99% of the games out there. 10 days until Fermi comes out, and I'm really curious to see its performance-to-price ratio vs ATI.

EDIT: @OP By the way, Newegg has the E5300 for sale for another 46 hours. Use code "EMCYPZT27" to get the E5300 for $61.99 with free shipping.
 




I agree from a gaming aspect... however, rendering movies and using programs like Photoshop, SolidWorks, Inventor, Rhino, etc. is where the more expensive i7 excels. For gaming, it would be wiser to buy a Phenom for the price alone! And it is unlikely that future games will require much more processing power, as they are leaning more and more heavily on the GPU. GPUs are getting bigger, faster, and more complex every year. However, if the AMD and Intel roadmaps play out the way they want, the GPU and the processor will eventually be one unit! Then the only choice you will have is AMD or Intel, not a little bit of both!
 

I'm not too sure about this. Fermi can only operate as a single-GPU card, and it can't ever compete with the dual-GPU 5970 because of this. I'm sure Nvidia will drive prices for a GTX480 higher than a single 5870 though, especially since they claim it performs better. As for drivers, each company has its own strengths and missteps. ATI has been working hard to get all the 5000 series problems ironed out and has had a decent/stable set of drivers since the 9.x releases. Nvidia has always been solid, though they did have that overheating bug recently, and people sometimes complain that different games require different drivers to run stably.

I have high hopes for Bulldozer, as AMD will finally have a new architecture out. I just hope it's not merely an i7 killer, because Intel will definitely have something new and better by then, and it would be another "Phenom II vs i7" battle all over again.

@raidur Why would you need to see a Phenom II vs i7 review? It's obvious the i7 will win by a slight margin just because it can scale those GPUs better. Though if you really wanted to spend $800-$1000+ on graphics alone, I'm pretty sure most people wouldn't mind paying just a little bit more for an i7 system vs a Phenom II system. A single 5870 will already max out 99% of all the current games out there. Even if you CrossFired/tri-fired them, they'd have so much graphics computing power that you'd literally see no difference between a Phenom II and an i7, since your FPS will probably sit in the 150-300 range (assuming that's the max). The only benefit from an i7 system would be benchmark scores, and benchmark scores don't necessarily help you with real-world applications when you're already getting crazy FPS in a game.
 
Benchmarks and future-proofing. The only two reasons when it comes to gaming.

I do agree with you. Unless you play on a 30" monitor, a single 5870 is more than enough.

I believe that as GPUs get stronger, we'll see gaps between the Ph II and i7 like we do at low resolutions, not just a slight margin.
 

Well, the current high-end GPUs are already strong enough; what we need are games that will stress the system depending on what hardware is present. So, for example, a game could sense 4 cores vs 8 cores and add more physics/actions on the 8-core CPU, or it could see that there are 2/3 GPUs and process more complex shapes or add better lighting to make things look more realistic.
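(Just to sketch what that could look like in practice; the function name, tier cut-offs, and budgets below are made up for illustration, not from any real engine:)

```python
import os

# Hypothetical hardware-aware detail scaling: poll the core count at startup
# and pick a physics workload budget from it.
def pick_physics_budget() -> int:
    cores = os.cpu_count() or 1   # cpu_count() can return None
    if cores >= 8:
        return 4000   # extra debris, cloth, ragdolls
    if cores >= 4:
        return 1500
    return 500        # bare minimum on dual cores

print(f"{os.cpu_count()} cores -> physics object budget: {pick_physics_budget()}")
```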