Check out the E4300. I think it may play well in this budget/performance space as well.
http://www.dailytech.com/article.aspx?newsid=3372
Do they always have to OC? I'm sick of seeing every single Core 2 Duo chip OCed. Yes, I understand that those reviews were done for enthusiasts and that the Core 2 Duo OCs like mad... but most people do not know how to OC. 🙁
Did you get banned? That's a shame. Good to see you're still spreading your usual BS.
Yeah, and even though I am a total Intel fanboy, it is biased towards Intel. They don't mess with the results themselves, simply the way they display them. While it may look like the 3800+ is getting totally PWNED in the benches (which it basically is), it's not as intense as it may seem. The graph axes sometimes start well above zero, so the visual difference looks bigger than it really is.
If you look at the actual numbers, the difference isn't that big, but presented graphically it seems much larger. It could be argued that this is done to save space, but not all of the graphs do it, only the ones with narrow margins, which in turn could be argued is just showing the difference in more detail. Still, it seems a little one-sided, though I will add that AMD is indeed getting its ass kicked.
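To make the axis trick concrete, here's a minimal sketch (hypothetical scores, matplotlib assumed) of how the same two bars look with a zero baseline versus a truncated one:

```python
import matplotlib.pyplot as plt

cpus = ["X2 3800+", "E6300"]
scores = [100, 110]  # made-up benchmark scores, ~10% apart

fig, (ax_full, ax_cut) = plt.subplots(1, 2, figsize=(8, 3))

# Left: axis starts at zero, so the gap looks modest.
ax_full.bar(cpus, scores)
ax_full.set_ylim(0, 120)
ax_full.set_title("Axis from zero")

# Right: same data, but the axis starts near the data,
# so the gap looks enormous.
ax_cut.bar(cpus, scores)
ax_cut.set_ylim(95, 112)
ax_cut.set_title("Truncated axis")

plt.tight_layout()
plt.show()
```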
Once again the X2 3800+ is smoked by the E6300, which is 33% faster in the Super Pi test. The E6300 should really blitz through applications which are FPU-dependent.
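Since Super Pi reports completion time (lower is better), "33% faster" works out as a ratio of times. A quick sketch with made-up numbers (the actual times aren't quoted here):

```python
# Hypothetical Super Pi 1M times, in seconds (lower is better).
t_x2_3800 = 40.0  # made up
t_e6300 = 30.0    # made up

# "X% faster" as a ratio of times: 40 / 30 = 1.33, i.e. ~33% faster.
percent_faster = (t_x2_3800 / t_e6300 - 1) * 100
print(f"E6300 finishes {percent_faster:.0f}% faster")
```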
I am not sure I trust this review. On the games, they used F.E.A.R. and Doom 3 @ 800x600 (ugggghhhhh) on Medium settings using an ATI X1800 XT.
What can the processors do on High settings, at 1024x768 or 1280x1024? Since a lot of people are going with flat screens now, those are more typical resolutions. Does it change the outcome, and is that why they don't show it?
I have 3 computers in my house with a mixture of AMD and Intel as well as ATI and NVIDIA. I look at price/performance and take into account the native resolution of the flat screens in my house before buying.
Just my 2 cents
I like how Mrs. D's remarks are:
whole lotta cache
and:
65nm
as the reasons Core performs so well. NEVER MIND that this review is talking about the E6300, which has only two (2) MB of total L2 cache, the same as the Pentium D series and the dual-core Athlon FX series. I also don't see how the die shrink comes into play here. If she is trying to infer that a die shrink allows for higher clock speeds, this is the POOREST example to use for that argument. The E6300 runs at 1.83 GHz, which was easily attainable on Intel's 90 nm Dothan core. The die shrink itself had no hand in making the E6300 the monster it is; the design had everything to do with it. Too bad an ignoramus like Mrs. D will never understand this concept.