Intel's 1.86GHz Core 2 Duo delivers

Tip: If you p*ss off 90% of the forum users, and the mods, enough to get banned, and you make a new account... don't put the same lame location down... show some mentality. We already know that your head is in the toilet, so lose the underwater location.

PS. Were you and MMM Siamese twins, split at birth? I'm going to assume that you were conjoined at the head, and both of you lost some brain matter in the surgery.
 
Check out the E4300. I think it may play well in this budget/performance space as well.

http://www.dailytech.com/article.aspx?newsid=3372

Is the E4300 "official" or is it just an OEM part?

If it's official, it actually might be better than the E6300.

Starting out at the 800MHz FSB, an overclock to 2.4GHz would put the bus at 266/1066, so all the RAM, graphics cards, etc. would still be at stock speeds.
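For what it's worth, here's that math spelled out in a quick sketch; the 9x multiplier is my assumption, working back from 9 x 200MHz = 1.8GHz stock:

```python
# Back-of-the-envelope FSB overclock math for the E4300.
# The 9x multiplier is an assumption (9 x 200MHz = 1.8GHz stock).
MULTIPLIER = 9

def required_fsb(target_mhz):
    """Base clock (MHz) needed to hit a target core clock."""
    base = target_mhz / MULTIPLIER
    return base, base * 4  # quad-pumped "effective" FSB rate

base, effective = required_fsb(2400)
print(f"2.4GHz needs ~{base:.0f}MHz base clock ({effective:.0f}MHz quad-pumped)")
# -> ~267MHz base (~1067 quad-pumped), i.e. the stock 266/1066 point of the
#    E6000 series, so RAM and PCIe dividers can stay at stock settings.
```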

Let's say they price it at $139 (the E6300 is $183). OCed to 2.4 you get FX-62 performance for under $183!

Killer chip... leaves the X2 3800+ deader than dead if it turns out to be true...

Looks like that might actually be the case, and it looks to be targeted squarely at the 3600+. It also appears that they will be retail, and not just OEM. Most sites say they don't expect to see them before Q1 '07, but the way Intel has been pulling in its roadmaps, I wouldn't be surprised if we saw them by Xmas. Here are more links:

http://www.anandtech.com/weblog/default.aspx?bid=282
http://www.tweaktown.com/news/5993/index.html
http://www.nordichardware.com/news,4213.html
http://www.neoseeker.com/news/story/5949/
http://tw.giga-byte.com/Support/Motherboard/CPUSupport_List.aspx?ClassValue=Motherboard&ProductID=2314&ProductName=GA-965P-DS3
http://www.madshrimps.be/forums/showthread.php?s=&threadid=25662

Another view on the E4300's possible sweetness. 😛

http://www.overclockers.com/tips00999/

Could be the bargain of the century. 😀
 
Interesting review, but for the many existing AMD users, this comment is misguided IMO:

"Considering that both CPUs cost around the £135 price range, there is absolutely no reason why you should pick a 3800+ X2 over the E6300."

For me, I would have to sell my A8N-SLI Deluxe and RAM, which would probably go for way below their value, then buy one of these expensive mobos that all seem to have problems here and there anyway...

I figured moving to the X2 3800+ (and it'll be more like £105, hopefully) and better RAM would be about £210, cheaper than moving to the E6300. The difference in performance probably doesn't warrant that, especially as I would need to learn all about Intel and how to OC on an Intel platform.

Anyone agree with me or am I talking rubbish?
 
I am not sure I trust this review. On the games, they used FEAR and Doom 3 @ 800x600 (ugggghhhhh) on Medium settings using an ATI X1800 XT.

What can the processors do on High settings at 1024x768 or 1280x1024? Since a lot of people are going with flat screens now, those are more typical resolutions. Does it change the outcome, and is that why they don't show it?

I have 3 computers in my house, with a mixture of AMD and Intel as well as ATI and NVIDIA. I look at performance/price, and I take into account the native resolution of the flat screens in my house before buying.

Just my 2 cents
 
Funny. I seem to remember plenty of people whining about comparing K8 with a "product that doesn't even exist" when AnandTech's first benchies came out. Now that it's out, you're all bellowing about how comparing Core 2 Duo to 90nm (which is all AMD has right now) is, like, totally unfair, and the only fair course of action would be comparing it to AMD's 65nm... which doesn't exist. Double standards, anyone?

Get real.
 
They use low resolutions and medium quality so the test doesn't get bottlenecked at the GPU; otherwise the CPU would only be running at, say, 70% load because it's constantly waiting on the GPU. Running at lower resolution and quality shows the true potential of the CPU.
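A toy model of that argument, with made-up frame rates just to show the shape of it:

```python
# Toy bottleneck model: delivered FPS is capped by the slower stage.
# All frame rates below are invented for illustration, not measurements.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# High res/quality: the GPU caps both systems, hiding the CPU gap.
print(delivered_fps(cpu_fps=120, gpu_fps=60))  # 60
print(delivered_fps(cpu_fps=90,  gpu_fps=60))  # 60  (CPUs look identical)

# 800x600 Medium: the GPU cap lifts and the CPU difference shows.
print(delivered_fps(cpu_fps=120, gpu_fps=200))  # 120
print(delivered_fps(cpu_fps=90,  gpu_fps=200))  # 90  (gap now visible)
```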
 
"Considering that both CPUs cost around the £135 price range, there is absolutely no reason why you should pick a 3800+ X2 over the E6300."

I think this is incomplete. It should end with "Considering that both CPUs cost around £135, there is absolutely no reason why you should pick a 3800+ X2 over the E6300 if you're getting a new system" (which isn't the majority of us). I would love to upgrade to one of these babies, but the roughly $450 for the complete upgrade will make me wait.

Like a lot of us are saying: Intel wins the price/performance ratio hands down, BUT it's not like you only need the CPU; you need an expensive mobo and new RAM too.
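A rough sketch of that platform-cost point; every price below is invented just to illustrate the comparison:

```python
# Toy platform-cost comparison. All prices are invented for illustration.
def upgrade_cost(cpu: int, mobo: int = 0, ram: int = 0) -> int:
    return cpu + mobo + ram

# Existing AMD owner: drop a faster X2 into the current Socket 939 board.
amd_drop_in = upgrade_cost(cpu=150)
# Switching to Core 2: new CPU, new motherboard, and DDR2 as well.
intel_switch = upgrade_cost(cpu=183, mobo=150, ram=120)

print(f"AMD drop-in: ${amd_drop_in}, Intel switch: ${intel_switch}")
# The platform delta, not the CPU sticker price, is what an existing
# AMD owner actually pays to change camps.
```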
 
Do they always have to OC? I'm sick of seeing every single Core 2 Duo chip OCed. Yes, I understand that those reviews were done for enthusiasts and that the Core 2 Duo OCs like mad... but most people do not know how to OC. 🙁


http://www.xbitlabs.com/articles/cpu/display/core2duo-e6300_10.html



These reviews show an X2 3800+ versus an E6300, both stock and overclocked...

The E6300 beats the X2 3800+ both stock and overclocked... end of story.
 
Did you get banned? That's a shame. Good to see you're still spreading your usual BS.

That's pure fact, kiddo; do I need to post the processor specs, little man?
Are you back to your usual flaming of people posting facts that you can't seem to grasp?
And no, I didn't get banned; I keep getting an error about cookies and can't log in, so I made a new account, K nerd boy?

Wow, banned by a few cookies, huh? If we would've known that's all it took... :lol:
 
Yeah, and even though I am a total Intel fanboy, it is biased towards Intel. They don't mess with the results themselves, simply the way they display them. While it may look like the 3800+ is getting totally PWNED in the benches (which it basically is), it's not as intense as it may seem. The graph axes sometimes start well above zero, so the visual difference looks bigger. Like this:

[image: sandrawhet.gif]

If you look at it, the actual numerical difference isn't that big, but when it's presented graphically it seems much larger. While it could be argued this is done to save space, not all of the graphs do it, only the ones with narrow margins, which in turn could be argued is meant to show a higher-detailed difference. Still, it seems a little one-sided, though I will add that AMD is indeed getting its ass kicked.
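Here's the axis trick in numbers; the scores are made up, just to show how a truncated baseline stretches the bars:

```python
# How a non-zero chart baseline exaggerates the visual gap.
# Scores are invented for illustration, not taken from the review.
amd_score, intel_score = 4000, 4400
baseline = 3900  # the axis starts here instead of at 0

actual_ratio = intel_score / amd_score                            # 1.10x
visual_ratio = (intel_score - baseline) / (amd_score - baseline)  # 5.00x
print(f"actual: {actual_ratio:.2f}x, as drawn: {visual_ratio:.2f}x")
# A real 10% lead gets drawn as a bar five times as long.
```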
 

OMG, I was just gonna say that. By the looks of it, it seems like the Intel more than DOUBLES the performance here, which clearly isn't the case.
 
[image: superpi.gif]

Once again the 3800+ X2 is smoked by the E6300, which is 33% faster in the Super Pi test. The E6300 should really blitz through applications that are FPU-dependent.
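As an aside, "X% faster" in a timed run like Super Pi falls straight out of the completion times; the times below are placeholders, not the review's figures:

```python
# Super Pi reports completion time, so "faster" compares times, not scores.
# Times below are placeholders, not figures from the review.
def percent_faster(slow_time: float, fast_time: float) -> float:
    """Speedup of the faster chip, expressed as a percentage."""
    return (slow_time / fast_time - 1) * 100

print(f"{percent_faster(slow_time=40.0, fast_time=30.0):.0f}% faster")  # 33% faster
```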

You only ever read the pictures in a book; if you read the words with some comprehension skills, you might understand the pictures better...
 
I am not sure I trust this review. On the games, they used FEAR and Doom 3 @ 800x600 (ugggghhhhh) on Medium settings using an ATI X1800 XT. What can the processors do on High settings at 1024x768 or 1280x1024?

Ummm, at that resolution they are transferring more of the weight from the GPU to the CPU. Obviously any decent graphics card today can handle those games at a low resolution, so the major difference in performance at that resolution is strictly processor-based. That is exactly why they benchmarked at 800x600.
 
Yeah, and even though I am a total Intel fanboy, it is biased towards Intel... The graph axes sometimes start well above zero, so the visual difference looks bigger.

I'd have to agree that the graphs are very misleading... Personally, I think all graphs should start at 0; that's probably the "fairest" way of displaying performance.

I am not sure I trust this review. On the games, they used FEAR and Doom 3 @ 800x600 (ugggghhhhh) on Medium settings using an ATI X1800 XT. Does it change the outcome, and is that why they don't show it?

This has been on my mind as well, but think of it like this: obviously games won't take full advantage of Conroe even if you're running SLI or CrossFire, but future games sure will, especially with physics becoming as big as it is. I say Conroe will stay a good CPU choice for a while, while AMD's processors slowly fall behind... and that hurts, because I used to be an AMD fanboy. My proof? Look at benchmarks in games at low quality. Conroe spanks, usually 25-30% better than the FX-62, AND Conroe has lots of headroom. If anything, it should be interesting to see what unfolds.
 
And no, I didn't get banned; I keep getting an error about cookies and can't log in, so I made a new account, K nerd boy?

You don't need to worry yourself about that... you aren't a person... you're a bot. The Bytch doesn't know about cookies. Go figure. Better spend more time in the kitchen and less time thinking up stupid one-liners to post in here that you think make you look knowledgeable. Back to the toilet bowl, Bytch.
 
It was meant as a joke. :lol:
Forgot the smiley, sorry. :wink:

Yes, we do need them to stick around since they've "inspired" Intel, biting them squarely on the A$$.

Competition is a good thing. :)
 
That's pure fact, kiddo

Hmm. Are you mike's boyfriend? How come you're a guy and use the "Mrs" title?

lol, I don't know why, but that made me laugh, so I rated it.

Yes, some of the graphs are misleading (very badly done in some areas), but the end result is the same: the X2 family is outclassed in just about every single area by Core 2 Duo. Arguing about it will not change that fact; you just have to look at the plethora of sites showing benchmark after benchmark.
 
I like how Mrs. D's remarks are:

whole lotta cache

and:

65nm

as the reasons Core performs so well. NEVER MIND that this review is talking about the E6300, which has only two (2) MB of total L2 cache, the same as the Pentium D series and the Athlon 64 FX dual-core series. I also don't see how the die shrink comes into play here. If she is trying to make some inference about how a die shrink allows for higher clock speeds, this is the POOREST example to use for that argument. The E6300 runs at 1.86GHz, which was easily attainable even on Intel's 90nm Dothan core. The die shrink itself had no hand in making the E6300 the monster it is; the design had everything to do with it. Too bad an ignoramus like Mrs. D will never understand this concept. :wink:
 

Hehe.