Q6600 isn't real quad?



In all fairness, AMD has ALWAYS been the cheaper processor, even during the P4/A64 days when AMD did, in fact, have the speed advantage.
 


If that was the case... I'd have an X2 4400+ ($490 the day it came out) rather than an E4400 ($126 the day it came out).

So... I disagree that AMD has ALWAYS been cheaper. Funny how prices are governed by how fast they were back in the day. AMD would have made more if they sold them in the $200 range, and would have more money today, or at least they would have had my $200.
 



I agree that AMD has always been competitive on price/performance. That's why I owned an Athlon XP 2800+, an Athlon 64 3200+, and currently own an S939 (probably more to do with already having an S939 mobo).

AMD is still competitive with Intel, but only in the budget market. Great for consumers looking for less expensive machines (the majority of people), but bad for AMD stakeholders because they're not making money. Also bad for hardware enthusiasts, as AMD has failed to do what it always did in the past, which was to best Intel. I remember when AMD came out with the first FX processors and how they kicked total Intel butt.
 



I believe you can. You go back to the original post and do the full edit (not quick edit) and then check the delete post button. I'm not certain you can do it if other people have replied, but I think you can.
 


But then he would delete my favorite Kassler post:
If you would try an AMD and feel the smoothness you will understand what I mean.
 
Yes, it's a real quad core. How could it be fake? There are 4 cores, and they're all in 1 processor chip. What's interesting is, Intel took 2 Core 2 Duo 2.4GHz dies and stuck them together to make 1 chip, instead of 4 Pentium 4s. It is better than 4 Pentium 4s because of its so-called "Core 2" architecture.
 


First part I get. Second part what?

Core 2 is based off of Core 1 (hence the 2 part), which is based off of the Pentium M, which was based off of the Pentium III Coppermine, and that goes all the way back to the Pentium Pro.
 
Shhh... Jimmy. Next thing you'll have the AMD fanboys in here complaining about Intel using "10 year-old technology" in their new chips. The only sad thing about that statement is that they don't seem to realize that this so-called 10-year-old technology is besting the latest efforts from AMD. Don't get me wrong... my last computer was an Athlon 64 3800+. Before that, I had a P4, and before that, I had an Athlon 1900+. I don't have any special dislike for AMD... unlike those that harbor hatred for Intel.

If you have any sort of emotional bias one way or the other... there's no way you can present any sort of rational argument for or against either company.
 


The latest computer I built was an E7200 with an MSI 7150 chipset-based motherboard... An AMD Athlon 4850e (with similar power consumption) + 780G cost $20 less... but that glorious 3MB L2...

Now, after spending so much time here, I realised not to favour a company too much... besides, it's YOUR money... (well, not really (in my case...))
 
The bottom line is, if you want an AMD CPU, no one will stop you from doing that. If you want an Intel CPU, no one will stop you from doing that.

But when someone, or rather some people, spread misinformation and deliberately try to mislead people, then something needs to be done about it. After all, a forum is there to educate people, not mislead them.
 



Agreed. I think that's what really gets these flamewars going. I'm not an Intel fanboy (but you can claim I am all you want); it just drives me nuts to see misinformation out there when people come here for help.

AMD is great for people who want a really cheap dual, or a budget quad core ($200). However, if you need more performance than AMD's fastest chip, Intel is the only game in town. I'll give credit where credit is due: AMD has a couple of good offerings, the X4 9850 BE and the X2 5000 BE, but only one of them, the X2 5000 BE, has a lot of headroom for OCers.
 
^ Still, AMD's quads are pathetic. An Intel Core 2 Quad Q6600 costs less than similar-performing Phenoms!
Q6600
http://www.newegg.com/Product/Product.aspx?Item=N82E16819115017
9750
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103250
9850
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103249

Now I understand that chipsets may be a driving factor...

w00t I'm going Intel! Now I'm sure Reynod needs to inject me with long overdue AMD kool-aid

Then I'll wander in eternal bliss, knowing that Intel has no desktop 65W quad-cores or 45W dual-cores
 

I would go AMD for a cheap dual, but never for a quad. As amdfangirl pointed out, the Q6600 is cheaper than any Phenom that can hope to even approach it, and it is dropping in price like a rock now that the mainstream 45nm parts are out.

On a side note, as far as low power is concerned, Intel may not have any rated at 45W, but I would be willing to bet that some of the lower-power Duos could easily hit that spec (the E8190/8200, for example). Also, I wouldn't touch those AMD low-power quads with a ten-foot pole - they appear quite buggy from the reviews I've seen.
 
^ I still don't get why they fuss so much over the Phenom. At one point in time the X3 was more expensive than the X4 and C2Q...

Besides, the B3 Phenoms may not be buggy, but they evidently tend to pop MOSFETs as if it were a game. C'mon, use some common sense, people! Are we as rich in tech as Anand, that we can afford this? Probably not!

We people are smart, we can think, calculate. Simply putting up with extreme TDPs is unacceptable. Instead, would-be Phenom buyers should turn their attention to the AM2 Opterons. With only a 75W TDP and a similar price, I'd recommend those to AMD'ers, but personally I'd stick with my E7200 and BE-2400. Reviews of products are not always accurate either, since most of the Phenom articles were written back when it actually cost less than a Q6600!

Prices change over time, and so do demand and supply (and chocolate tastes better). Most people just read an article, absorb the conclusion and spit it out as advice. Read, compare and think before you recommend! (Now, I'm not naming any names.)

Something else I might as well rant about is how a product gets reviewed well and then everyone tells you to buy it. I did a thread on whether I wanted an AMD or Intel PVR-type HTPC, which would do a lot of encoding. I'd have thought most IGPs would be good for it... then I found the E7200 + GeForce 7150 was only $20 more than the AMD system. It could support up to 1080i; my BF's TV only supports up to 720p, I believe. I asked whether the E7200 + GeForce 7150 was enough for the 720p HD stuff and everyone went 780G... 780G... (I wanted a faster processor rather than a better chipset, but I would have liked both =) ) I ended up getting the answer from MSI's product page (I'd prefer real advice, but this was the closest I got). Now advice is all well and good, but I decided I was gonna get my BF to buy the E7200, so I asked the question in the thread... no answer, so I just decided to post a compatibility thread in the motherboard section... Tom's is usually helpful otherwise...

I didn't phrase the question well tho... so it's kinda my fault...

Here is a common misconception: how can a 45W TDP 4850e consume almost the same power as a 65W TDP E7200? Well, TDP = Thermal Design Power, i.e. how much heat (in W) the cooling has to handle, not the actual energy draw. Besides, from what I know, the E7200 undervolts better than the 4850e 'cause it's 45nm and Core-based. I'd like to see AMD beat that! (undervolted, tho)
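Quick back-of-the-envelope in Python, if anyone wants to play with it (the wattage and voltages here are just guesses on my part, and dynamic power only scales roughly with V^2 * f, so treat this as a sketch, not a measurement):

# Dynamic power scales roughly with V^2 * f; undervolting at the same clock:
p_stock_w = 50               # assumed full-load draw at stock voltage
v_stock, v_under = 1.25, 1.10  # assumed stock and undervolted Vcore
p_under_w = p_stock_w * (v_under / v_stock) ** 2
print(round(p_under_w, 1))   # ~38.7W at the lower voltage, same clock

Same clock, lower voltage, noticeably less draw - which is why a 45nm chip that undervolts well can end up pulling less from the wall than its TDP suggests.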

Rant... over... must drink green kool-aid... XP
 

Yes, but it's not the capacitors, it's the MOSFETs that are burning out. Simply too much current draw for a 3-phase VR circuit. Interesting that we didn't see this same issue come up during the short-lived D805 overclocking fad.
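To put some rough numbers on why 3 phases struggle (the core voltage is my own guess, and this ignores VRM efficiency losses, so it's only a ballpark):

# Ballpark VRM load: total current = power / Vcore, split across phases
cpu_power_w = 140   # assumed full-load draw for a 140W-class quad
vcore = 1.25        # assumed core voltage
phases = 3          # budget 3-phase VRM
total_current_a = cpu_power_w / vcore    # ~112 A
per_phase_a = total_current_a / phases   # ~37 A per phase
print(round(total_current_a), round(per_phase_a))

Roughly 37A per phase, sustained, is a lot to ask of the cheap MOSFETs on a budget mATX board, so the popping isn't too surprising.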
 


Intel's Pentium D was never that high in TDP, tho... but still, this popping occurs mainly on 780Gs... and some mATX Nvidia boards... must be the cheaper manufacturers...
 
Plus, from what I hear, between an Intel rated at 65 watts and an AMD rated at 65 watts, the AMD will consume (on average) more power due to some differences in the rating system.
 




First... the SERVER chips are rated differently than desktop chips.

But... Oh... so predictable.

Help me understand: the same people who said that the Phenom at stock speed could compete with the Intel chip in the same price range... but that the Phenom was not worth buying because the Intel could overclock "so much better"... are now trying to tell us that they are concerned with how much power the Phenom uses...


BTW: How much does a Q6600 or even a Q9450 actually use at 3.6GHz? 180 or 200W?

Perhaps we actually need to know that small fact before we "go ballistic" over the stock Phenom 9950 rating. Especially since people seem to be able to get the Phenom to 3.0 or 3.1GHz on stock voltage. How much "more" will it use at 3.6GHz? Or 4.0GHz? We do not KNOW yet.


Seems people are more worried about something else... and it is NOT how much power the Phenom uses.

Seems we are seeing a double standard.

"but anybody can buy a Q6600 and overclock it to 3.6Ghz! My grandma could do that!"

So. Question: if we look at how much power both chips use at 3.6GHz... the "OMG... 140W" people are reaching for something to hold onto.

Or perhaps... in reality AMD is actually pointing out a simple fact... they are being conservative. Something that many people do not want revealed.

OMG. I am so going to be here in these forums when somebody hits 4.0GHz with a Phenom. It will be even better if it uses less power than an Intel Q6600 at 4.0GHz. And OMG2: do you realize that some people have noticed that the AMD design gains more per clock than the Intel one at higher clock speeds? Let's find out, shall we? But then perhaps the Intel fans will have moved on to "But look at how the Nehalem does... blah blah blah blah."
 


Uh... no. Server chips are rated no differently than desktop chips or mobile chips.

Especially since people seem to be able to get the Phenom to 3.0 or 3.1GHz on stock voltage. How much "more" will it use at 3.6GHz? Or 4.0GHz? We do not KNOW yet.

A Phenom 9950 running at 3.6GHz will likely (calculated) consume 264W; at 4.0GHz it will likely consume 337W.

Comparatively, a Q6600 running at 3.6GHz consumes 175W, and at 4.0GHz consumes 192W. You do the math.
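For anyone wondering where estimates like that come from, they usually fall out of the standard frequency-and-voltage scaling rule; here's a sketch in Python (the voltages are purely illustrative and not necessarily what was assumed for the figures above):

# Overclock power estimate: P_oc ~ P_stock * (f_oc / f_stock) * (V_oc / V_stock)^2
def oc_power(p_stock_w, f_stock_ghz, f_oc_ghz, v_stock, v_oc):
    return p_stock_w * (f_oc_ghz / f_stock_ghz) * (v_oc / v_stock) ** 2

# e.g. a 140W Phenom 9950 (2.6GHz stock) pushed to 3.6GHz with a voltage bump
print(round(oc_power(140, 2.6, 3.6, 1.30, 1.45)))  # ~241W, the same ballpark as above

Different assumed voltages move the answer around quite a bit, which is why measured numbers beat calculated ones.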

By the way, TDP is the amount of heat the cooling solution has to be able to dissipate, not the actual power consumption.