Q6600 isn't real quad?



Wow, you are a very unhappy person. The reason we talk about how easily the Q6600 OCs is that for the people here, that's what they are looking for: the best bang for their buck. Now if AMD is being conservative, then why was it that when tested at 2.3GHz, the 9600 used more power than a 2.4GHz Q6600? Then the 2.4GHz 9850 came out and has been shown to use even more power.

As for the Q9450, it easily uses less power than a Q6600, just like the QX9650 uses less power than the QX6850. Of course you try to forget the main factors there: 1. it's 45nm, and lower power consumption is the whole point of a die shrink; and 2. it uses the new hafnium high-k/metal-gate (HK/MG) process, which has been shown to help lower leakage.

Of course, this is what Intel has said and what has been shown on many sites (even in THG's review of the QX9650). In fact, let me post some nice little charts for ya, cuz I'm just that nice:

This is Idle consumption of the chip
46-chart_power_idle.png


This is under load
47-chart_power_max.png


Now let's just take a look at a Phenom that's clocked lower than the QX9650. Remember, I am just using it as a reference.

Phenom on Idle
idlepowerconsumption.png


Phenom Under Load
powerconsumptionunderload.png



Now, the Q6600 is nowhere near as efficient as the 45nm CPUs, but it still stays about 10 watts lower than most Phenoms clocked near or at the same speed in most reviews I have seen. Of course I expect you to fight this, but that chart clearly shows that a 9850BE (just 100MHz lower) uses 140W.

Now you talk about the 9950BE being able to be OCed to 3/3.1GHz on stock voltage. Well, so can the Q6600, without the need for a new high-end mobo. As for that issue, where does that leave the people who only have an AM2 or an AM2+ board with the older SB?

I myself am not as concerned about power usage; that would be AMDFanGirl, really. I was worried about bang for buck. And I got a Q6600 with a nice P5K-E mobo that has the P35 chipset and was able to OC it to 3GHz while spending less than half the price of an equivalent 3GHz C2Q. If I had gone the Phenom route, I would have gotten a Phenom, and I would have had to get a B3 stepping. Then, if I wanted to OC the CPU by a good amount (not just 100MHz), I would have had to get a new mobo with the SB750.

Seriously, man, lighten up and be happy. And try not to be a fanboy. You sit there and criticize anyone who likes Intel or uses them now because they show the best performance, yet you don't look at yourself. You are so brand loyal you are blind.
 


Thank you for basically confirming my post. It stays a whole 10 watts lower? Oh, so tragic. <rolls eyes>

As for being happy... I am very happy. What you don't understand is that I never criticize somebody who has chosen to buy an Intel chip, UNLESS they come here and try to undermine my choice because I don't agree with them. Yet another double standard? I'm laughing out loud with joy at the people who honestly think they can have double standards and not have somebody point them out.

And please note that I have "looked for myself". The performance is roughly the same at stock between the brands for chips in the same price range. I chose the better design. You may not think it is better... but some of us do. When I say "some of us", that includes a small company called "Intel". I know you guys really hate it when people point out that little factoid, perhaps because it slightly undermines your defense of an outdated design.

Happy? I'm joyful. I really enjoy coming to this forum to point out the inconsistent and sometimes outright untruths perpetuated by people who are often blind and have double standards. Of course in doing so people label me a "fanboy". I find it extremely amusing that many of you don't even realize that you fall into that category yourselves. Let me be more blunt: If you have ever called anybody a fanboy... then you are a fanboy. Enjoy the moniker.

[:keithlm]
 
I won't call you a fanboy; you're a pot-stirring troll.


 
Hmmm.... If you want to call me a fanboy then so be it. I have no respect for what you say. I have plenty of reasoning.

A fanboy, in my mind, is someone who, even when the exact same benchmarks used for the past 10 years (3DMark, games and so on) show one chip has better performance, tries to tell people it doesn't, and this and that. That's a fanboy. Kinda like yourself.

No, you don't criticize people who bought one. I don't criticize people who bought AMD. In fact, I was someone who admitted AMD was plastering Intel before Core 2. Unless there was a certain area where Intel did better, I knew AMD was better.

But what you do do, and this is how you get away with saying you don't criticize people, is that when someone asks which is better, a Q6600 or a Phenom 9850BE, you dispute everything good about Intel. You either say the design is not as elegant (we get it already), or the FSB is this or that, or the benchmarks are all made to run better on Intel.

That's what you do. The one thing you do that truly irks me is saying that a C2Q cannot multitask well. Yet you don't own a Q6600. You say this based on info you have read (I don't know where), yet when a person like myself, who has one, states that it does, you argue it doesn't.

So call me a fanboy. I have no loyalties except to my country and family. Oh, and Mountain Dew. That soda rocks.
 
Very well put.


 



And the Q6600? Oh... we'd better not compare that... it might ruin your point. Or lack thereof.

Sure, you could argue that we should compare what is available NOW. But many people tell others to get a Q6600 instead of a Q9300 or even a Q9450 because it overclocks better. Yet when the subject of power use comes up, suddenly the 45nm power figures are quoted.

Oh... I'm sorry. I forgot that Intel fan-trolls use double standards. So I guess we'll just let that go and not bother mentioning it.

I find your argument boring and not very compelling. <yawn>

(Gads, I just looked up how much a Q6600 uses at 4.0GHz. No wonder you don't want to bring it up.)
 



LOL. I love you, Keith. Keep doing that thing you do; it is amusing to watch someone defend a company like it's their firstborn.
 
So keith... would ya mind posting that link to show how much power a Q6600 uses at 4GHz? Considering that's a rare number reached only by the extreme OC'ers on water cooling, I would recommend them a 45nm chip either way due to lower power usage.

Or better yet, I will look it up. One problem though... we can't even get a Phenom at that clock to compare it to yet, so it kind of ruins the point. But I still find it funny: a 45nm chip @ 3GHz uses almost half of what a Phenom @ 2.4GHz uses, and that's with a 600MHz difference.

Of course I am sure you will find another way to say "The device used was made to work better on Intel" or some crap.

Oh, and in terms of power consumption, 10W (sometimes it is 15W less than the Phenom 9600) will add up. It seems like nothing in the short run, but calculate it over the long run and see what happens.
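As a back-of-the-envelope sketch of that "long run" claim (the $0.10/kWh electricity rate and 24/7 uptime here are my own illustrative assumptions, not figures from the thread):

```python
# Rough yearly cost of a constant power-draw difference.
# The $0.10/kWh rate and 24/7 uptime are illustrative assumptions.
HOURS_PER_YEAR = 24 * 365      # 8760 hours
RATE_USD_PER_KWH = 0.10        # assumed utility rate

def yearly_cost(delta_watts):
    """Extra dollars per year for drawing `delta_watts` more, nonstop."""
    kwh = delta_watts * HOURS_PER_YEAR / 1000.0
    return kwh * RATE_USD_PER_KWH

print(round(yearly_cost(10), 2))   # 10 W gap -> about $8.76/year
print(round(yearly_cost(15), 2))   # 15 W gap -> about $13.14/year
```

A few dollars per machine per year, so how much it "adds up" depends on how many years (and machines) you count.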

BTW, the chart I have listed shows the QX6850 @ 3GHz. That's what you can expect to see my Q6600 @ 3GHz using. Compared to a Phenom 9850BE @ 2.4GHz, that's a 20-watt difference with a 600MHz clock advantage to the Q6600.

Oh and keith I searched and only found this review:
http://www.tweaktown.com/articles/1159/2/overclocking_the_g0_slacr_q6600_to_4ghz/

I do hope you know that's total system power consumption (meaning it includes the GPUs, CPU and so on).
 


Good. Because I have no respect for you and many of your troll posts.

You are correct. I do not look at benchmarks and look for a "winner" and "loser". Benchmarks are tools to gauge relative performance. When performance is about equal, the benchmark shows that performance is about equal. When a series of benchmarks shows two chips to be relatively close, then that is all you need to know. If you know that the results are relatively close... AND you know that one chip ran the benchmarks with a handicap... then an intelligent person can draw their own conclusions. Sadly, these conclusions won't coincide with your opinion or agree with what you want to see.

In other words you want me to look at the benchmarks in 2D when this is a 3D world. I can't dumb myself down enough to accept benchmark results as a black and white "winner" and "loser" measurement. It doesn't help the matter that often a less than credible reviewer creates a summation that declares a victor, often contrary to the actual benchmark results they themselves created.

AND I do not dispute what is good about Intel. I just do not accept the opinion of many on this forum when it is contrary to my actual real experiences.

You are correct. I do not own a C2Q. I base my opinions on the architecture and my experiences with both. At work I have Xeon and Opteron workstations. When we run monthly production runs, we always use the Opteron workstation because it will get the job done faster. People always seem to discount this fact because they are "servers". But what people do not want to realize is that workstations, although they have "server" chips, often run desktop applications. The Opteron is much more responsive than the Xeons while the workload is running; the Xeons are sluggish under the load, and they are newer and clocked higher.

CONVERSELY what "urks" me is that people like you come here and try to tell me that my actual real experiences are not valid. Since benchmarks do not reveal these aspects... we have to rely on personal experiences. My experiences include running more than what your average desktop user runs. Try running a database server, a web server, a job server, a data integrator server, LDAP server, and then use a few java based GUI tools along with a few browsers and monitoring utilities. (Sadly some of the GUI tools use different versions of Java. I hate that... but that is just something you have to deal with.)

OH and BTW; I don't really care about your 45nm to 65nm power comparisons. They seem very unimportant. When the 45nm Phenoms come out then we'll have something to compare.
 
Hehe. So I am a troll, when you stick up for and agree with people like kessler, who bust into a thread asking about a Q6600 and start ranting about Phenom being better, blah blah blah.

You are right. The benchmarks are not really about a winner or loser; they're more a good look at how a specific CPU performs. But if one CPU consistently outperforms another, then wouldn't you agree that that CPU is the better choice? Especially if they are in the same price range?

I don't want you to. I want you to stop with the conspiracy theories. It's just annoying. And yes, you do. In every thread you are in, especially when it's comparing benchmarks of a C2Q vs a Phenom, you try to discredit them. The Nehalem one (even after it was explained), you said there had to be something wrong, yadda yadda.

Hey, I never said your real experiences are not valid. But you are doing the same to me. You try to say a Q6600 doesn't multitask well. I tell you it does, and you say it doesn't. I am the one with a Q6600, but somehow I am the one who is wrong.

Um, you must have missed it, but I used the QX6850 (a 65nm part) and compared it to a Phenom 9850BE. Using the QX6850 as an idea of the power consumption of a Q6600 G0 (all of the Q6x50s are G0 steppings), that's a valid 65nm vs 65nm comparison. But to say the least, I compare within price ranges, not technology or process. I mean, we have never said "ATI has a 55nm GPU, let's wait and compare it to nVidia's 55nm GPU when it comes out."

You wait if you want. If you still believe that SOI @ 45nm will work, then you wait. I still think it's going to be useless.
 


Nah... I use the word all the time anyway.

Actually today I'm starting to get bored... waiting for the end of work... nice long weekend ahead...

I wouldn't normally have posted my last response. But nothing is happening here today... besides he gets on my nerves sometimes. Him and a few others. You don't. I base that on the fact that I don't cringe when I see your name... you usually have reasonable things to say. This guy doesn't. Perhaps it's because I've seen him post some trite (tripe?) responses.



Yes. You are wrong. I didn't say it won't multitask well. But I do say it will not multitask AS well. Big difference. (And I know people that DO have both systems; they agree with me.)

Yeah... I didn't see the 6... thought you meant the 45nm.
 
That may be their own preference. I myself have seen a Phenom while it was multitasking and then compared it to my own system, and I can tell you they both look the same to me. Of course, there is no real way to test this except to run an encoder and maybe some other programs.

But that will only show the CPUs performance not the ability to multitask smoothly.

Oh, and I like Dr Pepper too.
 
Today I spent some time with a friend who is running Linux on a C2D. We were installing some applications in a VMware session running Windows XP, but it took some time. So he wanted to show me Quake Wars on Linux. We switched to the main Linux installation and started the game. It was slooooooooooooooow. After 5 minutes the computer crashed.

I use VMware every day, sometimes two different sessions plus surfing in Windows. It runs very smoothly, no problem whatsoever. OK, I have a Phenom (quad) 9750. The difference is like night and day.
 


So you are comparing a dual core to a quad core now? Dude, how do you know his Linux installation wasn't just crap, or that some code hadn't gone bad? You quickly blame the CPU, and yet there is no proof. I have seen people's machines crash and BSoD; heck, I have purposely caused crashes, and guess what? I have done it on both Intel and AMD (yes, with the lovely IMC) machines.

I just wanted to say this: Linux/Unix is not the perfect OS; there isn't one. I have made several versions of Linux/Unix (Fedora, Red Hat, Kubuntu) crash easily. I have also had them crash/freeze while loading and then not start.

Seriously, man, everything that comes from you is just utter crap.
 


Today, I spent some time with a friend who was running a Packard Bell with a PII. We installed Sid Meier's Pirates! after loading up some Tetris, with DOS, but it took some time. So he wanted to show me the great new 8 bit graphics of his system, so that is why we installed Pirates! We switched to Win95, and started the game. It was sloooooow. After 5 minutes his entire system blew up.

What exactly is the point of your story except to prove that you have a friend?
 

Of course there is a difference (the Phenom is so very good at multitasking). But we were just installing one application; that isn't that demanding...
And the computer crashed!!
 



LMFAO. I think I just wizzed myself.

I love how they contest every benchmark from reputable sites we give them, but then he cites his story as shining proof that AMD is superior.
 


Selective memory?

A Phenom 9950 running at 3.6GHz (calculated) will likely consume 264W; at 4.0GHz it will likely consume 337W.

Comparatively, a Q6600 running at 3.6GHz consumes 175W; at 4.0GHz, 192W. You do the math.
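For what it's worth, "calculated" figures like these usually come from the rule of thumb that dynamic CPU power scales roughly with frequency times voltage squared (P ~ f * V^2). A minimal sketch of that rule, with purely illustrative base wattage and voltages (none of them taken from the posts above):

```python
# Estimate overclocked power from the rough dynamic-power rule P ~ f * V^2.
# All base/target numbers below are illustrative assumptions, not measurements.
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Scale base power by the frequency ratio and the voltage ratio squared."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical example: a 95 W part at 2.4 GHz / 1.25 V pushed to
# 3.6 GHz with a voltage bump to 1.40 V.
print(round(scaled_power(95, 2.4, 3.6, 1.25, 1.40), 1))  # -> about 178.8 W
```

The voltage-squared term is why power at 4GHz overclocks climbs much faster than the clock increase alone would suggest.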

 

Hmm, yes, but it wasn't just the crash; it was also very slow. I know that a quad is better than a dual when it comes to multitasking. But I have seen VMware on an AMD X2 as well, and it hasn't been a problem. This was a f*****g joke.