Q6600 isn't real quad?

Page 17 of a Tom's Hardware forum thread.
So let me ask yet AGAIN: What Data or Proof do you want from me? I do NOT need to disprove his results since he can't prove his input parameters. I did NOT claim a result... so I do not need to show you proof of how a result was reached.

You disagreed... that is all that is needed for you to produce some sort of data to refute his claims. If you're not going to produce any data, then your claims are as baseless (or even more so) than his. I'm not trying to change the subject, you're the one that decided to run with this by posting your dissension.

Getting back to the original topic... yes, Q6600 is a real quad-core processor... just not a "native" quad core.
 
I love these rabid AMD fanboys. They try to tell us why a slew of objective benchmarks from multiple well-respected, credible sources are no good, and then use their own SUBJECTIVE experiences to say that AMD is better.

Guess what? INTEL HAS BETTER DESKTOP PROCESSORS! There is nothing you can say or do to change that fact. AMD is still a great option for many people, and many of us won't recommend against them, but you've got to be realistic.

Gosh darn you two are dense.
 

This isn’t strange when you have tried one Intel compared to one AMD. AMD runs smoother and that’s what you will notice when you try them. Of course you need to run some demanding applications.

 
^ Dats fine... please buy as many AMD products as you can, to help them along. We need somebody to buy their stuff to keep them in the market.

After all, we are just a handful or less of people that wanna tweak and enhance our PCs to their full potential.

Not to mention moan, groan, and bitch about the two companies.
 


I personally doubt we'll see any significant improvement out of it, maybe an additional 100MHz or so. Phenom is already at its maximum clock envelope, and pushing it further is simply suicide clocking.
 



Seriously, you're full of sh!t.
 


It runs smoother, it is cheaper, and there isn't a socket change nearby, so the parts I am buying will last a bit longer. If new games are going to use ray tracing, Phenoms will be able to run those.
Here you can see what Intel is saying to programmers: http://blogs.intel.com/research/2008/06/unwelcome_advice.php

It isn't a hard decision choosing AMD if you have some knowledge about computer hardware.

 
And why are you explaining that to me? I've already spent my money, and I'm happy.

I find my Q6600 runs smooth, with a slight OC, and I'm running Vista 64bit.

"It isn’t a hard decision choosing AMD if you have some knowledge about computer hardware "

It is not a hard decision choosing Intel either. Your choice: a double whopper with cheese, or a whaler/fish sandwich.

So far, I think the majority are meat eaters. :lol:
 
I find the comments about developers not writing code to take advantage of multi-core particularly accurate. We have quad cores now, and octa-cores in a year, and most programs out there are still written for a single thread.

M$'s FSX developer once said it's a lot easier to write code to take advantage of single-thread performance than multi-thread performance. There was also one developer going berserk about how single-threaded code is a lot better than multi-threaded code (he is obviously a tool).

Chip manufacturers and developers need to get together and come up with a solution. Intel and AMD have steadily increased their IPC even as core counts increase, so it's time for developers to come up with something to take advantage of both.
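As a rough illustration of what "taking advantage of multi-core" actually asks of developers, here's a sketch in Python of the same CPU-bound job written once serially and once chunked across worker processes. The prime-counting function and the chunking scheme are my own illustrative stand-ins, not anything from FSX or the developers quoted above:

```python
# Sketch: the same workload single-threaded vs split across cores
# using only the standard library.
from concurrent.futures import ProcessPoolExecutor

def count_primes(lo, hi):
    """Naive prime count over [lo, hi) -- a stand-in for CPU-bound work."""
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def count_primes_parallel(limit, workers=4):
    """Split the range into one chunk per worker core and sum the results."""
    step = limit // workers
    bounds = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, *zip(*bounds)))

if __name__ == "__main__":
    # Both give the same answer; only the parallel one can use all four cores.
    assert count_primes(0, 100_000) == count_primes_parallel(100_000)
```

This only works cleanly because the chunks are independent; the hard part the developers complain about is workloads where the threads have to share state.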

It isn't a hard decision choosing AMD if you have some knowledge about computer hardware.

It also isn't a hard decision to determine that you have little knowledge about computer hardware.
 

Do you understand why Intel made Nehalem? If you understand that, could you explain?
 
Intel made Nehalem because it was the logical choice to move to an IMC, as well as a native multi-core design. But before that, most programs did not need that much bandwidth, including server applications. The only applications that would benefit from an IMC and HTT are scientific programs, heavy FP calculations, and memory-intensive programs. This is why, while AMD dominates the 4P+ market, they still lose to Intel in the 2P and 1P segments.

If you understand so much about computer hardware, can you please explain to me why AMD went native quad core at 65nm?
 


The only PROOF I could give would be to make up some input parameters that are just as incorrect and then show you a result. Somehow it seems that would make you happy... even though the numbers would be just as faulty as the original.

OK, here's a calculation for you: the results would be half of what he said, because I'm cutting his input variable to half his quoted values.

Now here is the part you need to understand: MY CUTTING THE INPUT VALUES IN HALF IS JUST AS VALID AS WHAT HE USED FOR HIS INPUT NUMBERS. His input numbers were guesses extrapolated on what he thinks is correct. But if you start the extrapolation on a value that is wrong... then your input value is wrong. And your result is useless.

Is this sinking in yet? I very much doubt it... and even if you completely understood awhile ago you are still going to use your same boring and tired excuse to divert attention.

So I'll go back to my original comment: I don't feel the need to "prove" anything since your burgeoning intellect won't grasp reality anyway. (Oh wait did I say that last sentence in my "out loud" voice? OMG.)





I love these rabid Intel fanboys. You show them that a slew of objective benchmarks from multiple sources are inaccurate... and they just want to ignore the facts and brush them under the table.

Guess what? INTEL HAS BETTER DESKTOP PROCESSORS! Except for the chips that are sold in the same price range as the AMD quad chips. They are NOT "better" except in your opinion.

In addition... it is humorous to see someone complaining about objective versus subjective... and then throwing the word "better" around... which is a SUBJECTIVE OPINION. Since the Intel and AMD chips in the same price range compete benchmark for benchmark, something else needs to be used. Some people use the basic chip architecture.

This argument is going to be twisted and morphed in the future because of the Nehalem. It is going to be very amusing.
 
Now here is the part you need to understand: MY CUTTING THE INPUT VALUES IN HALF IS JUST AS VALID AS WHAT HE USED FOR HIS INPUT NUMBERS. His input numbers were guesses extrapolated on what he thinks is correct. But if you start the extrapolation on a value that is wrong... then your input value is wrong. And your result is useless.

Thank you...for showing how little you know.
 



oooh! oooh! I know!!!! I know!!! It is because they don't have the budget to develop a new process, or retool their fabs.

LOL
 


I know little about what? Claiming incorrect data as valid and then using it in a calculation? That would be correct... I have little experience with doing that. I'll let others have that honor.
 
Let me re-explain again.

To quickly reply: I used 1.4V for 3.6GHz, and 1.5V for 4.0GHz. Actually, those are even lower than what a 65nm Intel quad required to achieve those speeds.

This is the site I used to calculate wattage.
http://extreme.outervision.com/psucalculatorlite.jsp


Then I admitted that my voltages were too low, even for Intel's quads, and corrected them: the more realistic voltage for 3.6GHz is 1.45V, and 1.65V for 4.0GHz. At those voltages, Phenom would require 407W @ 4.0GHz.

In addition to that, due to Phenom's 25% thicker gates to prevent leakage, it's even more difficult to overclock. Therefore the figure I posted above needs to be increased as well.
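For reference, the kind of extrapolation being argued about in this exchange can be written down explicitly. To first order, dynamic CPU power scales with frequency times voltage squared (P ∝ f·V²); leakage is ignored, so it understates the real draw at high voltage. The baseline wattage and voltage below are purely illustrative assumptions, not figures from the thread:

```python
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """First-order dynamic-power extrapolation: P ~ f * V^2.

    Ignores static/leakage power, so it understates the real draw
    of a hot, high-voltage overclock."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical baseline (illustrative, not measured): 125 W at 2.6 GHz / 1.25 V.
estimate = scaled_power(125, 2.6, 1.25, 4.0, 1.65)
print(round(estimate))  # → 335 (watts, under this approximation)
```

Different baselines (and whatever model the outervision calculator uses) give different absolute numbers, which is exactly why the choice of input voltages dominates the result.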

My point in writing this is to prove that I did indeed provide the input parameters, the result, and now, the method. Now, what have you provided to prove your absurd claim?
Now here is the part you need to understand: MY CUTTING THE INPUT VALUES IN HALF IS JUST AS VALID AS WHAT HE USED FOR HIS INPUT NUMBERS. His input numbers were guesses extrapolated on what he thinks is correct. But if you start the extrapolation on a value that is wrong... then your input value is wrong. And your result is useless.
This?

Or....
Providing questionable result values calculated by using questionable input values means exactly what in the real world? You can claim extrapolation... but you have to remember one simple fact: This is a new design that doesn't follow the same rules. So until it actually happens your opinion is just as valid as mine.

Or...
Ah... but you see I don't feel the need to "prove" it. The only thing that matters to me is that I know.

I know that bothers you. But I have difficulty caring.

I wonder, would anyone consider that as "solid proof"?

EDIT:
Then, according to the guy who overclocked his Phenom to 3.4Ghz,
http://www.overclock.net/hardware-news/346792-fud-790gx-overclocks-phenom-3-4ghz.html

Notice that the Vcore says 1.55V in CPU-Z? So if we take the voltage at face value, the Phenom requires 284W at 3.4GHz @ 1.55V. Counting in the Vdroop, which would likely put Vcore at 1.6V, the Phenom requires 304W.
 


Oh... you mean the guy that just threw the voltage to 1.55V and didn't see if anything less would have worked?

Yes... you might as well use that value. It would be just as accurate as anything else you have presented for the Phenom.

That is to say... not accurate or valid.
 
So you simply disregard the result just because he didn't tweak everything to lower the Vcore by 0.02V?

Laughable....laughable...

You really think by tweaking, it could lower the wattage by more than 30W?

Thank you...for showing how little you know.
 
Man, this has gotta be MadModMike we are dealing with, if any of you besides TC and yomamfor1 have been here long enough to remember. Arguing personal opinions against well-accepted facts.
 


I wouldn't care about a change of 0.02V. But what about going from 1.55V to 1.40V... that might be a bit more significant.

Can you irrefutably present data that this cannot be done? Since many people are getting 3.0-3.1GHz on 1.25V, it seems very likely that 3.4GHz will not require 1.55V.

Although I know you won't want to concede that, since it would put your calculations into question. No, it would be better to use data that we KNOW is not correct in calculations to create an answer that WE KNOW is not correct, and then parade that around as something that is "true". (And attempt to discredit anyone that doesn't agree with your opinion.)
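The "more than 30W" question a few posts up can actually be checked with the same V² scaling, taking the 284W @ 1.55V figure quoted earlier at face value (same caveats: first-order approximation, leakage ignored):

```python
def rescale_power(p_base, v_base, v_new):
    """Same-frequency dynamic-power rescale: P ~ V^2 (leakage ignored)."""
    return p_base * (v_new / v_base) ** 2

p_155 = 284.0                 # the 3.4 GHz @ 1.55 V figure quoted in-thread
p_140 = rescale_power(p_155, 1.55, 1.40)
print(round(p_155 - p_140))   # → 52 (watts saved by dropping to 1.40 V)
```

So if 3.4GHz really is stable at 1.40V, the V² term alone accounts for roughly 50W, comfortably more than 30W.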
 



I can't present something that is irrefutably true and have it accepted as something that is relevant for other results.

Yet he can present a guess as a fact to base a calculation on and it is accepted?

Oh... I'm sorry... I forgot the Intel Fanboys accept double standards. As long as they support their opinion.

I'M SORRY... I WITHDRAW ALL OF MY ARGUMENTS. GO AHEAD AND PRESENT YOUR BAD DATA AS TRUTH.
 
time for a little TC MELTDOWN for me
"at least he is providing links to numbers, be they flawed or not you have provided nothing but ABSOLUTE BS!!!!"

End of meltdown

Sorry TC couldn't resist
 


Please explain how I would provide a link to show a value that is not known at this time?

Please explain how since the value is not known that it is acceptable for him to create a guess and have it accepted by people that consider themselves rational?

(Especially when there is evidence that shows that his "guess" has a high probability of being VERY incorrect.)