Q6600 isn't real quad?

If true, then that is some very good news for AMD... and they need a lot of good news right now. I don't want to see them fail any more than you do... it would hurt everyone if AMD were to go under. They do need something that can at least match Intel, if not beat it on the desktop. I've never had an issue with AMD processors... I just need to learn to stay away from VIA chipsets... lol
 


Maybe because... the rumor is just a rumor? AMD was "rumored" to develop Reverse Hyperthreading, does that make it real?

The only thing I've heard is improved overclocking, which may or may not be true (I'm leaning more on the not side...but we'll see).
http://www.reghardware.co.uk/2008/06/19/amd_overdrive_extreme/


You're joking, right? I agree that the SB may provide a more stable clock when overclocking, but connecting HT voltage, or even CPU voltage, with the SB is just plain absurd. HT voltage is regulated by the CPU, if I remember correctly, and CPU voltage is regulated by the MOSFETs and capacitors beside the CPU socket.



Again, you're kidding right?
http://xtreview.com/addcomment-id-5605-view-SB750-and-AMD-phenom-overclocking.html

1.55V for 3.4GHz, and that's not counting the Vdroop. You're looking at approximately 1.6V for 3.4GHz. Improvement? Yes, but definitely not on par with the Q6600, which can do 3.6GHz at 1.45V.
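The "not counting the Vdroop" point is simple arithmetic: the core sags below the BIOS-set voltage under load, so you have to set higher than what you want the chip to actually see. A minimal sketch of that, where the ~0.05 V droop figure is an assumption for illustration, not a measured value:

```python
# Hypothetical sketch: vdroop pushes the BIOS-set voltage above the
# voltage the core actually sees under load. The 0.05 V droop here is
# an illustrative assumption, not a measurement from this thread.

def bios_voltage_needed(load_voltage: float, vdroop: float = 0.05) -> float:
    """Voltage to set in BIOS so the core still sees `load_voltage` under load."""
    return round(load_voltage + vdroop, 3)

print(bios_voltage_needed(1.55))  # the ~1.6 V figure quoted above
```

Which is all the post is saying: 1.55 V at load implies roughly 1.6 V set in BIOS.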
 


Memory again? Here's a nice little test using DDR2-533 and DDR2-1066.

http://www.digit-life.com/articles3/cpu/amd-phenom-x4-9850-ddr2-533-p2.html

Let me just give you the conclusion of the tester:
Conclusions

                     DDR2-533   DDR2-1066   Gain
Professional score      87          90       3%
Home score              89          92       3%
Overall score           88          91       3%


In fact, there is no need to write a lengthy conclusion. The bottom line is crystal clear. Here is a summary:

The biggest performance gain in groups of tests: games, archivers (6%).
The biggest performance gain in applications: Unreal Tournament 3 (22%), World in Conflict (10%), 7-Zip (10%).
Average performance gain - 3%.
The last fact needs no comment. I remind you that the memory frequencies differed twofold. A twofold difference on one hand and a 3% performance gain on the other: do you need any comments?

So, if DDR2-1066 gives a 3% average performance gain over DDR2-533, what would the difference be for DDR2-800? I imagine it would be less than a 3% average gain.
Oh, and the testbed motherboard was ASUS M3A32-MVP Deluxe. Just so you know which chipsets were used.
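To put a rough number on the DDR2-800 question: if you assume, purely for illustration, that the overall score scales linearly with memory clock between the two measured points (88 at DDR2-533, 91 at DDR2-1066), you can interpolate. This is a back-of-envelope sketch, not data from the review:

```python
# Rough interpolation of the digit-life overall scores: DDR2-533 scored
# 88, DDR2-1066 scored 91 (a ~3% gain). Assuming linear scaling with
# memory clock (an illustrative assumption only), estimate DDR2-800.

def interpolated_score(freq, f_lo=533, f_hi=1066, s_lo=88, s_hi=91):
    """Linearly interpolate the overall score between the two measured points."""
    return s_lo + (s_hi - s_lo) * (freq - f_lo) / (f_hi - f_lo)

s800 = interpolated_score(800)
gain_vs_1066 = (91 - s800) / s800 * 100
print(f"Estimated DDR2-800 score: {s800:.1f}, ~{gain_vs_1066:.1f}% behind DDR2-1066")
```

Under that assumption DDR2-800 lands about 1.5 points below DDR2-1066, i.e. well under the 3% gap, which supports the "less than 3%" guess.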
 



 
Well this is getting interesting. I am still unable to find any site with a Q6600 OCed to 4GHz showing just the CPU's power usage. But I shall keep looking.

Only because it's so damn hard for Keith to actually post a link for once, instead of trying to make it sound like what he says is true when we know it's probably all BS.
 
He won't post a link or any sort of numbers to back his claims because he fears scrutiny.

And just so you know, Keith, I haven't accepted Yomama's numbers as gospel truth... but without any kind of other data to go on, it's kind of hard to believe what you're saying as well.

If you're not going to provide proof, don't bother disagreeing. No one, with the exception of (k)assler, is going to believe you any more than you believe anyone else.
 


We were checking the memory consumption because it was slow; it didn't use a lot of memory. I don't know, of course, if this was a Linux issue, but it would be strange if it was. We used Ubuntu.
The reason we checked the game was that it took some time; also, the VMware machine running Windows was configured for two cores.
If someone showed me this type of setup running smoothly, compared to the Phenom, without massive overclocking, that would be nice. Intel quads also don't run these kinds of setups as well as the Phenom does.
The thing is that Intel users have a hard time understanding this: there is a big difference between running a simple test with one single application and real usage of a computer.

 
You're drawing baseless conclusions, (k)assler. There are many factors to consider other than the CPU. As was said, it was most likely due to the vid card or its driver. Linux is far pickier than Windows when it comes to drivers. I know my old 9700 Pro didn't seem nearly as fast under Linux as it did under Windows.
 


No. They kept TDP as TDP but also publish the "average" consumption figure alongside it.
 


How much does a video card do in a VM? Nah, anyways my theory is that it was 'more' optimised for AMD's AMD-V... rather than Intel's...

Still... Vmware... AMD... Linux... =?



Post your config Kassler! Was it vanilla?



The Barcelona arch is good for this kind of stuff because of the fast memory switching, AMD-V, HyperTransport 3.0, etc. AMD excels because VMs use a lot of bandwidth compared to normal applications and, in theory, work better on AMD systems. VMs are memory hogs because they must service both OSes. Since AMD integrated the northbridge (memory controller), memory access is much faster, and HT 3.0 scales better in 2P+ configs because of it... which is also why Opterons are so good compared to Xeons in memory-intensive applications. In fact, higher-clocked memory prevents RAM bottlenecks, allowing the processor to process faster.

Seriously, I agree with you. Why do they only give out single-application benchmarks in the world of multi-core? My theory is that AMD processors would do better, since multitasking is more bandwidth-intensive. How about you?
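The bandwidth side of the argument is at least easy to quantify: peak theoretical DDR2 bandwidth is just transfers per second times the 8-byte (64-bit) channel width, doubled for dual channel. This is standard arithmetic, not a claim about any benchmark in this thread:

```python
# Peak theoretical bandwidth of DDR2 on a 64-bit (8-byte) channel:
# transfers/sec * 8 bytes; dual channel doubles it. Real sustained
# bandwidth is lower, so these are upper bounds only.

def ddr2_peak_gbps(transfers_mt_s: int, channels: int = 1) -> float:
    """Peak bandwidth in GB/s for a 64-bit DDR2 channel at the given MT/s."""
    return transfers_mt_s * 8 * channels / 1000  # MB/s -> GB/s

for mt in (533, 800, 1066):
    print(f"DDR2-{mt}: {ddr2_peak_gbps(mt):.1f} GB/s single, "
          f"{ddr2_peak_gbps(mt, 2):.1f} GB/s dual channel")
```

So dual-channel DDR2-800 tops out around 12.8 GB/s in theory; whether several VMs actually saturate that is exactly what this thread is arguing about.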
 


But I have eyes...
Now I am running two VMware machines: one that I develop in, and one where I test applications. I also normally watch news or read in the browser. That is in the main (Vista) system installed on the computer. Never had any problems; it runs very smoothly on the Phenom.
 


I'll make these Intel'ers believe: post some configs! That'll teach 'em! I got my Athlon X2 for its AMD-V feature, whose Intel equivalent is only found in the higher-end parts, unlike AMD!

I love playing GNOME Sudoku on a Fedora VM!
 


I think this is very strange. I don't buy a quad to run single-threaded applications. I don't buy a quad to run ONE multithreaded application. I have bought quads to run many multithreaded applications. When I compile, I can go to another application and do other stuff there, or switch to clean machines to do some testing. I am also running a lot of databases, some very big (more than 1,000,000 records in different tables). In the meantime I may have 5 browser windows with 5 tabs in each. I am also using dual 30" screens.
 
He complained that Quake was slow in Linux... which is most assuredly affected by the video card and its drivers. VMware makes no difference, but what you're running within the emulated session certainly does. For his conclusion to be valid, we need more information... such as the specs of the computer and what vid driver is being used under Linux. Simply stating his specs and claiming "it runs very smooth" just isn't enough.
 

NO!
Running Quake alone is no problem! He plays it often. Then we installed a VMware machine on his computer. That was when he showed me Quake, because he plays that game sometimes.
 
Sorry (k)assler... peddle it elsewhere... if it were Windows, it would be one thing... but with Linux there can be any multitude of mistakes made. You immediately blaming the CPU with no kind of troubleshooting at all is simply incredible.
 

After we had installed the VMware machine and all the applications, we started to do some testing with real work. That was Windows in a VMware machine. I don't know if there were problems because of VMware, but it was also very slow. In fact, it was so slow that it was problematic to work in that environment.
 


Yup, but there are multi-tasking benchies for lappy batteries... hmm...



Anyone ever tried WINE? (Strictly speaking it's a compatibility layer, not virtualization.) How many of you have ever tried DirectX 9.0 on VMware Fusion? It doesn't work as of yet, and the emulated graphics are about on par with a GeForce 2 or even a late Voodoo card! No 'flashy' stuff such as a minimizing window even appears there! Sure, it's good for Sudoku, but not for any 3D gaming!!!
 
So... we're comparing a chip with hardware virtualization to one that may not have it? And then drawing baseless conclusions when we don't have complete data?

I see.

Don't follow the same path as (k)assler. At the very least he should have compared an Intel quad to his AMD under the same circumstances. At least that would have been a tad more believable. I'm not buying it... and (k)assler's inconsistencies aren't helping his credibility.
 
^ I'm only saying that most of AMD's low end has virtualization, compared to no VT in Intel's low-to-mid range. The upper stuff is alright, though... but isn't as good clock-for-clock in memory-heavy benchies. In normal benchies Intel's smart cache slaughters AMD... just thought you'd like to know.


+1. Working in VMs on most Intel processors (w/o VT) is awful, but the ones with VT are actually quite alright...
 


You mean the quick test where the very first thing that "kenofstephen" did was put the voltage on the highest value that the AOD software tool would allow. Then he played with the multiplier? (He claimed he didn't have time for doing any fine tuning when he ran this test for an hour.)

I guess we'll have to wait until the SB750 is actually released before we can draw conclusions. Especially since others have got 3.1GHz stable at stock voltages.





I know you meant the "memory" comment as an insult... but since the link you posted was only done a few days ago... your sarcasm kind of looks like you are reaching.

BTW THANK YOU for supporting my contention and finding a link that validates exactly what I have been saying! You seem to want to "write it off" using the "average gain of 3%". But averages don't work with most of the people on this forum. They won't allow you to just claim an average. They want actual results. If you actually look closer you realize that many of the benchmarks are off by 10% or even 22%. And these are NOT just memory speed benchmarks.

So as I said... benching with DDR2-800 memory is going to give results that are not accurate. If someone wants to accept results that are not accurate... I guess that is their choice. But many people want to actually know the truth. And personally I will discount all reviewers that compromise their results. I can no longer count them as "trusted" or even professional.





Scrutiny of WHAT? You appear to be slightly mentally challenged. Let me spell it out a little more slowly for you.
If a person provides results but does not show the data used to create these results,
and another person claims that it appears that the first person's numbers are not accurate,
then it is up to the FIRST person to show that they are correct.

But in this case since the results were created using extrapolated input data... there is no way to verify the results.

So I ask you: What do you want me to show you? HE made a GUESS and was called on it.
Just because some people will accept his input data without question does not make the results automatically correct.

So let me ask yet AGAIN: What Data or Proof do you want from me? I do NOT need to disprove his results since he can't prove his input parameters. I did NOT claim a result... so I do not need to show you proof of how a result was reached.

BESIDES: You realize that you are attempting to change the subject away from the actual topic, don't you? Why does it matter so much to you? Actually, I would really like to know, because it is petty and supercilious.
 


AFAIK, when you're in a public argument you had better be ready to show factual details, especially when disproving or discrediting someone. Because if that's how you think (that you don't need to prove anything), you need not post here again, or on any other message board, because what you're writing is a whole bunch of rubbish.

As for your opinions, keep them to yourself at least (or start a message board of your own with you as the mod/poster/thread starter), since they concern you and you alone. Stop spamming in here, will ya?