Q6600 isn't a real quad?


I think you've mixed up the Monitor Refresh Rate with Frames per second. They are different figures, related but not identical in meaning.
 

http://en.wikipedia.org/wiki/Frame_rate
Frame rate, or frame frequency, is the measurement of the frequency (rate) at which an imaging device produces unique consecutive images called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems. Frame rate is most often expressed in frames per second (FPS) and in monitors as Hertz (Hz).
 


Virtually by mechanics then.

Word, Playa.
 


True.

However, you are deflecting from several different points:

#1 - You can benchmark CPU performance with games at lower resolutions
#2 - At low resolutions the CPU is the bottleneck
#3 - At high resolutions the GPU is the bottleneck (the toy model below illustrates #2 and #3)
#4 - On desktop machines, in virtually all normal usage situations, INCLUDING MULTITASKING, FSB is NOT a bottleneck
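As a rough illustration of points #2 and #3 (a toy model with made-up numbers, not benchmarks from anyone in this thread): per-frame CPU work is roughly constant across resolutions, while per-frame GPU work grows with pixel count, so whichever takes longer sets the frame rate.

```python
# Toy model (illustrative only): frame rate is limited by whichever of the
# CPU or GPU takes longer per frame. CPU time per frame is assumed constant;
# GPU time per frame is assumed to scale with pixel count. The constants
# below are made-up placeholders, not measured results.
CPU_MS_PER_FRAME = 4.0        # hypothetical: CPU needs 4 ms per frame at any resolution
GPU_MS_PER_MEGAPIXEL = 8.0    # hypothetical: GPU needs 8 ms per million pixels

def fps(width, height):
    megapixels = width * height / 1e6
    gpu_ms = GPU_MS_PER_MEGAPIXEL * megapixels
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)   # the slower component sets the pace
    return 1000.0 / frame_ms

for w, h in [(640, 480), (1024, 768), (1920, 1200)]:
    print(f"{w}x{h}: ~{fps(w, h):.0f} FPS")
```

With these placeholder numbers, at 640x480 the CPU term dominates (so a faster CPU would raise the FPS), while at 1920x1200 the GPU term dominates and CPU differences mostly wash out.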

[TC Meltdown]

HOWEVER WE ARE TALKING ABOUT FREAKING BENCHMARKING. GOT IT? SO IF AT A LOW RESOLUTION PROCESSOR A GETS 350 FPS AND PROCESSOR B GETS 250 FPS GUESS WHAT, ONE IS BETTER. DO YOU UNDERSTAND!?!?!?!?!?!?!?

OMG AGAHAHAHAHHAHAHAHAHAHAHAHAH!?!?!?!?!?!

[/TC Meltdown]


BarronKassler. That is your new name. Want to talk about bottlenecks? Your pro-AMD bias is a bottleneck to your understanding of personal computers, processors, FSB, and discrete graphics.
 



No kidding. They keep changing the subject and arguing different points to distract from the fact that AMD, by its own pricing scheme, admits it doesn't have a quad better than Intel's Q6600.
 

http://en.wikipedia.org/wiki/Refresh_rate
The refresh rate (most commonly the "vertical refresh rate", "vertical scan rate" for CRTs) is the number of times in a second that display hardware draws the data it is being given. This is distinct from the measure of frame rate in that the refresh rate includes the repeated drawing of identical frames, while frame rate measures how a video source can feed an entire frame of new data to a display

I'd prefer to conduct this discussion without seeing how many times we can quote the encyclopedia anyone can edit, as it is clearly going to be wrong in some cases, as with your first quotation. The two figures do not translate directly across: a maximum refresh rate of 72 Hz does not mean that monitor displays anywhere near that many new frames a second. The best way to explain it is with the example of a projector: the slides, the different images, change at a different speed from the pulses of light illuminating them in front of the light source. In terms of the monitor, it's a case of the physical illumination versus the frames loaded into the buffer and the sequenced pattern they follow.
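A small sketch of that distinction (my own illustration, assuming a 60 Hz display and a 24 FPS source; neither figure comes from this thread): the display redraws at its fixed rate, and each refresh simply shows whatever frame is currently sitting in the buffer, repeats included.

```python
# Illustrative sketch: a 60 Hz display redraws 60 times per second no matter what,
# but if the source only delivers new frames at 24 FPS, many refreshes just
# redraw the frame already in the buffer.
REFRESH_HZ = 60    # assumed monitor refresh rate
SOURCE_FPS = 24    # assumed rate at which new frames reach the buffer

unique_frames_drawn = set()
for tick in range(REFRESH_HZ):                # one second of refreshes
    t = tick / REFRESH_HZ                     # time of this refresh
    frame_in_buffer = int(t * SOURCE_FPS)     # newest frame available by then
    unique_frames_drawn.add(frame_in_buffer)

print(f"refreshes per second : {REFRESH_HZ}")
print(f"unique frames shown  : {len(unique_frames_drawn)}")   # 24, not 60
```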
 
I suppose you could even take it a step further and say that your eyes will bottleneck on all resolutions. Most can't distinguish framerates beyond 24fps. :sarcastic:
 

Yes :) But I think some of the participants here would go berserk if I had used that limit 😉
 


Ok, but I am talking about the user experience in front of the computer: how smooth the computer feels. In normal usage this will not matter, because most programs today will run on almost anything you buy. Most games run well on a reasonably good processor; an AMD X2 5000+ will probably be enough for most games. One reason for this is that if you create a game you want to sell it, and if there are very few computers that can run the game, then you don't have many potential customers. What you should look at, when it comes to how games are created, is the console market.
Buying a quad for games today is not that smart, if you can express it like that. When a person buys a quad it is probably for the future; maybe the computer will be used for three or more years. I think there may be games released at the end of next year that need a "real" quad to run at their best, and there may be games at the end of this year that play better with quads.
But this is not the main issue. The problem with Intel computers is the slow FSB. It doesn't matter if the processor runs like hell if the FSB can't deliver; the processor is going to wait for data most of the time.



 


True, but how many of us use Intel Core-based processors at a 133 MHz FSB?

Intel's got the FSB up to something like a quad-pumped effective 800/1066/1333 MT/s. It falls short in memory-heavy applications such as general server work compared to AMD's Barcelona arch... but excels at almost everything else.
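For rough context (my own back-of-the-envelope arithmetic, assuming the usual 64-bit FSB data path; nothing here is quoted from the thread), those effective transfer rates work out to the following theoretical peak bandwidths:

```python
# Back-of-the-envelope peak bandwidth of a quad-pumped FSB, assuming a 64-bit
# (8-byte) data bus. These are theoretical maxima, not measured throughput.
BUS_BYTES = 8   # 64-bit FSB data path

for effective_mt_s in (800, 1066, 1333):    # quad-pumped effective transfer rates
    gb_per_s = effective_mt_s * 1e6 * BUS_BYTES / 1e9
    print(f"{effective_mt_s} MT/s FSB -> ~{gb_per_s:.1f} GB/s theoretical peak")
```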
 



BaronMatrix: But is it "fast enough"?


Well, I'm talking about measuring the performance of the processor, something AMD fanboys don't like to talk about.
 


I believe it is just enough for the most demanding games today. Games today aren't that challenging for the processor, and they have not been built to take advantage of quad-core performance; the market is not there yet. The FSB is therefore enough for now, I think, but just wait until the start of next year. Then it could be the limiting factor for performance in the new games.

 



Agreed!

There is only one game that I play that I wish I had a quad-core for, and that is Crysis. The physics calculations bog down my dual-core Opteron 175.
 


If you are talking about the total system then yes. But looking at just the CPU, a Q6600 is better than Phenom in performance per watt.





Um... not really. I have a Q6600 and trust me, it makes too small a difference in FPS to really care that much about. As for the FSB being a limiting factor in games... nah. Crysis is very bandwidth hungry and of course VRAM hungry, but it still doesn't come close to fully utilizing a quad core.

I think this guy is just annoying. He doesn't like to admit that Phenom falls short, uses the same FSB argument consistently even though he does not have a C2Q so he does not know, and continually changes the subject to whatever he thinks it is inferior to Phenom in.
 
Phenom is ok ... there is simply not sufficient justification to upgrade from a high-end X2 to a Phenom when you're better off getting a better graphics card for the money.

For most things anyway.

I am sure there are a few benchies stating otherwise but I am not interested.

I'd rather read about the 4850 ... hmmm ... nice 4 the price.

 


What do you have your Physics quality set to? I have to have mine on medium because my CPU can't handle it.
 

I don't think one thread can bottleneck the FSB, and Crysis is almost a single-threaded game. PCIe devices use DMA to access memory, so there is data that goes over PCIe that doesn't travel on the FSB. They can't design a game that uses the FSB that heavily, because then it would be impossible to play.
 


It has been a while since I played it (sorry, I am a TF2 addict really) but I had that set to Very High. I think the only things I did not have set to Very High were water quality and shadows, as they don't matter as much as the pixel, shader, and physics settings.

Heck, the whole reason I got my Q6600 was because I saw what they could do in HL2: EP2 with multi-core CPUs, and they said quad cores would do better with it. Remember the last level in EP2 where the house gets destroyed by the Strider and falls to pieces?

Either way, I think the most I have ever seen my Q6600 being used by Crysis was 50% on the first 2 cores, and the 3rd and 4th cores were not being used much. Heck, TF2 uses more of all 4 than Crysis so far: I see core 1 at about 75% and cores 2-4 at 10-30% during gameplay.

Either way, I have never had anything jitter while playing those games. I had my antivirus running while playing TF2 (it also defragged my HDD and emptied my unneeded files) and had IE7, HLSS, HLSW, Steam, and a bunch of friends I was chatting with, and TF2 never once went below 100 FPS.
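If anyone wants to check that kind of per-core breakdown on their own machine, here is a minimal sketch using Python with the third-party psutil package (my choice of tool, not something mentioned in this thread) to sample per-core load while a game is running:

```python
# Minimal sketch: sample per-core CPU utilization once per second.
# Assumes Python with the psutil package installed (pip install psutil).
import psutil

SAMPLES = 10   # how many one-second samples to take while the game runs
for i in range(SAMPLES):
    # percpu=True returns one utilization percentage per logical core
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(f"sample {i + 1}: " +
          "  ".join(f"core{n}: {p:5.1f}%" for n, p in enumerate(per_core)))
```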
 
I've found this informative video that explains why AMD is better than Intel's ancient FSB.

http://youtube.com/watch?v=oeEqNMD0aKE

As you can see from this video, AMD is light years ahead while Intel have lost their way. Nehalem is a K10 clone, and it's clear Intel have seen the error of their ways with Core2Cheeseburger and followed their inspirational leader AMD. If you want innovation and advances, buy AMD... it's a no brainer.

Buy Intel then you’ll be left behind! AMD do all the hard work while Intel are Evil and Cheat! Buy AMD!

AMD4Life!

AMD the smarter choice
 
If AMD is the smarter choice, why is their top-of-the-line model only barely able to keep up with Intel's bottom-of-the-line, one-generation-old quad, and only in some benchmarks?

Last I checked, that makes Intel a smart choice, not AMD...

(Oh, and if Nehalem is a K10 clone, why is a Core 2 Quad faster than a K10, and Nehalem faster than a Core 2 Quad? A clone would perform identically...)
 



Check and mate.
 



But people will watch the video you linked and then repeat mantra #547 from the Intel Fanboy handbook:

"But a true quad core with IMC is really only good for servers."


NOTE: The next version of the handbook deprecates that excuse. We should start seeing it less as Intel gets closer to releasing a desktop version of Nehalem in about the first half of 2010. We should expect the popular benchmarks to change and focus on the benefits of monolithic design shortly before that time.

EDIT: Oh sorry... my bad. They pulled mantra #326 from The Official Intel Handbook for Fangirls and Fanboys: "But the benchmarks optimized for our FSB show better results". (It had to be one or the other.)