How does the Intel Q6600 quad-core hold up to today's CPUs?


breadkun

Honorable
Aug 24, 2013
It really depends on what you want to do with it.

It's pretty outdated. I used to have a Q6600 in my old office machine, and it had problems with games and with PS2 emulation.

It would do OK with something like a GTX 460, but pairing a GTX 780 with a Q6600 would create a massive bottleneck.

I would look for another CPU if I were you, simply because that CPU is pretty outdated (it was good in its prime, trust me).
 

CPU Boss is unreliable.
 

ACTechy

Distinguished


That's a pretty blanket statement. The comparison charts themselves are facts, not something up for debate. If you're referring to their conclusions and scoring, I'm not arguing.
 

qbsinfo

Honorable
Jul 26, 2012

They only use synthetic benchmarks, not real-world workloads like encoding a video in HandBrake, gaming, or even compressing a file.
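For illustration, here's a minimal sketch of what a real-world test could look like instead: timing an actual HandBrake encode. It assumes HandBrakeCLI is installed and on the PATH; the file names are hypothetical placeholders.

import subprocess
import time

# Hypothetical file names; any video file works as the input.
INPUT = "sample.mkv"
OUTPUT = "encoded.mp4"

start = time.perf_counter()
# "Fast 1080p30" is one of HandBrake's built-in presets; an encode
# like this loads all cores, which is a realistic CPU workload.
subprocess.run(
    ["HandBrakeCLI", "-i", INPUT, "-o", OUTPUT, "--preset", "Fast 1080p30"],
    check=True,
)
print(f"Encode finished in {time.perf_counter() - start:.1f} s")

Run the same input on two machines and you get a number that reflects a workload you actually care about.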
 

The comparison charts are also highly unreliable. They compare irrelevant things, include incorrect information, and use only synthetic benchmarks.
 

ACTechy

Distinguished


How are clock speed, L2 cache, power consumption, multipliers, and other pertinent architecture details irrelevant? And please do show where there is incorrect information in the link shared.

Like I said, I'm not arguing for their conclusions and point system, but you don't throw the baby out with the bathwater.
 
You can't compare clock speed unless it's the same architecture; 1 GHz can beat 2 GHz, so throw that out. Even in the CPUs you linked to, the i5 has 1 MB of L2 versus the Core 2 Quad's 8 MB, yet the 1 MB part wins, so throw that out too. TDP is not power consumption, so that's misleading info. The multiplier is irrelevant on its own; it's just one factor in the equation that produces the CPU's clock speed. Other architecture details are irrelevant as well; who really cares how many transistors a chip has? In the end, real-world performance for the price you're paying is all that matters. Throw the spec sheet out the window.

Synthetics can give you an idea of performance, but different software has different workloads, and even two multi-threaded applications will perform differently. Take Photoshop and Premiere, for example: in the other link posted, the Q9550 beats an i3 in Photoshop by a good margin, but is slower than (almost equal to) it in Premiere.
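To make the clock-speed point concrete, here's a toy model: performance is roughly instructions per clock (IPC) times clock speed, so a lower-clocked chip with higher IPC can win. The IPC values below are made-up placeholders for illustration, not measurements of any real CPU.

# Toy model: perf ~ IPC * clock (GHz). IPC values are illustrative
# placeholders, not benchmark results for real chips.
def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

old_chip = relative_perf(ipc=1.0, clock_ghz=2.0)  # the "2 GHz" CPU
new_chip = relative_perf(ipc=2.5, clock_ghz=1.0)  # the "1 GHz" CPU

print(old_chip, new_chip)  # 2.0 vs 2.5: the 1 GHz chip wins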
 

The clock speed is irrelevant because the amount of work done per clock cycle varies dramatically. L2 cache is not irrelevant, but in this case the comparison is meaningless because the Q6600, unlike the Core i5-3450, has no L3 cache. And cache bandwidth and latencies can, again, vary dramatically. I can guarantee that the cache in the Core i5 is superior, but CPU Boss makes it seem like the Q6600 has a huge advantage there. The multiplier is even more irrelevant than the clock speed because what the multiplier multiplies can vary: the Core i5-3450 has a 100 MHz BCLK, while the Q6600 has a 1066 MHz FSB. Which brings us to one huge difference between the two: the FSB doesn't exist on a Core i5-3450. It's one of the largest steps forward from the Q6600 to the Core i5-3450, yet CPU Boss doesn't even mention it in passing.

But I'm going to be generous here and only say that CPU Boss is utterly terrible and useless.
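To illustrate the multiplier point with the rough stock figures for these two chips (approximate, from memory): core clock is just multiplier times base clock, and the base clocks differ enormously between the two platforms.

# Core clock = multiplier x base clock. Figures below are the
# approximate stock values for these two chips.
def core_clock_mhz(multiplier, base_clock_mhz):
    return multiplier * base_clock_mhz

# Q6600: 9x multiplier on a 266 MHz FSB clock (marketed as
# "1066 MHz" because the FSB is quad-pumped: 4 transfers/clock).
q6600 = core_clock_mhz(9, 266)      # ~2400 MHz

# Core i5-3450: 31x multiplier on a 100 MHz BCLK.
i5_3450 = core_clock_mhz(31, 100)   # 3100 MHz

print(q6600, i5_3450)

A 31x multiplier isn't "better" than a 9x one; the number only means something together with the base clock it multiplies.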
 

dreadpiratedan

Honorable
Aug 27, 2013
Well, I currently use a Q6600 and am a sucker for the most demanding games.

My specs:
Q6600 @ 3.33 GHz
MSI GTX 560 Ti Twin Frozr 2 GB
GA-EP35-DS3P mobo
4 GB DDR2-800 RAM @ ~740 MHz due to the OC

Crysis 3 is punishing my CPU. It's the first game to do it to the extent that I'm thinking "I need to upgrade soon". Of course I realise that every single game I own would benefit from an upgrade to a 4670K, but with that comes around $500 to replace the CPU, mobo, and RAM. I've got a pretty sweet custom graphics setting going for Crysis 3 that has me averaging around 40-45 fps, with drops to around 30. It looks fantastic; I can barely tell the difference between it and the max preset. Most games have been able to keep the GPU at 99% usage nearly constantly, but due to the CPU bottleneck in Crysis 3, it sits around 85% at minimum and bounces up to 99% within that range.
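If anyone wants to check for the same kind of bottleneck, here's a minimal sketch that polls GPU utilization once a second. It assumes an NVIDIA card with nvidia-smi on the PATH; sustained readings well below ~99% while a game is running usually point at a CPU limit.

import subprocess
import time

# Poll GPU utilization once per second via nvidia-smi (NVIDIA only).
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU utilization: {result.stdout.strip()}%")
    time.sleep(1)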

Thankfully, Crysis 3 is going to be about as demanding as it gets for the next year or so. Battlefield 4 might be more taxing, but it's only going to be marginally more demanding (if it isn't in fact around the same or less). So I expect to get at least until the middle of next year before I need to upgrade, and even then it would solely be so I don't have to play games below the second-highest preset.

Going from the 2.4 GHz stock clock to my 3.33 GHz OC doesn't actually change performance that much either. Sometimes when I'd mess around with my BIOS I'd forget to re-apply the OC and end up playing Far Cry 3 or something on the stock clock; not much difference, maybe 10-15 fps.
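For what it's worth, a quick back-of-the-envelope check on that scaling, using the rough numbers from this thread rather than any measurements:

# Rough scaling check using the approximate numbers quoted above.
stock_ghz, oc_ghz = 2.4, 3.33
clock_gain = oc_ghz / stock_ghz - 1   # ~39% higher clock

print(f"Clock increase: {clock_gain:.0%}")
# If fps scaled 1:1 with clock, a 40-45 fps average would gain
# roughly 16-17 fps from the OC; only seeing 10-15 fps suggests
# the game is partly GPU-bound even on this CPU.
expected = [round(fps * clock_gain) for fps in (40, 45)]
print(f"Expected gain if fully CPU-bound: {expected} fps")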
 
Solution
