Can Water push Yorkfield to 5 GHz?

virtualban

Distinguished
Feb 16, 2007
1,232
0
19,280
0
Multitasking, anyone? What happened to multitask testing? How does a given system compare to another if certain applications might be running at full load in the background? Let's say DivX and XviD don't scale that well across multiple threads. So give them a single thread to work with, and play a game at the same time. All those measurements of power efficiency, and yet no efficient usage of time ever really gets benchmarked. Isn't software always improving on user interaction, especially on the "without user interaction" part, like a batch job? So give the 4 systems 4 consecutive video conversions, or 4 parallel ones. Apart from the first being harder to do, software advancements or not, I believe the second option works best. Still, seeing it benchmarked would help. Same for cache usage and cache pollution, bus bottlenecks, and of course the HDD, but many people do have different HDDs for different usages, so a game plus a WinRAR compression should only be bottlenecked by cache and bus.
 

prolfe

Distinguished
Jan 9, 2005
251
0
18,780
0
This was a great article showing what OC'ing we can expect from the high end of the Yorkfield series (at least the current stepping). In fact, I have been pretty much convinced to go with a Wolfdale in my system, which is used 99% for gaming. I have been wondering how some other, more current games might respond to the quad-core. Are there any plans to bench Crysis, SupCom, BioShock, etc. on the quad-core?
 

Kamrooz

Distinguished
Feb 8, 2007
1,002
0
19,280
0
The article is done and over with; don't expect any updates.

The multiplier will be holding back the midrange Yorkfield quads. If they did indeed have a 9x multiplier, we would probably be able to get near 5 GHz, probably around 4.5-5... but there's that 8x multiplier. Considering I want Penryn for power reasons as well as SSE4, it's not deterring me. If I can at least hit 8x450 I'll be happy. 3.6 GHz will be fine for me. If I can go farther, even better, but 3.6 GHz will be my happy mark.
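The clock math behind the post above is just multiplier times FSB clock; a minimal sketch (function name is mine, figures taken from the post):

```python
def core_clock_mhz(multiplier: float, fsb_mhz: float) -> float:
    """Core clock is simply the CPU multiplier times the FSB clock."""
    return multiplier * fsb_mhz

# The poster's 8x450 target works out to 3.6 GHz:
print(core_clock_mhz(8, 450))   # 3600.0 MHz
# With a 9x multiplier, the same near-5 GHz range needs a far lower FSB:
print(core_clock_mhz(9, 500))   # 4500.0 MHz
```

This is why the locked 8x multiplier matters: reaching 5 GHz on an 8x part would demand a 625 MHz FSB, which is a much taller order for the motherboard than raising the multiplier.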
 

MrCommunistGen

Distinguished
Jun 7, 2005
1,042
0
19,310
8
Why did they get rid of the h264 encoding test when it multithreads better than both XviD and DivX? Multitasking should also be a given these days. Power consumption charts would be nice.

-mcg
 

gamebro

Distinguished
Mar 10, 2007
239
0
18,680
0
OK, this is getting very annoying!! You bench a next-gen chip with way last-gen games!

Call of Duty 2??!?!?!
QUAKE 4!?!!
F.E.A.R. !?


What part of CRYSIS and USES 4 CORES do you fellows at Tom's not understand? Are you biased against the newer games?

I have a suggestion for your next benchmarks, on say the 9800GTX cards from NVIDIA: just bench them on games like Quake 2 and Unreal 1, or maybe some even older, and now totally irrelevant, titles! The games you selected are a bad match for this new technology!

 

shadowmaster625

Distinguished
Mar 27, 2007
352
0
18,780
0
Conclusion: these systems are so ridiculously, outrageously expensive that there is no point in even looking at them. DDR3? Why? Because you like to waste money and help push Intel one step closer to the edge of that cliff where they pull the whole market into DDR3 and everybody gets screwed... again? Why don't you guys ever learn? Screw DDR3. STOP USING IT. STOP REVIEWING IT. (Or admit that you are paid Intel hacks!) Pretend it doesn't exist. My 3GHz Q6600, motherboard, and my 2 gigs of cheap DDR2 cost less than 4 gigs of DDR3 currently cost. lol. And it performs on par with all these systems reviewed here. Go look at Newegg. No one is buying DDR3.

Damn, I hate to sound conspiratorial, but when you morons keep pushing this crap on us, and no one is buying it, and no one is using it, and yet you keep pushing it... what other conclusion can one draw than: Tom's Hardware is bought and paid for by Intel?
 

Crashman

Polypheme
Former Staff


Tom's Hardware is updating its benchmark set one lab at a time. Expect to see everyone on the "same page" with newer benchmarks in the not-too-distant future.
 

yadge

Distinguished
Mar 26, 2007
443
0
18,790
2


First of all, DDR3 isn't Intel's. I don't think Intel would really benefit from everyone moving to DDR3.

And that's not the point of using DDR3. Of course it's overpriced. Of course no one is buying it right now. But that doesn't matter. When doing a scientific experiment, all things that may affect results must be controlled. You only change the one variable that you are testing. The DDR3 makes sure that there will be no problem with the RAM not being fast enough when trying to overclock. This ensures that the results will not be skewed.

Tom's Hardware is not necessarily endorsing DDR3. They are not "pushing this crap on us". They are just doing their job.
 

cb62fcni

Distinguished
Jul 15, 2006
921
0
18,980
0


Wow. Seems someone can't afford DDR3. It's called progress, get used to it. Go rant elsewhere.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780
0

Same here. It was a decent article, but the title should definitely be changed.
 

babybudha

Distinguished
Jul 17, 2006
257
0
18,780
0
Yeah, Intel has nothing to do with DDR3. The only reason they use it is that, for extreme overclocks with a high FSB, you need fast RAM to keep up with the FSB. I mean, you could probably run DDR2 with an extreme overclock, but it would be way async, and slow memory would possibly bottleneck performance (though I don't think it would with Intel's architecture).
 

Crashman

Polypheme
Former Staff



Sync for a 600MHz FSB clock is DDR2-1200; Ballistix DDR2-800 will run that at CAS 5. All the memory in this article was run at 1.5x the FSB clock, which has become a common setting for slower FSBs and has been extended upward by using DDR3.
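The ratio arithmetic in the post above can be sketched quickly (a DDR module's "effective" rating is twice its memory clock, and the memory clock is the FSB clock times the divider ratio; function name is mine):

```python
def ddr_rating(fsb_mhz: float, ratio: float) -> float:
    """Effective DDR rating: double the memory clock,
    where memory clock = FSB clock x memory:FSB ratio."""
    return 2 * fsb_mhz * ratio

# 1:1 sync with a 600 MHz FSB clock needs DDR2-1200:
print(ddr_rating(600, 1.0))  # 1200.0
# The 1.5x setting used in the article would need DDR3-1800 at that FSB:
print(ddr_rating(600, 1.5))  # 1800.0
```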
 

miahallen

Distinguished
Oct 2, 2002
572
0
18,990
1
A buddy of mine bought a QX9650 a couple of weeks ago; he's stable at 4.5GHz with an Ultra 120 Extreme. Based on the power consumption comparisons I've seen between these two chips, I think there is something wrong with the QX9770.
 

MrCommunistGen

Distinguished
Jun 7, 2005
1,042
0
19,310
8

Glad to hear it. I hope h264 is brought back, because I think it's more relevant in general, and it's certainly more relevant for me since it's the only video codec I use. Seeing how processors scale with h264 is one of my primary concerns for a CPU upgrade (whenever that decides to happen :??:)

-mcg
 
