Is This Even Fair? Budget Ivy Bridge Takes On Core 2 Duo And Quad



No, that is not what we were talking about, although I agree with the statement. We never discussed whether you need to OC to play games or not. Dude, are you on something?
 
[citation][nom]flong777[/nom]Here's the conclusion to that article: http://www.tomshardware.com/review [...] 41-14.html It shows that the overclocked 3570K had an average of 20% better performance than the overclocked 3350P. That is a big performance gain for $40 and I would take that choice in real life every time. A 20% increase in computer performance is the difference between Sandy Bridge and Haswell for my CPU (the 2600K). It is a huge difference. It is $40 well spent IMHO.[/citation]
Yes.... in overall performance, which was the goal.

But it seems you were talking games, in which case, more money would first go into graphics. The $600 PC bumped to Tahiti LE would game just as well as the $800 PC. Next you'd want 8GB RAM, not the i5-3570K. Eventually, once supporting hardware is in check, the 3570K is awesome, and could easily extend the life of the gaming platform through multiple GPU upgrades. I can guarantee you Don (the $800 builder) would quickly shed the 3570K in a pure gaming system facing budgetary constraints. It would be the first thing to go if he was lowering the budget. The 3570K was picked to gain an advantage in applications (60% weighting) once overclocked.

In short, the 3570K isn't the best bang-for-buck gaming CPU; rather, it's the best bang-for-buck gaming CPU over $200. It's the goal for enthusiasts/gamers to shoot for if you can still afford the desired supporting platform.

edit - BTW, I haven't read most of your discussions, so don't read this as speaking against you or your other points.

 


I haven't made any statements about my personal situation or my build habits at all, thanks. I'm talking about value for money. When I recommend a build to someone on the forum here, for example, I have to evaluate their needs, their budget, and so on.

This isn't about me, and it isn't about you. This is about making sensible recommendations to system builders of all stripes and on all kinds of different budgets.



For the last time, I'm not accusing you of "putting down" people who use their computers for web surfing. I'm telling you that when you continuously disclaim that you're not insulting users on the most extreme low end of the spectrum in a discussion comparing two relatively powerful CPUs, you are (perhaps unintentionally) implying that only your CPU of choice is suitable for anyone else. Context.

You do not need an overclocked i5 to play games, and to play them very well. Period. All of your rhetoric about people with $400 computers, mowing lawns for a living, etc, is irrelevant.



Are you sure you want to go down this road? Really? Because I could say a lot of things about what I suspect is true of you, based on what you've said in this thread. Let's start with the least offensive: you seem like someone who only just built his first computer two years ago on the advice of his scantily informed and hype-prone l33t gamer buddies.

This is why ad hominems are bad, by the way. They don't lead anywhere constructive or even interesting.
 


OMG do you ever stop talking?

Someone else wrote a very thoughtful post in this thread saying that not everyone can afford more expensive CPU coolers and PSUs. In my posts I am referring to that post, which I agree with. Some people don't have a lot of money for their build.

For the last time, saying the 3570K is the best GAMING CPU for the buck IF you are going to OC does not imply anything about anyone who chooses a different CPU. You are just being irrational. If I wanted to waste more time on this useless discussion, I could list at least three professional CPU reviews that have said the same thing, and those reviewers aren't putting anyone down.

And again with the ad hominem claim; I'm not sure you understand what an ad hominem attack is, because I have patiently tried to answer your concerns over and over and over again.

There is no value in continuing this discussion over a $40 issue; it has been a HUGE waste of everyone's time.
 


I'd say you've pretty much nailed the whole discussion. Flong has argued from the beginning that it is flatly unjustifiable not to spend the extra money on a 3570K if you can possibly afford to do so, which of course is true if all else is equal, but that extra money can always be spent elsewhere, and in many cases elsewhere is the correct choice.
 


Have you talked less than I have in this thread? Physician, heal thyself.



The disclaimer about overclocking is new; it was not part of your original argument. Your original argument was that anyone who's not planning to buy an unlocked i5 for their new system is foolish not to save up the extra money to spring for the 3570k. You then disclaimed that you were not discussing people on the extremest of extreme low-end budgets. (~$400 total)

All of which leaves the reader to conclude that you believe there is literally no good justification for buying a locked-multiplier i5. "It's only $40 extra for the 3570k!!! Go collect cans for the extra money if you have to! It's worth it!"

You've wriggled a lot since then, but that's the essence of your position. And it's wrong. The 3570k is the best bang-for-buck CPU given a ~$250 CPU budget. Everyone's agreed with that from the first moment; your pointing to that uncontroversial statement now, as if it was all you ever intended to argue, seems disingenuous.

Given a lower CPU budget, there are cheaper CPUs on the market that offer equivalent or superior value, or if you prefer, cost-efficiency. One wouldn't think that that's a controversial statement, and yet you've been quibbling with it vociferously for the last several pages, all the while complaining that you were wasting your time.
 


Or if you're 4+ hours away from one, get your brother-in-law to buy it for you. Got my i3 for $89 last summer. Getting him to pick up the i5-3570K for me at Xmas.

And enough of this vociferous quibbling!!! :lol:
 


Luckily, there's one on the way home from work. The price on their i7's ($230!) is why I'm going 4770K over 4670K next month.
 
For myself and friends, I've built Q9650's (E0 stepping) OC'd @ 4.2, and Q6600's (G0 stepping) OC'd @ 3.6. One of each system continues to run perfectly within Vcore and temperature specs. Both have had SSD and GPU upgrades.

Since these were popular overclocks back in their day, I would like to have seen such a comparison. Nevertheless, there was enough data in the article to interpolate how those overclocks would perform against the 3570K.

I think my friend's Q9650 @ 4.2 is still a very effective gaming platform.
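
For anyone who wants to repeat that interpolation, the crude back-of-the-envelope version (my own assumption, not a method from the article) is to scale the stock benchmark score by the clock ratio. It overstates real gains because memory and FSB speed don't scale with the core clock, but it's close enough for a rough comparison:

[code]
#include <iostream>

// Very rough estimate: assume performance scales linearly with core clock.
// Real gains are lower because memory/FSB speed doesn't scale with the CPU.
double estimate_overclocked(double stock_score, double stock_ghz, double oc_ghz) {
    return stock_score * (oc_ghz / stock_ghz);
}

int main() {
    // Placeholder number purely for illustration -- substitute the article's
    // actual benchmark result for the CPU you care about.
    double q9650_stock = 100.0;  // hypothetical stock score at 3.0 GHz
    double q9650_oc    = estimate_overclocked(q9650_stock, 3.0, 4.2);

    std::cout << "Estimated Q9650 @ 4.2 GHz score: " << q9650_oc << '\n';
    return 0;
}
[/code]

Plug in the article's actual scores instead of the placeholder and compare the result against the 3570K's numbers.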
 


Absolutely.
 

If every retrospective-type article had to include every "popular" model ever made over the period of interest, there would be no end to it.

When the main objective is to show how much (or little) some things have changed between two given snapshot periods, I think two or three representative samples from each period are more than sufficient to get the general idea.
 
I think I may have asked the wrong question last time considering the high level of tech proficiency in this forum currently.

Is a Q6600 G0 running at 3.6 GHz (1600 FSB) with 8 GB of DDR3-1600 CL9 on an EVGA 790i Ultra SLI motherboard worth upgrading the graphics on, coming from 2x GTX 260s in SLI?

If so, at what level of graphics card would I start seeing bottlenecks from the CPU/motherboard?
 

I responded to your question a few posts above; you can PM me if you want some other recommendations so the thread does not needlessly fill up.
 
With these lackluster improvements over the last few years, it's no wonder PC sales are soft. It's been 5 years since the mid-range E8400 was released, and today's mid-range CPU is about two times faster for most uses. It used to be that if you waited five years for a new computer, the upgrade would be monstrous.

Just think of the differences between GPU's or SSD drives from 5 years ago to those of today.
 




You are right. I built mine for $363, but I'm happy with it, so I'll save up and buy a Haswell/Broadwell K model in the future. Or I could upgrade my i5-750 system. But I'll see if I need an upgrade, because even in 2-3 years I may still be running Windows 7 and MS Office 2007 and playing only TF2. And my Lynnfield from 4 years ago is still not showing its age.
 


I'll repeat it again - you're a complete waste of time for everyone who reads this thread. First you argued non-stop with someone who disagreed with you, and now you can't shut your mouth.

This is a $40 issue and you can't shut up. And what is sadder is that Pauld is backing up your stupidity.

Frankly, my original opinion still stands. Under no circumstances would I recommend the 3350P over the 3570K for a gamer. EVEN IF they don't plan to overclock right now, the extra $40 buys the flexibility to upgrade your computer's performance by an average of 20% (and that is an average; individual gains will be greater in many cases), which is a HUGE advantage over the 3350P, and it doesn't cost a dime more than $40. People who game grow, and their power needs grow and change - they may want to overclock later as more and more CPU-intensive games come out.

If you and Pauld disagree, fine, that's your right. But don't try to shove your choice off on me or the many others on this thread who would spend a whole $40 for a 20% performance improvement!

A 20% increase is the difference in CPU performance between Sandy Bridge and Haswell. So spend $40 now and you jump two generations in average CPU performance. That IS a huge advantage of the 3570K and it is a no-brainer for any serious gamer.

If a builder can't afford a more expensive CPU and has a $400 budget, then they should be looking at a $120 CPU anyway, not the 3350P.

If this were even a $100 discussion it might be worth having, but it's not, and this discussion is of no value to the community.
 
The fact is (I'll repeat this again), applications are becoming multithreaded, and quad-cores will probably become obsolete in the next 3-5 years, just like dual-cores such as the E8400 have become obsolete today. However, if you happen to have a Cooler Master Hyper 212 or NH-D14 sitting on top of your 3570K four years later, go ahead and overclock the hell out of it, and feel good about yourself for spending the extra $40. I would save up for a 3770K if it is not urgent. I would prefer not to overclock, so I would have no problem buying a locked-multiplier processor. Not to mention that the 3350P was featured in Best Gaming CPUs for the Money here on Tom's Hardware.

Apart from people being rude and having flamewars, the discussion is of value, because the 3350P simply has a locked multiplier and a slightly lower clock. I am NOT attempting to change anyone's opinion on this (no bashing/flaming please), but we do not know for sure whether the performance of an overclocked quad-core today can match the performance of a processor with 8, 10, 16 or more cores 3-5 years from now, in applications which by then will take advantage of more and more threads.

Also, flong777 said that if a builder has a $400 budget then they should not consider the 3570K, and he/she is right. Try to cram a $200 processor plus a cooler and an overclocking motherboard (otherwise there is no point in unlocked) into a $400 budget. I would, however, consider the 3350P if it is to be used with an H77/B75 motherboard and a low/midrange graphics card such as a 7850 or a GTX 660.

I'm sorry, that's only my opinion and yours may differ. I'm honestly not forcing anyone to change their opinion, and this discussion would be of value if the flamewars stopped. Otherwise I am wasting my time debating this.
 
Yes, the 3570K is the new E8400. But like the E8400, it too will become obsolete as programs begin to use more and more threads. Now, about the article: I would have liked FX/Trinity vs. AM2 Phenom/Athlon 64 and Lynnfield/Clarkdale vs. Ivy Bridge Core i3/i5. I really would like to see how my i5-750 stacks up against an i3-3220. Also helpful would be Intel's Graphics Media Decelerator series/1st-gen HD Graphics vs. Ivy Bridge HD Graphics. Not that it would matter much to gamers anyway. I also discovered that my HD 2600 XT (installed in the Lynnfield rig) was on the same tier as Intel's HD 4000.
 


Let me stop you there, and I'm not flaming..... this multithreading you speak of, which was all the talk 5 years ago, still hasn't happened to the extent it should have, and all our old Core 2 Quads are doing just fine because of it. In another 5 years maybe all cores/threads will be used, but currently there is very little of this happening; the scheduling in apps and Windows hasn't advanced even the 35% that processing power has on quad-core CPUs.

I'm sitting on my Core 2 Quad machine until I can find another 570 for SLI; the games don't need more threads, they want more single-core performance.... for now.

I like the idea of multithreaded apps, but it hasn't happened the way it was sold to us Core 2 Quad users 4 or 5 years ago, which of course is not Intel's fault. The programmers frankly didn't/don't usually require more processing power than what was already available, and I suppose the apps that came out were not really designed for anything more than 2 threads.
 

Windows is perfectly capable of scheduling multiple threads as shown by benchmarks of the i7-39xx running heavily threaded workloads.

But there is no point in worrying about Windows' scheduling capabilities when apps and games are, for the most part, still fundamentally single-threaded, so Windows has nothing to schedule on other cores most of the time, regardless of how good or bad it may be at it.

Since Windows does get used on multi-socket workstations and servers, it has to be at least somewhat good at scheduling; otherwise it wouldn't be worth spending thousands of dollars on dual/quad-socket Xeon CPUs for heavily threaded workloads.
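
For anyone wondering what "heavily threaded" actually looks like from the application's side, here's a minimal sketch (my own illustration, nothing from the article) that just spawns one CPU-bound worker per logical processor. With code like this the kernel scheduler has a runnable thread for every core, which is the situation those i7-39xx benchmarks exercise:

[code]
#include <atomic>
#include <iostream>
#include <thread>
#include <vector>

// Spin up one worker per hardware thread so the OS scheduler
// has something to place on every core.
int main() {
    unsigned n = std::thread::hardware_concurrency();  // logical CPUs reported by the OS
    if (n == 0) n = 4;                                  // fallback if the value is unknown

    std::atomic<unsigned long long> total{0};
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < n; ++i) {
        workers.emplace_back([&total] {
            unsigned long long local = 0;
            for (unsigned long long k = 0; k < 100000000ULL; ++k)
                local += k % 7;                         // trivial CPU-bound busywork
            total += local;
        });
    }
    for (auto& t : workers) t.join();

    std::cout << n << " threads finished, checksum " << total << '\n';
    return 0;
}
[/code]

Run it on a quad-core and Task Manager will show every core loaded; the application only supplies the threads, and the actual placement is the kernel's job.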
 


If what you are saying were COMPLETELY true, BULLDOZER would have crushed Intel's hyperthreaded options in the same price range; AMD effectively made hardware hyperthreading..... which I think is pretty awesome. But for some reason the amount of threading that Windows can do is not what you think: those multi-core systems you speak of show good results not because Windows schedules and handles multiple cores well, but merely adequately; the saving grace is the pure brute-force processing power provided by those workstations.

From what I've seen, anyway...... trust me, I don't take this too seriously; it is simply a matter of opinion from what I've seen and experienced and should in no way be considered fact.
BUT at the same time:
Do you remember MS promising better threading in Win 8 vs. 7 after BD really pooped the bed? .... That never really happened at all, and you must be able to at least agree with that.....
 

If by 'they' you mean applications, you seem to have a weird understanding of how scheduling works. Applications run in user-space and, at best, they can set thread properties to give the OS scheduler some hints about how they should be scheduled, but even that requires the application to be aware of the underlying CPU's specifics. The actual scheduling decisions, however, occur in kernel-space, and applications have very little control over that.

Part of the reason why so much multi-threaded code behaves poorly on AMD's modular CPUs could simply be that a lot of it is compiled with an Intel-optimized toolchain and libraries. No amount of messing with the OS's scheduler or trying to make the OS itself more heavily threaded can fix that.
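
To make that user-space vs. kernel-space split concrete, here's a rough Windows sketch (my own example, not anything from the post above). SetThreadPriority and SetThreadAffinityMask are exactly the kind of "hints" an application can give; everything beyond them is still decided by the kernel scheduler:

[code]
#include <windows.h>
#include <iostream>

int main() {
    HANDLE thread = GetCurrentThread();

    // Hint 1: ask for a higher scheduling priority for this thread.
    if (!SetThreadPriority(thread, THREAD_PRIORITY_ABOVE_NORMAL))
        std::cerr << "SetThreadPriority failed: " << GetLastError() << '\n';

    // Hint 2: restrict the thread to logical processors 0 and 1.
    // (Mask bits correspond to logical CPUs; this assumes at least two exist.)
    DWORD_PTR oldMask = SetThreadAffinityMask(thread, 0x3);
    if (oldMask == 0)
        std::cerr << "SetThreadAffinityMask failed: " << GetLastError() << '\n';

    // When the thread actually runs, and on which of the allowed cores,
    // is still decided entirely by the kernel scheduler.
    std::cout << "Hints applied; previous affinity mask was " << oldMask << '\n';
    return 0;
}
[/code]

Note that the hard-coded affinity mask is purely illustrative; real code would query the CPU topology first instead of assuming two logical processors.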
 


By they I mean MS
 
 