Is This Even Fair? Budget Ivy Bridge Takes On Core 2 Duo And Quad

Page 10

pauldh

Illustrious
[citation][nom]flong777[/nom]And what is more sad is Pauld is backing up your stupidity.[/citation]
Listen flong, I think I made it pretty clear I wasn’t speaking against you or any other points you may have made. Considering I ignored the 2+ pages of arguments between the two of you, how could I choose a side? I guess maybe you are saying... my words, and Tom’s data, back up the opposite viewpoint of yours?

I only jumped in to represent Tom’s Hardware, because adjacent to one of my replies, you twisted Tom’s words to mean the i5-3570K is THE BEST bang-for-the-buck gaming CPU, period (pretty much implying that nothing below it is as noteworthy, or would cut it in games for that matter). If that were the case, that monthly piece would be one page long. ;) Rather, the i5-3570K earns a place as one of the best, along with the i5-3350P, i3-3220, and Phenom II X4 965 BE. This is my story, and I work for Tom’s Hardware, so I’m not going to allow that type of misinformation in these comments without correction.

Had you said… best bang-for-buck enthusiast-class chip, or best bang-for-buck gaming CPU above $200, that’s fine. We agree with you. And I have not seen that argued against by any reader here. But THE (singular) VERY BEST bang-for-buck gaming CPU is simply false, and our data would not match that statement at all. You need to look specifically at the most interesting settings, not meaningless low resolutions that taint overall averages.

For the money, I’d argue the i3-3220 and i5-3350P would both deliver more bang for the buck in overall gaming. And your massive OC advantage would come at the added cost of an enthusiast chipset and an aftermarket cooler. Would I spend $40 more? Yes, probably, but not for every one of my machines. Would I suggest others who do not overclock do the same? Probably not, but it depends. Would I fault anyone for not buying a K-series because they can’t afford it or it robs from other supporting components? Most definitely not! When what is best for you and me gets applied as a blanket statement for everyone, well, to be frank… imo that is where true stupidity comes into play.
 

Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065


Buddy, it takes two to tango. You could very easily have stopped replying to me if you thought I was wasting everyone's time, but you didn't. So although I will admit that this exchange hasn't been my finest moment, it is hilariously disingenuous of you to act as if you've behaved any better. If you stopped talking about personalities and behavior and focused on the matter at hand, you'd get a more polite reception. That has been the problem from the beginning.

Now, to summarize the thread for the last time:

1. It's more than a $40 difference assuming standard retail prices, because one must buy an aftermarket cooler to take advantage of the 3570K's main strength relative to the 3350P. It might be $40 or less if you have access to a short-term or limited-area sale, but we cannot make general recommendations based on those things. That's why Tom's uses standard Newegg pricing in all of its recommendations.

2. Let us be abundantly clear: it is you who have tried from the first moment to foist your opinion on others. Consider your original position, which is that it is flat out unjustifiable not to spend the extra money on an overclockable i5 if you can possibly afford it. That is not an inclusive or tolerant position to argue, because by stating it you imply that those who disagree with you are foolish.

Toss in your various statements about how budget builders should "collect cans for a month to buy a 3570k" or "mow a few extra lawns" and it's clear that you aren't an open-tent kinda guy on this issue. From the beginning, you have polluted what should be a fairly straightforward discussion about value and performance with baseless assumptions about people's lifestyles. That sort of talk, whether you realize it or not, encourages people to oppose you, and to oppose you more enthusiastically than they might otherwise.

3. By the same token, by continuing to shout that this is only a $40 issue, and that therefore anyone talking about it is stupid, you are not "respecting others' opinions" as you so piously claim. Quite the opposite.

The irony is thick. And as for whether the 3570k proffers a 20% performance advantage? No one has disputed that. The question is whether that 20% performance advantage costs less than 20% more money, or whether that extra money might yield better performance advantages for the user's intended purpose if it's spent on a different component. Value per dollar -- it's not a difficult concept.
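
To put rough numbers on that (a back-of-the-envelope sketch; the prices and the 20% uplift below are illustrative assumptions, not figures from the article or from Newegg):

[code]
# Back-of-the-envelope perf-per-dollar comparison.
# All prices and the 20% overclock uplift are illustrative placeholders.

def perf_per_dollar(relative_perf, total_cost):
    """Relative performance delivered per dollar spent."""
    return relative_perf / total_cost

base_cpu = 180.0      # hypothetical locked i5 (3350P-class) price
k_cpu = 220.0         # hypothetical unlocked i5 (3570K-class) price
cooler = 30.0         # aftermarket cooler needed to exploit the overclock
oc_uplift = 1.20      # assumed ~20% average gain from overclocking

locked_value = perf_per_dollar(1.00, base_cpu)
oc_value = perf_per_dollar(oc_uplift, k_cpu + cooler)

print(f"locked i5:       {locked_value:.4f} perf/$")
print(f"overclocked K:   {oc_value:.4f} perf/$")
print(f"extra cost:      {(k_cpu + cooler) / base_cpu - 1:.0%}")
print(f"extra perf:      {oc_uplift - 1:.0%}")
[/code]

With those placeholder numbers, the K-series route costs roughly 39% more for roughly 20% more performance, so it loses on pure perf-per-dollar; whether the trade is still worth it depends on whether the extra money would otherwise have gone into a better GPU or simply gone unspent.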
 

Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065


Unsurprisingly, you say it better than I do. I apologize for the contentious ramble fest. Heh, or I apologize for my half of it, anyway.
 

jesot

Distinguished
Dec 19, 2008
260
0
18,790
I don't know what you guys are talking about. I'm too busy fapping to the gains I'm going to see going from an E8400 and GTX 260 to a 4770K and a GTX 770.
 

flong777

Honorable
Mar 7, 2013
185
0
10,690


Fair enough, thanks for your post. With email-type posts there is no visual communication, and sometimes it is easy to be misunderstood by all parties. Experts tell us that communication is 80% visual and aural. That's what makes trying to post opinions on TH so difficult. You may say something with a good heart, but it may come across wrong to the reader because of the lack of this most important (visual) communication.

Actually, posting on Tom's has helped me in the business I run. I am MUCH more sensitive now to the tone of my email communication with clients, VIPs, and subcontractors.

Your post is accurate and it is a good point that the 3350P is roughly equal to the 3570K if you do not overclock. You also took the time to post the TH article. All of this was high quality. I apologize for misunderstanding your posts.
 

flong777

Honorable
Mar 7, 2013
185
0
10,690


You're right, I am wasting everyone's time by responding to you - it was a HUGE mistake and I apologize to the community. I intend to remedy this mistake with this post, because you are not worth my time and I will not abuse the TH community's time.

Everything you say, sans the insults, is a dull repetition of what you have already said. You are repeating yourself like a mindless parrot, and THAT is wasting everyone's time as we suffer through it.

It's still a $40 issue and you are still wasting everyone's time. I have already addressed your cooler comment (for the 3350P), so to make it again is, hmmm, what can I say, a waste of time. I will let the community make up their own minds whether a $40 issue is going to make or break just about any build except very low-budget builds.

I will also let the community judge whether $40 is worth a 20% AVERAGE improvement on a CPU, which is roughly equivalent to the performance difference between Sandy Bridge and Haswell. So for $40 you jump two generations of CPU evolution in performance if you overclock.

It is my guess that most serious gamers would jump at the chance for this very cheap performance gain. Certainly professional reviewers on several websites strongly recommend it; the 3570K is almost unanimously celebrated by those sites as the best gaming CPU for the money.

Now you are going to say that some of the above comments are repeated, and yeah, you're right. I felt it important to address your dull, repeated comments one last time for the record, because you obviously don't get it. That unfortunately makes me repeat myself. Setting the record straight is for the community, not for you.

 

flong777

Honorable
Mar 7, 2013
185
0
10,690


Matsushima, you already won your point with me. I agree with EVERYTHING you say except the choice of the 3350P for a $400 build (won't work with that budget). Can you see me? I am crying UNCLE, UNCLE, you win, you win :). Actually, I complimented your post above because it was very thoughtful.

You brought up and made clear a very important point that some people who want to build a computer are really struggling. They may only have a $400 - $500 budget as you did with your computer ($363 + video card). I totally agree with your post that everyone needs to be sensitive with small budget builds because these builders are cash-strapped and simply cannot afford more expensive components. That someone is on a limited budget is NOT something to find fault with.

BTW, I worked on a VERY slow Pentium computer for years while I built my business because I could not afford a better computer. I can only describe it as painful. This computer (I still have it) takes up to 30 seconds to load a web page, to give you an idea of what slow is. I say this to let you know that I DO feel the pain of people who are trying to build a computer on a limited budget.

Thanks for reminding everyone that any one of us may not have a big budget to build a computer at different times in our lives.
 

Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065


The only thing worth recording here is that you wagged your tail and conceded the debate when Paul, a person you perceive to have authority, restated my position almost word-for-word -- but then in the very next post you turned back into the uncompromising internet tough guy with me.

What does that say about you? You're right; the community can decide.
 

InvalidError

Titan
Moderator

Sometimes it is not even about the budget but needs.

If I really wanted to, I could have set money aside to build a $5,000 dual-socket Xeon workstation. But nothing I do actually requires more than an i3-32xx to run well enough for my taste, so I settled for an i5-3470 to cover my foreseeable needs. By the time I outgrow it for whatever reason (likely due to maxing out RAM), the i5-6570 will likely be out and I will be far more interested in a platform refresh than in "regretting" not getting a 3570K.

Another reason I chose a clock-locked CPU on H77 (aside from it being cheaper) was specifically to make sure I cannot be tempted to overclock just for the heck of it: I have had only bad luck even with mild overclocks in the past, and those usually left me hunting down silent data corruption for several days or just wiping the OS and spending days re-installing everything. Not worth the trouble for extra performance I did not even need.

I'll leave overclocking to people who have more time to waste when it (silently) goes wrong than I do.
 

Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065


Well said.

All of us who've been doing this for more than a few years have butted our heads against unavoidable and significant platform changes; for me, the most notable were probably the death of Intel's RDRAM push and the swap from PCI to PCIe -- and those are perhaps extreme examples, but similar things could happen in the future, and there's no way to say with any certainty whether any amount of extra CPU muscle you buy today will ensure that you won't want to build a new rig at some arbitrary point in the future.

To expand on your excellent example: A 3570k rig with the standard 8GB of DDR3 RAM might look pretty pallid in a few years if new machines are running 16GB of DDR4. And you might not be able to just add a couple of DIMMs to offset your disadvantage, because the price for your now-defunct DDR3 may shoot through the roof due to a decrease in supply. Even if the CPU is still competitive, the platform might not be.

To argue that a 3570k is definitively more "future proof" than every other CPU is to make an unsupportable assertion. It's unsupportable whether the user in question wishes to overclock or not; you are correct when you say that overclocking is not for everyone.
 

hunterswoodfarm

Honorable
May 10, 2013
1
0
10,510
Thanks for this, it sums up what I thought. Upgrading my five-year-old Core 2 Q9650 system would be a waste of time and money now or any time in the near future. No wonder PC sales are collapsing (and it's not just Windows 8). I have the money, I want to spend it, but there is nothing worth buying.

Hmmm, maybe I'll get one of those Samsung Galaxy S4s instead; I'm sure ARM will appreciate the licence fee it gets for an octa-core Exynos 5 SoC.
 

InvalidError

Titan
Moderator

Like me when I decided to ditch my C2D E8400: I was still fairly happy CPU-wise but needed more RAM, and spending $330 to go from 8GB to 16GB of DDR2 on a motherboard that might not even support 4x4GB DIMMs did not make much sense vs $380 for an i5-3470 + H77 + 16GB DDR3, plus the possibility of upgrading to 32GB later - which I most likely will, since 16GB turned out to be a tighter fit than I originally expected.

My CPU needs may be modest but I tend to have a dozen memory hogs open all the time so my memory requirements seem to know no bounds.
 

Matsushima

Honorable
Mar 6, 2013
344
0
10,810


I never said the Core 2 Quad was obsolete, or anything like that. I have my dual-core Athlon II X2 270 and it can play my games just fine.
 

rbagany

Distinguished
Sep 27, 2011
21
0
18,510
The good ol' Core 2 Quads still give decent performance, but I miss a lot of new features from the mobos: SATA 3, USB 3.0, and slots for cheap DDR3 modules - well, some boards support that. Also, DDR2 prices have been a lot higher, and those old boards usually support less memory - my old Asus can take only 8GB. The other el cheapo AMD board supports 32GB.
 

775 boards with DDR3 are readily available. Many people don't use SATA3/USB3 yet.
 

Matsushima

Honorable
Mar 6, 2013
344
0
10,810
By the way, I would like to see how a heavily overclocked E8400 performs.

Before this, I had been using a K8 laptop that would not go above 800MHz (except when there weren't any programs running) and got so hot that it left a mark on my table. The Lynnfield has a bad motherboard and takes 5-10 minutes to boot. So I had to do something urgent and had about $400 to spend at the time. I should/would have bought the 3570K otherwise, and would verily recommend it to anyone who can afford more than an H77 board and a 3350P.
 

Matsushima

Honorable
Mar 6, 2013
344
0
10,810

There are even 775 socket boards with USB3.
 

mlongbsa

Distinguished
Sep 23, 2009
2
0
18,510
Great article for those of us still using the old architecture and wanting to upgrade, but who can't justify it until our systems die.
 

InvalidError

Titan
Moderator

As geeks, most of us have a hard time resisting the temptation to upgrade, but in recent years the cold hard facts indicate this is becoming unnecessary for a growing number of us.

For many people, convenience is becoming far more important than raw power, and that is when the tablet-PC concept (something that actually started over a decade ago) will really take off. If that ball does not start rolling with Haswell (largely due to Intel still being stuck on inflated price points and Ultrabook marketing), I have no doubt it will with Broadwell next year.
 

qu4k3r

Distinguished
Mar 27, 2011
4
0
18,510
Very pleased to read this article. I'm also running a Q6600 @ 3.2GHz (G0) and it still handles everything I throw at it. Now I know more or less what to expect from the FX-6300 I've chosen to replace my C2Q. Can't wait to read the results of this crop of Intel CPUs against an AMD selection.
 

straight-six

Honorable
May 11, 2013
1
0
10,510
Thank you for a very good article. I've been researching Pentiums and Celerons and it's really hard to find any info or benchmarks. I will disagree with one point, however:

"In fact, most of the time, it's pretty difficult to get samples to review from companies like Intel and AMD, which don't want to see their lower-end hardware maligned."

I believe the true reason behind the scarce number of low-end CPUs submitted for review is that Intel and AMD don't want people to realize how powerful and cost-efficient they are. Intel makes its money on high-end CPUs, and if people start buying $35 Celerons because that's all they need for basic computing, Intel loses a chunk of money.

The i7 is great if you're doing video editing and the i5 is king for gaming, but for "normal" usage, which covers maybe 80-90% of users, you only really need a lowly Celeron or Pentium.
 

InvalidError

Titan
Moderator

Pentiums and Celerons are fairly decent for relatively light everyday workloads and occasional heavier tasks, but I think the medium/long-term best bang-per-buck on Intel's side, for normal people who want to keep their PCs for 4+ years, is the i3.

While few programs make much use of explicit threading to make the i3's HT shine vs Celeron/Pentium, application frameworks, libraries, APIs, drivers, etc. are progressively delegating more work to worker threads, and that delivers some threading benefits even to single-threaded software on CPUs that support more hardware threads in whatever form.

As this becomes more common, the i3's advantage vs plain dual-cores will grow while the gap vs i5 may actually shrink: enough background work to give HT a workout but not enough for i5 to pull away.
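
A toy sketch of that worker-thread pattern (the workload and names below are made up and nothing here is a benchmark; it only illustrates how "single-threaded" application code ends up with library-managed background threads that extra hardware threads can absorb):

[code]
# Sketch of a library delegating work to a worker thread.
# The caller's code looks single-threaded; the heavy lifting runs on a
# background thread that an HT-enabled dual-core (e.g. an i3) can schedule
# alongside the main thread. Workload and names here are made up.

import hashlib
from concurrent.futures import ThreadPoolExecutor

# Pretend this pool lives inside a framework/library the application uses.
_worker_pool = ThreadPoolExecutor(max_workers=2)

def library_call(data):
    """Looks like a plain function call, but submits the real work to a
    worker thread and returns a future immediately."""
    return _worker_pool.submit(lambda: hashlib.sha256(data).hexdigest())

def main():
    # "Single-threaded" application code: it never creates a thread itself.
    pending = [library_call(bytes([i]) * 1_000_000) for i in range(8)]
    # The main thread stays free here (UI, game loop, whatever).
    results = [f.result() for f in pending]
    print(f"hashed {len(results)} buffers on background workers")

if __name__ == "__main__":
    main()
[/code]

None of that requires the application author to write explicitly threaded code; the threading lives in the library, which is why the benefit shows up even in software that looks single-threaded from the outside.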
 

RobJordanIsCool

Distinguished
Dec 4, 2010
5
0
18,510
As someone who just upgraded a few months ago from an E8400 to a 3570K, this was a great read. Nice to be able to clearly see exactly what my money got me by comparison!
 

flong777

Honorable
Mar 7, 2013
185
0
10,690


You need to re-read my reply to Paul. My response to him was based on my misinterpreting his motive, not any perception of authority. While I may not agree with everything Paul has said, he has made some very good points and his posts have been to the point and intelligent. Hmmm, I can't say that about some people's posts here.

I was wrong about what Paul had posted, and my response simply stated that. But now that you bring it up, people in authority generally should be respected - but in this case Paul has earned my respect, and it wasn't because of his title.
 