Socket 1366 obsolete, SMT a 'gimmick'

Page 2 - Tom's Hardware community forum thread.
Status
Not open for further replies.
Just a quick question: did anyone really think tri-channel memory would make a difference? When have you ever read a RAM comparison that didn't show any halfway decent RAM doing just fine in most cases? The problem is that dual-channel RAM is plenty even for a fast RAID 0 hard drive array. Maybe with some nice SSDs the extra memory bandwidth might become relevant.
 
Well, I've seen X58 board prices coming down some, and you can always put dual-channel RAM in an i7 (though triple-channel isn't much of a premium now). All that's missing is for you to a) overclock the CPU (don't need no stinking Turbo!), or b) wait for the next generation of i7 9xx with a better Turbo; then 1366 will be equal or better in all apps at approximately the same price. LGA 1366 is the top-of-the-line platform; it was never meant to be the best value. That's what the i5, AMD, etc. are for.
 
The cheapest X58 board is a mediocre $160 motherboard, and the i7 920 is still $290 online. That's already a $110-$130 difference, depending on whether you need Hyper-Threading or not. I'd say that's pretty significant, especially since LGA 1156 is still supposed to get 32nm quads, just not six-core parts. Not only that, but when those come out it will be time to upgrade motherboards regardless of whether you have an LGA 1156 or LGA 1366 board.
 
Well, I could have gotten the i7 920 for $200, yet I still got the i5, so I agree with you completely for my applications. All I meant is that calling a high-end, one-year-old socket obsolete is a little foolish.
 


Yeah, it's definitely not obsolete, or at least no more so than LGA 1156, but Intel has really made it difficult to justify spending the extra on LGA 1366.
 
Why does the Cinebench R10 benchmark say quad-threaded? Did they run with SMT disabled? This doesn't prove SMT to be a gimmick at all. I get higher Cinebench scores at 2.66GHz than they do at 2.93GHz (assuming the 975 is at stock). They mentioned proving it a gimmick at the end, but they must be talking about another article; this one has nothing to do with SMT.
 
You'd think they would link to the previous article, since it would increase hits.

EDIT: This might be what they meant: http://www.fudzilla.com/content/view/15504/40/1/7/

I wish Fuddo would work on their graphs and legends, they require some deciphering and that's not a good thing.
 
Oh, that's what it means?

I gave up on deciphering it after about 15 seconds - if someone isn't going to bother to make their graphs readable, I won't bother to read them.
 


SMT is not a gimmick, or AMD wouldn't be looking into its own version; they just aren't good at reviews. A lot of people here own Core i7s, and even a guy at work says they're just super fast. He said it took 10 minutes to install Windows 7 on his Core i7 with 12GB of DDR3. That's probably not the full install, but likely the file-copy portion of it.

As for tri channel, it's only a gimmick until programs start to use more than two channels, just like the first quad cores (the Q6600 was among the first) were a "gimmick" until Windows Vista/7 utilized them better and games started to. Back in 2007 people always said "get a dual and OC it; quad is useless." Now they say get a quad. Once the third channel starts to get used, it will make a big difference, but for now it's mainly server apps and highly memory-bandwidth-sensitive apps that benefit.

As for 1366, it's not dead or obsolete, except maybe to enthusiasts like us, or at least the majority of us on lower budgets. A lot of people will still pick it up for its feature-rich motherboards; that has always been the case. Hell, it's even set for six-core CPUs, so obsolete? No. For mainstream? It was never meant for that.
 
The Fudzilla conclusion should have come with a giant asterisk. LGA1366 is designed as a server socket and seats more server-oriented CPUs, while LGA1156 is a desktop socket that fits CPUs that are more oriented to desktop usage.

1. Triple-channel memory: server sockets generally maintain compatibility with newer CPUs much longer than desktop sockets do and servers tend to have more CPU cores in them than desktops. The triple-channel memory may be overkill today, but it wouldn't be for a future 8 or 12-core CPU. On the desktop side, Intel would just make a new socket to replace LGA1156 that supported two channels of DDR4 if they needed more bandwidth. Servers also tend to handle more memory I/O-intensive loads than desktops do, so the triple-channel memory would be better-utilized in servers than in desktops.
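The bandwidth difference is easy to put numbers on. Here is a back-of-the-envelope sketch in Python, assuming DDR3-1066 (the officially supported speed on the early Core i7s) and the standard 64-bit DIMM bus:

```python
def channel_bw_gbs(mt_per_s, bus_bytes=8):
    """Peak bandwidth of one memory channel: transfers/s x bus width (64-bit DIMM bus)."""
    return mt_per_s * bus_bytes / 1000  # GB/s

ddr3_1066 = channel_bw_gbs(1066)
print(f"per channel: {ddr3_1066:.1f} GB/s")       # ~8.5
print(f"dual:        {2 * ddr3_1066:.1f} GB/s")   # ~17.1
print(f"triple:      {3 * ddr3_1066:.1f} GB/s")   # ~25.6
```

So triple channel buys roughly 50% more peak bandwidth on paper; whether a desktop workload ever touches that headroom is exactly the question being argued here.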

2. HyperThreading: this will benefit servers that run heavily-threaded tasks more than your average game, although some desktop apps do take decent advantage of HyperThreading. HyperThreading provides somewhat of a performance boost, although it's not anywhere near as good as more actual CPU cores.

3. The thermal and Turbo Boost characteristics of the CPUs are also a bit different. The Bloomfields have a relatively limited amount of Turbo Boost under single-threaded loads, probably because they're designed as server CPUs meant to sit under a heavy, constant, multi-threaded load; having a single core greatly boost its speed wouldn't be all that useful in a server, for the most part. The Lynnfields have a lot of Turbo Boost headroom for single-threaded tasks, as quite a few desktop applications are single-threaded and Turbo Boost can give them a decent performance bump.
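The Turbo contrast can be sketched with the Nehalem-era clocking scheme: a 133 MHz base clock times a multiplier, with Turbo adding whole multiplier "bins". The bin counts below are the commonly quoted single-core values for the i7-920 and i7-860 and should be treated as approximate:

```python
BCLK_MHZ = 133  # Nehalem-era base clock

def clock_ghz(multiplier, turbo_bins=0):
    """Effective clock = base clock x (base multiplier + turbo bins)."""
    return BCLK_MHZ * (multiplier + turbo_bins) / 1000

# Bloomfield i7-920: 20x base, commonly quoted +2 bins with one core active
print(f"i7-920: {clock_ghz(20):.2f} -> {clock_ghz(20, 2):.2f} GHz")
# Lynnfield i7-860: 21x base, up to +5 bins with one core active
print(f"i7-860: {clock_ghz(21):.2f} -> {clock_ghz(21, 5):.2f} GHz")
```

The Lynnfield's larger single-core bin count is what gives it the big lightly-threaded boost described above.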

Somebody brought up AMD and SMT/HyperThreading. AMD's method of multi-threading in the Bulldozer is not SMT. CPU cores with SMT have one core's worth of execution resources (fetch unit, decoder, integer unit, FPU) but execute multiple threads on that one core's worth of execution resources. AMD's Bulldozer modules have two integer units, each handling its own exclusive thread; the fetch unit, decoder, and FPU are shared between the two integer units. This is not SMT, as there are some independent hardware execution resources for each specific thread, namely the integer units. The Bulldozer's multithreading approach looks a lot more like the interleaved multithreading approach the UltraSPARC T2 uses than anything using SMT. AMD calls their approach CMT, but this appears to be their own term and not an already-defined concept like SMT.
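To make the distinction concrete, here is a toy resource map in Python based on the breakdown above (the resource granularity is simplified and assumed from the description, not taken from vendor documentation):

```python
# Which execution resources a pair of threads shares in each scheme (simplified)
SMT_CORE   = {"fetch": "shared", "decode": "shared", "integer": "shared",     "fpu": "shared"}
CMT_MODULE = {"fetch": "shared", "decode": "shared", "integer": "per-thread", "fpu": "shared"}

# In SMT everything is contended; in a Bulldozer-style module the integer units are not
per_thread = [r for r, kind in CMT_MODULE.items() if kind == "per-thread"]
print("CMT per-thread resources:", per_thread)  # only the integer units
```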
 
I would. So you're saying Intel needed to design s1366 CPUs, a new memory architecture, and $300+ motherboards just so people could have 16x/16x? C'mon, that could have been done on s1156 simply with a different chipset, not a whole new socket and memory design. They purposely pushed s1366 on people as the "new" thing and then screwed them by coming out with something just as fast, with motherboards a third the price, but slightly inferior graphically. Since 95% of people don't utilize 16x/16x, 95% of them were misled and screwed over.

AMD does 16x/16x on the same socket with simply a different chipset.

LOL... there is no "chipset" dictating PCIe lanes on the 1156; it's integrated into the CPU, so you're stuck. I think the i5 buyers got taken worse than the i7 buyers, to be honest. Anyone who bought an i7 for gaming with a single card didn't do their homework, so no pity from me there. Intel didn't "hide" the i5 from anyone, nor claim that the i7 was the well-rounded mainstream product; they actually stated the opposite and pegged it at a very small piece of the market from the onset. JDJ showered this place with preliminary specs on the i5 and what it was meant to be, so even if a person didn't venture outside the TH forums for their tech fix, it was spoon-fed to them.
 
Consensus is: for gaming, 1366 for multi-card, i5 for a single card.
You're getting more value going i5 when single-card gaming is the main usage.
That's from owners/users.
My old thread was fair, and the i7 users truly haven't been/weren't upset.
Maybe a few, but nothing for Intel to worry about.
 


Problem: Not even dual channel gets utilized for desktop apps.
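A crude way to check how much bandwidth a simple workload actually moves is a bulk memory copy. This sketch (plain Python, using a bytearray slice copy, which runs at roughly memcpy speed) gives only a ballpark figure, nothing like a proper STREAM benchmark:

```python
import time

def copy_bandwidth_gbs(size_mb=256, reps=5):
    """Estimate memory copy bandwidth from the best time of several bulk bytearray copies."""
    src = bytearray(size_mb * 1024 * 1024)
    dst = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        dst[:] = src  # bulk copy, close to memcpy speed
        best = min(best, time.perf_counter() - t0)
    # one read stream + one write stream -> 2x the buffer size moved
    return 2 * size_mb / 1024 / best  # GB/s

print(f"~{copy_bandwidth_gbs():.1f} GB/s copy bandwidth")
```

Compare the result against the theoretical dual- or triple-channel peak: a big gap between the two is exactly the "unutilized bandwidth" being described.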
 
Yeah, the only real advantage 1366 has for gaming is the full x16 PCIe 2.0 lanes.

If you're a high end gamer with 2x5870s, running the two cards in PCIe2.0 x8 results in significant bottlenecks - 5970s are a definite no-no...a total waste of potential.
 
I would. So you're saying Intel needed to design s1366 CPUs, a new memory architecture, and $300+ motherboards just so people could have 16x/16x? C'mon, that could have been done on s1156 simply with a different chipset, not a whole new socket and memory design. They purposely pushed s1366 on people as the "new" thing and then screwed them by coming out with something just as fast, with motherboards a third the price, but slightly inferior graphically. Since 95% of people don't utilize 16x/16x, 95% of them were misled and screwed over.

AMD does 16x/16x on the same socket with simply a different chipset.

Skt 1366 and 1156 are the successors to Skt 771 and Skt 775. Intel did not force anything on the consumer, nor screw them over; they did the same thing that they (and AMD) have done before when introducing new tech. If you remember, Skt 775 still used a northbridge/southbridge configuration, whereas Skt 1366/1156 moved the memory controller on-die. The fact is, a new socket and chipset were required to achieve the performance gains that an IMC offers. And, by their very design, the i5 and P55 are a lesser-performing processor and chipset than the i7 and X58, with some examples being DMI instead of QPI, dual-channel instead of triple-channel memory, disabled Hyper-Threading on certain i5 models, and x8 limits on PCIe GPU lanes. So, really, the argument that i7 buyers got screwed is more your opinion and perception than it is fact.

 

Why does it always come back to gaming? If you're a gamer, you really don't need a server/workstation platform. 8 lanes of PCIe 2.0 is only 5% slower than with 16 lanes using 5870s.
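For reference, the raw numbers behind the x8/x16 debate, assuming PCIe 2.0's 5 GT/s per lane and 8b/10b encoding (so 500 MB/s usable per lane per direction):

```python
def pcie2_gbs(lanes):
    """Usable PCIe 2.0 bandwidth per direction: 0.5 GB/s per lane after 8b/10b overhead."""
    return lanes * 0.5

print(f"x16: {pcie2_gbs(16):.1f} GB/s")  # 8.0
print(f"x8:  {pcie2_gbs(8):.1f} GB/s")   # 4.0
```

Halving the lanes halves the link bandwidth, but a card only loses performance to the extent it actually saturates the link, which is why the measured hit on a single 5870 is small.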
 
5890s and Fermi are coming though, so it may make even more of a difference.

Any time first-time adopters buy early, they get stung. Then, if the lower-end chips come later, the higher-end ones have had time to come down in price, making the lower-end chips reasonably priced from the get-go, with just small early price elevations.
 


First time ever!!!! I agree 100% with this statement.
 
It's a little different in this case, though. Did anyone who bought an i7 desktop actually think they were buying something that was, in reality, no better than a top-end C2Q or even the Phenom II 940?

I think the consensus was that they were getting something a lot better. Sure, you can pick and choose benchmarks that prove how wonderful the i7 is, but in a desktop environment that isn't valid.

So now you have a setup where the early adopters paid a lot more and are now, in some ways, being overtaken by the mid-range less than a year later. A lot less, actually; what is it, 8-9 months or so?

If you had read this review - flawed as it is - in January...would you have paid what you paid for an i7? It's worth thinking about.

And let's not forget the D0 stepping 920, which was supposedly a huge improvement in overclockability, yet didn't get the i5's improved Turbo a couple of months later.
 