Intel Sampling Nehalem-successor ''Sandy Bridge''

Status
Not open for further replies.
haha, 1156 AND 1366 are screwed here. Wow, those sockets died fast... good thing I decided to go with a C2Q and not move to the new iX CPUs.
 
"Initial Sandy Bridge chips will feature dual and quad core configurations before stepping into more complex chips with hexacore or octacore designs. The new chips will run on the LGA1155 Intel 6-Series platform codenamed Cougar Point."
I'm sure they mean 1156 which is the current i3 i5 i7 midrange socket.
And the complex hexacore and octacore designs will be 1366 Socket.
 
I just hope they fix the PCIe bandwidth problem that P55 boards had, so people could run 3-way CrossFire or SLI, because we all know the CPU is going on the back burner when it comes to gaming.
 
[citation][nom]shiftstealth[/nom]"Initial Sandy Bridge chips will feature dual and quad core configurations before stepping into more complex chips with hexacore or octacore designs. The new chips will run on the LGA1155 Intel 6-Series platform codenamed Cougar Point."I'm sure they mean 1156 which is the current i3 i5 i7 midrange socket.And the complex hexacore and octacore designs will be 1366 Socket.[/citation]

No, Sandy Bridge requires a new socket: LGA1155 or LGA1356.
 
I got stuck on the tail end of a product cycle with my motherboard because of the chipset, and I really don't have anything useful to upgrade to from my Q6600 without buying a new board and RAM...

I think it's time to go with whatever the mainstream socket will be for the new hexacore chips, and at least have the option to upgrade to a newer hexacore chip with Sandy Bridge.
 
I was interested until I read:
Sandy Bridge will also integrate Intel's sixth-generation graphics core and will include acceleration for floating point, video, and processor intensive software most often found in media applications.
I understand workstations and other forms of PCs may not need an add-in GPU and will get by absolutely fine with the integrated graphics. But for everyone else it just doesn't make financial sense, as you are paying for things you have no intention of ever using!
I really hope they decide to go the way of the i5 line and include CPUs without the integrated graphics.
 
This doesn't seem to be so impressive. It's WAY less of an improvement than Conroe was over Yonah, and considerably less than the switch from Penryn to Nehalem. Either Intel is sandbagging a bit, which is understandable, or they're slowing down. If it's the latter, AMD might have an opportunity, even though Sandy Bridge still seems good enough against the rubbish AMD is making now. Remember the lousy K8? It was a small improvement but seemed good enough because the Pentium 4 sucked, but then the bell tolled for AMD, and Conroe sent them running off into the low end again.

Intel's been showing signs of easing off the pedal a bit. The brain-damaged Lynnfield, followed by the even more brain-damaged 32nm processor with the memory controller on the GPU die! Then we got the Atom with the memory controller on the processor package, connected to the processor through a slow bus, so it's barely any faster than before.

It seems Intel is listening to the bean counters too much and selling compromised products lately. It's kind of odd that everything since LGA 1366 has been lobotomized more and more. Normally, you'd expect progress, not backward platform steps. What's with USB 3.0 anyway? Rumor is that it's not going to be on the next chipsets either.

Intel might be making money, just like AMD did for a little while, but they're not moving forward with technology as fast as they were. Sandy Bridge doesn't seem to be a major step forward based on what we're being told so far.
 
Seriously disappointed about the new sockets. Like many other people, I got a 920 and an X58 mobo hoping I could upgrade to an 8-core in the future.

Glad to see Intel rolling out new products, but when the motherboards that support these things cost $200-$300, they really need to give the sockets a rest.
 
[citation][nom]zingam[/nom]Perhaps you are too young? My personal experience is that a new CPU needs new MoBo and new RAM - always! Partial upgrades suck.[/citation]

Not when a socket only lasts two and a half years.
Fall 2008 to 2011 - RIP LGA1366
 
[citation][nom]zingam[/nom]Perhaps you are too young? My personal experience is that a new CPU needs new MoBo and new RAM - always! Partial upgrades suck.[/citation]
Maybe you're right. Guess I just got used to LGA775... got two CPU upgrades out of that one: went from a C2D 6300 -> C2Q 6700 -> C2Q 9400.
 
Plus, in the past I had always seen a new socket as an upgrade; I'm really not seeing much of a point to this new socket. It's not like the pin count really changed too much... unless they are making everything a bit smaller.
 
[citation][nom]John_Dune[/nom]god, i hope that lga 1155 is a typo[/citation]

It's not. But that by no means says it won't work in LGA1156 boards. Remember AM3? AM2/2+ was 940 pins; AM3 is 938 pins, yet it works in an AM2/2+ mobo. Fewer pins is good. More is not.

[citation][nom]shadow703793[/nom]What about us LGA1366 users?[/citation]

They forgot to add it, but Sandy Bridge is also slated for LGA1365 as a high-end part. Same as above: one less pin is a good sign. If they had more pins it would be a problem.
 
Am I the only one who wants a GPU right beside the CPU?
I have almost entirely given up PC gaming and just have my Xbox hooked up to one of my monitors.

Full integration is the eventual way to go anyways, so why not get started now? Sure, Intel definitely could use some help on making a better GPU though. At least it sounds like it'll be GPGPU friendly.

I'm curious as to how the 32nm fab is really going, though. From everything I can tell, a lot of people have already burned out their 32nm chips, while the ol' i7 920 was damned near bulletproof (1.55V on crappy air for extended periods, anyone?); some of Intel's new server chips are still 45nm, and AMD's latest chips are 45nm.
I do hope 32nm goes over well though, quickly followed by 22nm.
 
Intel, doing what they do best: killing sockets.

I think I'll go AMD Octacore next time. At least they understand that not everyone is willing to ditch a perfectly functional motherboard every 2 years.
 
You can't put even a Core 2 Duo into a lot of older 775 boards (i915, early i945, early nforce 4, etc.)

You could not put Pentium D into an i915 775.

You could not put Prescott into an old 478 400FSB board.

You couldn't put a 1.3GHz P3 or a 1.4GHz Socket 370 CPU into a pre-Tualatin Intel or pre-694T VIA Socket 370 board.

These were all the same physical sockets, so complaining about socket changes is moot. You needed to buy new boards anyway. Intel has been doing this for a very long time. They are a platform company, not just a CPU manufacturer like AMD was for so many years (and even after buying ATi, they still behave this way). This isn't going to change. The only thing you might see change is AMD starting the same behaviour as it would make them more money.

I'm still happy with my q6600 @ 3.3 and GTX260. I've built lots of i5/i7 PCs at work and installed a few 5770/5850 cards, but while nice, I'm still not blown away to the point I'd bother switching. Maybe adding two more 47" 1080p TVs and a 5970 would be amazing enough to spend money on.

I'm more concerned with storage tech atm. When SSDs start breaking 350 MB/s write speeds and they start saying constant writes won't kill them, I'll replace my 6-drive array. That would be a huge, noticeable difference, rather than the CPU bumps we've been seeing over the last few years.

 
[citation][nom]zingam[/nom]Perhaps it isn't possible to make any more improvements or it is way too difficult. Could you improve the spoon anymore? I don't think so - it is perfect.Perhaps the x86 era is coming to a halt.[/citation]

Well, since the Pentium Pro, processor development has slowed down a lot, and even the Pentium Pro wasn't as big an advance over the Pentium as the Pentium was over the 486, or the 486 was over the 386. The 286 was the biggest jump.

It definitely gets more difficult to improve processors each generation. You can point to the Conroe, but consider it wasn't a derivative of the Pentium 4, but the Pentium III/M and the performance improvement isn't very great considering it came out over five years after the Tualatin.

The idiots keep getting excited by more cores, but that's far from a universal solution. Maybe they'll need to go back to the Pentium 4 design to get single threaded performance very high. I guess that's the problem they face, they can probably fix the Pentium 4 to get very high single threaded performance, but it wouldn't lend itself to multiple cores as well because it's more power hungry. The Nehalem derivatives can't really improve IPC much anymore, it seems, and run at relatively low clock speeds, but are relatively low power, so work better in many cores.

I'd be very curious what kind of clock speeds the Pentium 4 would run at on 32nm. If they added another decoder, and did other tweaks, you'd probably see very high single threaded performance. I wonder if they'll ever use that design again and fix it so it actually works.
 
[citation][nom]TA152H[/nom]Well, since the Pentium Pro, processor development has slowed down a lot, and even the Pentium Pro wasn't as big an advance over the Pentium as the Pentium was over the 486, or the 486 was over the 386. The 286 was the biggest jump.It definitely gets more difficult to improve processors each generation. You can point to the Conroe, but consider it wasn't a derivative of the Pentium 4, but the Pentium III/M and the performance improvement isn't very great considering it came out over five years after the Tualatin. The idiots keep getting excited by more cores, but that's far from a universal solution. Maybe they'll need to go back to the Pentium 4 design to get single threaded performance very high. I guess that's the problem they face, they can probably fix the Pentium 4 to get very high single threaded performance, but it wouldn't lend itself to multiple cores as well because it's more power hungry. The Nehalem derivatives can't really improve IPC much anymore, it seems, and run at relatively low clock speeds, but are relatively low power, so work better in many cores.I'd be very curious what kind of clock speeds the Pentium 4 would run at on 32nm. If they added another decoder, and did other tweaks, you'd probably see very high single threaded performance. I wonder if they'll ever use that design again and fix it so it actually works.[/citation]

I agree except for maybe the P4 part.

The change from the 8086 to the 286 was quite the quantum leap. I had a 25MHz 286, and my buddy's XT felt like a dinosaur in comparison. 386s with math coprocessors and 486DXs were a big change too. Watching those machines first run X-Wing, Links 386, then Wing Commander 2 was amazing stuff back then.

I agree on the Pentium Pro too. I had a PPro 180 and it didn't feel any better than a 200MMX, even when I tried NT 4.0, which the RISC/CISC, on-CPU cache PPro was supposed to excel in. Hell, even the first 233MHz PIIs, which combined the two designs, didn't blow me away like the old generation changes did.

My dual PII 333 NT4 box was probably the first big wow in years. I could watch videos, run a Quake II dedicated server minimized, fire up another Quake II instance in an OpenGL window and log into the server, all while playing MP3s in Winamp with OpenGL visualizations, with no slowdowns. I laughed when dual cores became the big "new" thing: an over-ten-year-old workstation idea made new again. Then when the Q6600 came out I was reminded of my buddy's really old quad PPro 200, lol.

I don't know about the P4 though. Sure, the first time I built my buddy's 3.06 HT Northwood with a gig of PC1066 RDRAM I was very impressed, but even a 1.6 Core 2 kicks my old OC'd 3.73 P4's butt in lots of apps. It seems even if you can get rid of the heat, NetBurst can give you high clocks but diminishing returns in performance due to the deep pipeline.
 
I'm with many others on here when I say I'm glad I stuck with 775 for the time being. Hopefully by the start of 2011 I'll be able to build a beast machine on the new socket(s). But like 1156 and 1366, it'll probably last just as long... no upgrades...
 
I don't understand why Intel doesn't just use one friggin socket for each generation, instead of having a low-to-midrange socket and a high-end socket. I just think it's stupid, and it may be working against them.

People want to be able to buy a midrange CPU and then upgrade to something better down the line once prices drop. But with their stupid LGA 1156 AND 1366 split, that's not possible. I'm not arguing that they shouldn't introduce a new socket with Sandy Bridge; I think that's a fine idea. Just introduce one damn socket and make it easy for everyone.
 