Intel Sampling Nehalem Successor "Sandy Bridge"

That was no typo. LGA1156 is a dead platform, and Sandy Bridge won't be compatible with current LGA1156 boards since it will require the new LGA1155 socket. Speaking of which, Sandy Bridge is useless in my mind since it's coming as a dual- and quad-core solution, and it will be a while before they add more cores to it. And as for the integrated graphics solution... give me a break. I prefer a dedicated video card.

Stick with the LGA1366 six-core platform, because they will release cheaper variants soon :)
 
[citation][nom]lradunovic77[/nom]That was no typo. LGA1156 is a dead platform, and Sandy Bridge won't be compatible with current LGA1156 boards since it will require the new LGA1155 socket. Speaking of which, Sandy Bridge is useless in my mind since it's coming as a dual- and quad-core solution, and it will be a while before they add more cores to it. And as for the integrated graphics solution... give me a break. I prefer a dedicated video card. Stick with the LGA1366 six-core platform, because they will release cheaper variants soon[/citation]
If the IGP is hardware-qualified for Autodesk and Adobe products, they won't give a rat's ass that you prefer dedicated graphics for gaming.
 
How unfortunate to have another new socket. It feels like 1366 and 1156 did not have a very long life - I wonder why they decided on a different one this time around...

 
Well, after Socket 775 lasted such a long time (in Intel terms), I guess they felt they had to go back to their old routine of a new socket every few weeks... makes me happy I'm with AMD.
 
I have a feeling Intel will announce a 1366 chip after this release. From all they have said in the past, they plan on using 1366 as the enthusiast platform.
 
[citation][nom]stm1185[/nom]Sandy Bridge, Cougar Point, there has to be a joke in that.[/citation]

I know. Do they actually pay someone to come up with these codenames?
 
[citation][nom]vic20[/nom]I agree, except for maybe the P4 part. The change from 8086 to 286 was quite the quantum leap. I had a 25 MHz 286, and my buddy's XT felt like a dinosaur in comparison. 386s with math co-processors and DX 486s were a big change too. Watching those machines first run X-Wing, Links 386 and then Wing Commander 2 was amazing stuff back then. I agree on the Pentium Pro too. I had a PPro 180 and it didn't feel any better than a 200 MMX, even when I tried NT 4.0, which the RISC/CISC, on-CPU-cache PPro was supposed to excel in. Hell, even the first 233 PIIs, which combined the two designs, didn't blow me away like the old generation changes did. My dual PII 333 NT 4 box was probably the first big wow in years. I could watch videos, run a Quake II dedicated server minimized, fire up another Quake II instance in an OpenGL window and log into the server, while playing MP3s in Winamp with OpenGL visualizations, all at the same time with no slowdowns. I laughed when dual cores became the big "new" thing. An over-ten-year-old workstation idea made new again. Then when the Q6600 came out I was reminded of my buddy's really old quad PPro 200, lol. I don't know about the P4 though. Sure, the first time I built my buddy's 3.06 HT Northwood with a gig of 1066 RDRAM I was very impressed, but even a 1.6 Core 2 kicks my old OC'd 3.73 P4's butt in lots of apps. It seems even if you can get rid of the heat, Netburst can give you high clocks but diminishing returns in performance due to the deep pipeline.[/citation]

The Pentium 4 design didn't work because the decoders were inadequate, among other solvable things. IBM did a really good job with the POWER6, which was also a clock-speed beast. The problem with the Pentium 4 was that it had only one decoder and the trace cache was really small, so almost 50% of the time it was effectively running as a scalar processor.

It had a long pipeline, but don't forget the ALUs were double-pumped, so they ran at twice the core clock.

For single-threaded apps, a long pipeline with very high clock speeds is the way to go. IPC is great, but there's a point where you just can't get much more out of it. For sheer performance on one core, a longer-pipelined processor would do better, but performance per watt would probably go down.
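To make that trade-off concrete, here's a minimal toy model (my own sketch, not anything from Intel or from this thread): single-thread performance is roughly frequency times effective IPC. A deeper pipeline lets the clock scale up, but every branch mispredict flushes more stages, so effective IPC drops. Every constant below is a made-up assumption for illustration only.

[code]
# Toy model: deeper pipeline -> higher clock, but a bigger mispredict penalty.
# All constants are illustrative assumptions, not measured data.

def model(depth, base_ipc=2.0, mispredicts_per_instr=0.01, ghz_per_stage=0.25):
    freq_ghz = depth * ghz_per_stage                       # assume clock scales with depth
    cpi = 1.0 / base_ipc + mispredicts_per_instr * depth   # flush cost grows with depth
    ipc = 1.0 / cpi                                        # effective IPC after mispredicts
    return freq_ghz, ipc, freq_ghz * ipc                   # GHz, IPC, instructions per ns

for depth in (10, 14, 20, 31):                             # 31 is roughly Prescott's depth
    ghz, ipc, perf = model(depth)
    print(f"depth {depth:2d}: {ghz:4.1f} GHz, effective IPC {ipc:.2f}, {perf:.1f} instr/ns")
[/code]

With those assumed numbers, raw throughput keeps rising with depth, but the effective-IPC column keeps falling, so each extra stage buys less and less. That's the "high clocks but diminishing returns" point in a nutshell, and since dynamic power rises with clock speed, performance per watt goes the wrong way too.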
 
Not that I'm anywhere near upgrading as I just got my computer, but it looks like in a few years I'll be going with AMD for my processing needs. I had heard Intel wasn't very faithful to their sockets, but I had never witnessed it firsthand.
 
C'mon ppl! Keep your rigs longer! Why is it that people with i7s are complaining about a new upgrade when their system is at most a year old!?

...still surfing on a four-year-old Conroe, and it still feels plenty capable! Not much of a gamer, but I run Adobe products on it like there's no end.
 
[citation][nom]mouettus[/nom]C'mon ppl! Keep your rigs longer! Why is it that people with i7s are complaining about a new upgrade when their system is at most a year old!? ...still surfing on a four-year-old Conroe, and it still feels plenty capable! Not much of a gamer, but I run Adobe products on it like there's no end.[/citation]
Same here. I just upgraded to an i7 930 system. lol, my old system was a P4, talk about old.
 
To be honest, I see no reason to upgrade to 1156 or 1366 right now (unless you are one of those benchmark people).

I have both 1156 (an i5 750 and an i7 860) and 1366 systems, and I think I made the wrong choice going from 775 to both of these.

An i7 is still not enough to render complex scenes; a render farm is still better. And an i7 is overkill for anything other than rendering. For games, a simple dual core will suffice.

Or if you have a Q6600, it should be more than enough right now.
 
[citation][nom]anamaniac[/nom]Am I the only one who wants a GPU right beside the CPU?[/citation]

Yes, I want it on the motherboard. Intel Graphics sucked, sucks, and will always suck, because of their driver support. It will never be as good as Nvidia or ATI.

1156 is a joke; it is a huge step back from 1366. It is not an evolution, it is a regression. Open your eyes! They flat-out robbed us. They cut the PCIe lanes and crippled SLI. Stick in a sound card and your graphics card loses bandwidth. It is 2010, not 2004.
 