Intel's 32 Core, Quad-HyperThreading Super Chip

Artificial intelligence should be prohibited. Make the computer better, but don't give it free will.
 
"Four of these were used in the Wolfenstein demo."
"These" refers to four Knights Ferry servers, each with 32 cores running 128 threads? Wolfenstein must be a really awesome game!
 
[citation][nom]kronos_cornelius[/nom]Most multimedia is highly parallelization.[/citation]
I think you meant to write, "most media applications are embarrassingly parallel."

Video rendering falls into this category because, at the very worst, each individual pixel can be isolated as a single task. So at 1920x1200, you've got A-OK scaling up to at least 2,304,000 threads. Assuming endless scaling that flatly follows Moore's law, even the 1,600-SP Radeon 5870 is over 20 years away from hitting that limit. And even then, I'm sure there are plenty of ways to increase the splittable load, such as anti-aliasing or perhaps even full-scene motion blur.
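As a minimal sketch of that embarrassingly parallel pattern in Python, with a made-up `shade` function standing in for real per-pixel rendering work:

```python
from multiprocessing import Pool

WIDTH, HEIGHT = 1920, 1200  # 2,304,000 pixels, i.e. potential tasks

def shade(i):
    # Hypothetical per-pixel computation; each pixel is independent,
    # so any number of workers can chew through them in parallel.
    x, y = i % WIDTH, i // WIDTH
    return (x * 31 + y * 17) % 256

if __name__ == "__main__":
    with Pool() as pool:
        # Shade one small tile; a full frame is WIDTH * HEIGHT such tasks.
        tile = pool.map(shade, range(1024))
    print(len(tile))  # 1024 independent results
```

Because no pixel depends on any other, the same `pool.map` call would scale to as many workers as the hardware offers.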

[citation][nom]kronos_cornelius[/nom]The real problem is bandwidth. at about 40Gb/s, The main memory chips may start to have problems supplying enough data to keep the processors humming. So, applications that are cpu heavy would be favored instead of data-heavy. I think this is where Intels fiber optic connection would come in to the rescue since most science projects involve Petabytes of data.[/citation]
I dunno if a fiber-optic connection is really going to be necessary. Modern video cards manage just fine, with memory subsystems that can move over a terabit of data per second.
 
[citation][nom]gnesterenko[/nom]"The views expressed here are mine and do not reflect the official opinion of my employer or the organization through which the Internet was accessed."[/citation]

lol, but overall a very nice comment. You seem to be reading my mind in some of those statements.

[citation][nom]jdamon113[/nom]I generally Like what intel has to offer. To me the past month it sound like intel is out of ideas, buying up secters of the market and to hell with graphic, eventhough they are the largest and richest processor company in the world, but they cant make a video card so here is a boatload of cpu's to make us look cool. Anyone who has see this befor, I would wager in the next year AMD will take the thrown again. I think intel is lost for words right now. Let the prices drop.. intel, if this is your market, your will loose the game. Yes I ment that pun. Stop all the crap. Make a video system or buy Nvidia and let move on becasue this is unreal. 5-10 years from now, than we can look at something like this.[/citation]


great comment


 
I really hope that at some point they're able to deliver a mass-market, graphics-card-based version of this that gives hobbyists access to the technology at an affordable price.

At the moment I can see them releasing a product priced to compete with Nvidia's Tesla boards, e.g. ~US$3,000: a price compute hobbyists cannot afford.

 
I can imagine a huge virtual machine or Citrix farm built on these beasts if enough RAM can be installed. I wonder what the heat dissipation is for one of these chips.
 
I don't know if this has been said... but Intel did deliver on that 50-core promise. It might be a little late, but I am freakin' happy to see this chip. I would love this thing for BOINC. LOVE IT!!!

On a side note, AMD... PLEASE have something up your sleeve.
 
[citation][nom]gnesterenko[/nom] [...] Intel realizes that's where the $/growth is at, abandons performance (letting the platform stagnate) [...] [/citation]

So you think they had no clue where the "$/growth is at" prior to this?

I don't know what booboo you're talking about. The last few years were good for AMD, but for Intel they were their best ever; I remember they recently said they had their most profitable quarter ever.
 
[citation][nom]JOSHSKORN[/nom]Like, Oh...my...God, Becky. Look at those chips. It's so fast! It looks like one of those gamer guy's girlfriends. Look at those threads. There's just, so many of them.[/citation]
You like fast chips and you cannot lie?
 
Parallel processing... can anyone say render farm? When these suckers hit, CGI graphics power is going to skyrocket, I bet. Though the movies will likely take just as long to make, since the guys making CGI films will probably just throw even more visual workload at them.
 
[citation][nom]gnesterenko[/nom]@jdamon

Ehh, Intel made a booboo. They are big enough (and the market is uninformed enough) to absorb it this time around, but they did. The booboo is Nahelem (did I spell that right?) released in the consumer market/desktop segment with its socket change, triple channel memory (useless for home applications). Don't get me wrong, the processors are monsters, but the problem is, they are still monsters - and they will remain that even with SandyBridge out. By creating these monsters, they gave AMD free rein in the value segment (where the $ is at), which is what allowed AMD to come back from the brink in the last few years.

Now, Intel realizes this, and hence, Sandy Bridge. Lower power, dual channel, integrated graphics, and cheaper. Sure, new sockets again, but that's to be expected from Intel by now. With these offerings, Intel's got AMD in the cross-hairs again - they are trying to recover from the booboo (and will succeed, probably). However, AMD beat them to the punch with their Fusion chips, which have been recently benchmarked and are awesome. Intel still has the performance crown - AMD has nothing to match the 6-core i7s - and hence there has been no progress in this segment. Competition is moving to the value segment, and as I've said, by pursuing Nahelem, Intel let AMD gain significant ground in this middle segment. Being an AMD fan, this makes me very happy (and it should make Intel fans happy too, as competition is good for everyone). Hopefully, Intel's new focus on the middle ground and AMD's excellent performance in this segment will allow AMD some time to breathe and allow them to catch up again in the performance sector with Bulldozer-derived Zambezi... hopefully...

So in conclusion, Intel goes for performance, gets it; AMD gives up on performance, nails the middle/low/server segments; Intel realizes that's where the $/growth is at, abandons performance (letting the platform stagnate), and chases AMD into the middle/low segments, but too little too late, as AMD releases a new generation before Intel does, so now they are going to be 4-6 months behind.

I think the most amusing part about the booboo, however, is that despite Intel letting the platform stagnate, they are upping the high-end chips every once in a while with new extreme editions (at $1000/pop) - even though they are simply results of improved yields and quality of said yields. This allows them to make up for the booboo (again, at $1000/chip!!) by milking the crowd that actually is silly enough to upgrade every time something faster comes out.

"The views expressed here are mine and do not reflect the official opinion of my employer or the organization through which the Internet was accessed."[/citation]

Nehalem was a huge success on desktops. Remember, LGA1156 and LGA1366 are both Nehalem-based platforms. LGA1366 was never meant for general home use, but for power users and enthusiasts.

Intel didn't give AMD free rein; look at their low-end Pentiums and the heavy marketing convincing people they need something faster. A lot of people go for laptops over desktops, and I don't know about where you're from, but here in Belfast most laptops use i3 processors. Only in the last three or four months have any of the retailers actually started using AMD processors in more than one model. Builders and people in the know like yourself will go AMD for value, but the consumer knows Intel. Intel's marketing machine has far from let AMD have free rein of the budget sector.

I understand what you're saying, but I disagree with it. Yes, Intel should be worried about AMD's Fusion APUs, but it's not as black and white as that. Intel is expanding into SoC markets, and Atoms are being adopted for lower power usage and even the smartphone market. Bobcat will eventually challenge in that market, but both Atom and Bobcat have their uses in the server market. If anything, "performance, value, low end" just doesn't hold up anymore for describing the market, since the lines are getting blurred.

You also can't blame Intel for wanting to make money selling their processor for $1,000; after all, they are a business, and I seem to remember AMD doing something similar with a dual core... It's what the market can bear, and you always pay a premium for new tech, especially at the top end. A six-core Intel challenges server boards and gives power users another option besides Xeon setups, which are far more expensive. Personally, I'd take a hex-core on LGA1366 that can be overclocked over its better-binned but fundamentally even more expensive Xeon brothers.

There's more to performance than raw power. Efficiency and GPU integration are Intel's plans for this tock. I dunno where you're getting the idea Intel has abandoned performance. At the moment AMD needs more cores to match Intel in performance. Intel may focus less on raw clock speed, but they're still gonna give top-end performance unless AMD's Bulldozer pulls something out of the bag, in which case we won't know till it's benched, and in any case Intel won't sit still and roll over. If anything, AMD is behind (how long after Intel did 45nm come out?), and their performance is only that of Intel's now-discontinued Core 2! Yes, AMD will release first, but it will have to make up the performance gap before it's ahead, as well as counter Intel's next release. That's a lot of ground to make up!

 
[citation][nom]scook9[/nom]So they made a GPU? That is what it looks like to me lol. It just does not do graphics....it does everything.[/citation]
No, not a GPU. There's a huge difference. GPUs can handle only a few very specific types of instructions, but they execute them very fast. CPUs can handle a MUCH wider variety of instructions, but not as fast. This chip is most definitely a CPU, although they've applied some design characteristics more commonly used in GPUs.
 
[citation][nom]awood28211[/nom]I think multi-core processing will be more evident if given the ability to divide the cores into logical workloads. Must everything run on core1 that's single threaded until the OS decides it's too bogged down? I want core 30-32 to run my games, core 20-29 to run all my single core apps, dividing each processor to one per core unless I exceed all 9 cores in use. give me the 1st 5 cores for my OS and services. Put the rest into anything else that is multi-threaded. Let ME configure it. There is NOTHING that's more of a pet peeve to me than my machine using near 100% of 1 cpu and 2% of another all the while making every application I own come to a crawl....just because the OS thinks it knows how to handle it.[/citation]

You can do that; you have to set it up manually, though. I forget how, but I used to do it with my T2600 in '06. I'll post back if I can figure it out again.
 
[citation][nom]requiemsallure[/nom]You can do that; you have to set it up manually, though. I forget how, but I used to do it with my T2600 in '06. I'll post back if I can figure it out again.[/citation]
Go to Task Manager and set the affinity to the processors you would like to use for the specific process.
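That's the Windows way. For anyone who'd rather script the same pinning on Linux, a minimal sketch using Python's standard library (`os.sched_setaffinity` is Linux-only, and the core number here is just an illustration):

```python
import os

# Pin the current process (pid 0 = ourselves) to CPU core 0 only;
# the kernel will then schedule it on that core and nothing else.
os.sched_setaffinity(0, {0})

# Ask the kernel which cores we're now allowed to run on.
print(os.sched_getaffinity(0))  # {0}
```

Task Manager's "Set affinity" dialog does the same per-process thing through the GUI.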
 