Intel Announces 'Sandy Bridge' Xeon E7 10-cores

I can't think of any reason why you would need so many cores; anything that needs that many cores for visuals, animation, and physics is better off on a GPU anyway.

Please, tell me what the real use of so many cores is.

On a side complaint, $4600 for a CPU when ordered in batches of 1000+?
Intel would be making a BUCKET load off each CPU.

Business as usual.
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.On a side complaint, $4600 for a CPU when ordered in batches of 1000+?Intel would be making a BUCKET load off each CPU.Business as usual.[/citation]

Film composers use virtual instruments that really take advantage of multi-core processing, and sometimes 8 cores is still not enough. There are a lot of composers who buy 1-3 extra computers and use them as slaves in order to process all of the audio.
 
[citation][nom]Nexus52085[/nom]Film composers use virtual instruments that really take advantage of multi-core processing, and sometimes 8 cores is still not enough. There are a lot of composers who buy 1-3 extra computers and use them as slaves in order to process all of the audio.[/citation]
That's why we have GPU computing.
 
GPU computing isn't the answer to everything. First, GPUs don't help much with integer workloads (they mainly accelerate floating point), so these chips will absolutely wipe the floor with GPUs on integer operations. Second, these are made for a completely different market: high-end servers where reliability is critical. These aren't workstation or desktop CPUs; they're for large servers.
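To make the integer point concrete, here's a rough sketch of the kind of job I mean (my own toy example, nothing from the article): a pure 64-bit integer hash spread across POSIX threads, the kind of work that just wants more CPU cores.

/* Toy example (assumed workload): an FNV-1a-style integer hash split
 * across POSIX threads -- all integer multiplies and XORs, no floating
 * point, so it scales with CPU cores. */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define THREADS 10               /* one per core on a 10-core E7, assumed */
#define CHUNK   (1 << 22)        /* integers hashed per thread, arbitrary */

typedef struct { uint64_t seed, hash; } job_t;

static void *hash_chunk(void *arg) {
    job_t *job = arg;
    uint64_t h = 1469598103934665603ULL;      /* FNV offset basis */
    for (uint64_t i = 0; i < CHUNK; i++) {
        h ^= job->seed + i;                   /* pure integer ops */
        h *= 1099511628211ULL;                /* FNV prime */
    }
    job->hash = h;
    return NULL;
}

int main(void) {
    pthread_t tid[THREADS];
    job_t jobs[THREADS];
    for (int i = 0; i < THREADS; i++) {
        jobs[i].seed = (uint64_t)i * 7919;
        pthread_create(&tid[i], NULL, hash_chunk, &jobs[i]);
    }
    for (int i = 0; i < THREADS; i++) {
        pthread_join(tid[i], NULL);
        printf("thread %d hash: %016llx\n", i,
               (unsigned long long)jobs[i].hash);
    }
    return 0;
}

Compile it with gcc -pthread; every thread is doing nothing but 64-bit integer math, which is exactly where a chip like this earns its keep.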
 
I use 3ds Max, and although there are render plug-ins that use CUDA, the technology is still in its infancy; it has a lot of incompatibility problems with current material, lighting, and FX plug-ins. My current system has 24 logical CPU cores @ 3.06 GHz and 1,024 CUDA cores from 2x GTX 580 Phantom 3 GB cards.

Production rendering with CUDA and iray is damn fast and has greatly improved final render times on photorealistic architectural scenes, but it takes time to match the render quality mental ray produces on the CPU, and you have to start a project completely from scratch with materials, lighting, and so on that you know will be compatible with the iray plug-in. That makes CUDA impossible to use for most of my work, because clients send me files with materials, lighting, and plug-ins that don't work in iray. So for the time being, as many CPU cores as possible is a must, and 3ds Max uses all 24 of them in production rendering, which is still blisteringly fast. Nvidia claims CUDA can be a lot faster than multi-core CPUs, but that depends entirely on the system it's being compared against and the application being used.

Another problem is power consumption. The two CPUs I have installed have a max TDP of 95 W each; the two GTX 580 cards have a combined max TDP of 488 W, so there is still a lot of work to be done before GPUs are as power-efficient as CPUs. The companies most likely to buy these new Xeons are large businesses running massive data centres, server farms, and production studios, where low power consumption and reliability are critical, and GPUs score very badly in this area.

My system is prepared for the future, and I am very sure that one day, when CUDA has matured, GPU production rendering will be more popular than CPU rendering for this task, but that day is not here yet, at least not in my industry. iray also uses the GPU and CPU together, so there is nothing wrong with having as many CPU cores and GPU cores in one system as possible.
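For anyone wondering what "using all 24 cores" looks like under the hood, here's a toy sketch (my own illustration, nothing from 3ds Max, mental ray, or iray): the frame is cut into buckets and a pool of worker threads keeps pulling the next bucket from a shared counter until the image is done, which is roughly how a CPU renderer keeps every logical core saturated.

/* Toy bucket-rendering scheduler (illustrative only, not real renderer
 * code): WORKERS threads repeatedly grab the next image bucket from a
 * mutex-protected counter until no buckets remain. */
#include <pthread.h>
#include <stdio.h>

#define WORKERS 24          /* e.g. dual 6-core Xeons with Hyper-Threading */
#define BUCKETS 256         /* image tiles to "render", arbitrary */

static int next_bucket = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void render_bucket(int b) {
    volatile double x = 0;              /* stand-in for real shading work */
    for (int i = 0; i < 1000000; i++)
        x += b * 0.000001;
}

static void *worker(void *arg) {
    long id = (long)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        int b = next_bucket < BUCKETS ? next_bucket++ : -1;
        pthread_mutex_unlock(&lock);
        if (b < 0)                      /* no buckets left */
            break;
        render_bucket(b);
    }
    printf("worker %ld finished\n", id);
    return NULL;
}

int main(void) {
    pthread_t tid[WORKERS];
    for (long i = 0; i < WORKERS; i++)
        pthread_create(&tid[i], NULL, worker, (void *)i);
    for (int i = 0; i < WORKERS; i++)
        pthread_join(tid[i], NULL);
    return 0;
}

The real renderers obviously do far more per bucket, but the scheduling idea is the same: more cores, more buckets in flight.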
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.On a side complaint, $4600 for a CPU when ordered in batches of 1000+?Intel would be making a BUCKET load off each CPU.Business as usual.[/citation]

Two words for what a processor like this is used for: virtual servers.

With that many cores and enough memory, one processor could effectively act as 10 different servers, with each virtual server needing only one core.

Pretty much, this is a high-ROI, low-TCO processor.
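If you want to see the idea in miniature, here's a rough Linux-only sketch (my own toy code, not how any real hypervisor is implemented): a parent process forks one worker per core and pins each with sched_setaffinity(), the same way a hypervisor can dedicate one of the 10 cores to each guest.

/* Toy illustration of core partitioning: fork one "guest" per core and
 * pin it to that core. Assumes a Linux box with at least GUESTS cores. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

#define GUESTS 10    /* one stand-in guest per physical core, assumed */

int main(void) {
    for (int core = 0; core < GUESTS; core++) {
        pid_t pid = fork();
        if (pid == 0) {                    /* child = stand-in for a VM */
            cpu_set_t set;
            CPU_ZERO(&set);
            CPU_SET(core, &set);           /* restrict this child to one core */
            sched_setaffinity(0, sizeof(set), &set);
            printf("guest %d pinned to core %d\n", core, core);
            _exit(0);                      /* a real guest would run here */
        }
    }
    while (wait(NULL) > 0)                 /* reap all children */
        ;
    return 0;
}

Real products (KVM, ESXi, Hyper-V) handle this with their own schedulers and configuration, but the "one guest per core" carve-up is the same basic picture.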
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.On a side complaint, $4600 for a CPU when ordered in batches of 1000+?Intel would be making a BUCKET load off each CPU.Business as usual.[/citation]

Ummm??? A Server? Duh!
 
dodidont is totally right. I work on the VFX side too, and CUDA is just... too new...
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.On a side complaint, $4600 for a CPU when ordered in batches of 1000+?Intel would be making a BUCKET load off each CPU.Business as usual.[/citation]

As balister said, you don't have to use all 10 cores for the same OS install; you spread them out across multiple virtual machines (VMs). Xeons are designed for servers (and sometimes high-end workstations), after all.
 
[citation][nom]hotmetals37[/nom]Check your facts, Tom's! These E7 processors are built on "previous generation" architecture. Here's the link from Intel to prove it:http://newsroom.intel.com/communit [...] -c1-265964[/citation]
It says these build on the previous generation, but that doesn't necessarily mean they're not the current generation. I'll give the benefit of the doubt, though.
 


He's right, Tom's got it wrong. Look at the "Quick Links" section on the right of

http://ark.intel.com/Product.aspx?id=53568&processor=E7-2803&spec-codes=SLC3M

"Products formerly Westmere-EX"

Tom's is probably confused by the fact that some of the E3 Xeon chip family (notably the 2- and 4-core parts) are Sandy Bridge derivatives:

http://ark.intel.com/Product.aspx?id=52278&processor=E3-1280&spec-codes=SR00R



 
CCP (makers of EVE Online) just dropped 50 grand to be the first company to use this chip. Fighting the war on lag with the latest servers!
 
1,000 10-core CPUs for the price of ONLY $4,616??? C'mon people, do the math!!! THAT'S FREAKING CHEAP!

That BEATS THE CRAP OUT OF the Core i7-990X: at $1,000 apiece for that 6-core chip, the same money only gets you 4 CPUs with 6 cores/12 threads each, but on the server side you get 1,000 CPUs with 10 cores/20 threads each!!!!

It comes out to each E7-8870 being worth $4.60. What a STEAL!!!!
 
[citation][nom]flowingbass[/nom]1000 10core CPUs for the price of ONLY $4,616??? Cmon people do the math!!! THATS FREAKING CHEAP!That BEATS THE CRAP OUT OF core-i7 990x 6core cpu for 1000$ you'd only get 4 cpus with 6 cores/12 threads each. But for the server side, you get 1000 cpus with 10 cores/20 threads each!!!!It comes out each E7-8870 cpu is worth 4.6$ what a STEAL!!!![/citation]
Hahaha, I'm just yanking your chain... the $4,616 is the per-chip price at 1,000-unit tray quantities, so a thousand of these would actually run about $4.6 million.
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.[/citation]
When you read the article using your eyes to see, and your brain to interpret the letters, you see, at the end of the very first sentence, that it is aimed at business applications, specifically, "high-end computing applications, including business intelligence, real-time data analytics and virtualization." Do you need instructions on how to read the paragraphs that follow, reiterating and detailing this?
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.[/citation]
Tom's, can you please reverse the order comments are shown in? Newest should be at the top, not the first and most idiotic post, like the one this poster made. Having the oldest comment at the top ensures the least thought-out post, from someone who didn't even read the article, ends up at the top.

We should also be able to down-rate stupid comments so they get pushed even further back and only quality comments are seen. Tom's is just feeding the trolls the way it's set up now.
 
Here's the thing: GPUs are massively parallel machines, but the parallelism is lockstep. If one thread in a group is slower than the others, everything waits until that thread finishes, which mainly happens with if/else statements, because the stream processors in a cluster execute together rather than independently. CPU cores are independent, so one slow thread doesn't hold back the work running on the other cores. CPUs can also access random areas of memory efficiently, which GPUs cannot, and that matters if you're running a database or something similar. On top of that, CPUs have far broader instruction support than GPUs; Intel even gave its Sandy Bridge processors accelerated AES instructions. What I'm trying to say is that GPUs are great for apps like video and imaging, but CPUs are better for everything else.
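A toy model of the if/else point, since that's the part people trip over (purely illustrative cycle counts, not real GPU code): in lockstep execution every lane in a group pays for both sides of a divergent branch, while independent CPU cores each pay only for the path they actually take.

/* Toy model (not real GPU code): contrast lockstep "warp" execution,
 * where a divergent if/else makes every lane pay for both paths,
 * with independent CPU-style threads. Cycle costs are made up. */
#include <stdio.h>

#define LANES       32
#define CHEAP_PATH  10    /* cycles for the 'if' branch   (assumed) */
#define COSTLY_PATH 100   /* cycles for the 'else' branch (assumed) */

int main(void) {
    int takes_costly_path[LANES] = {0};
    takes_costly_path[7] = 1;              /* one lane diverges */

    /* Lockstep warp: with divergence the hardware runs both paths
     * serially, so every lane is busy for cheap + costly cycles. */
    int warp_cycles = CHEAP_PATH + COSTLY_PATH;

    /* Independent cores: each thread finishes after only the path it
     * actually takes; the slow one does not stall the others. */
    int worst_core = 0;
    for (int i = 0; i < LANES; i++) {
        int c = takes_costly_path[i] ? COSTLY_PATH : CHEAP_PATH;
        if (c > worst_core) worst_core = c;
    }

    printf("divergent warp:    every lane busy for %d cycles\n", warp_cycles);
    printf("independent cores: slowest thread takes %d cycles, the rest take %d\n",
           worst_core, CHEAP_PATH);
    return 0;
}

The numbers are made up; the point is just the shape of the cost.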
 
[citation][nom]joytech22[/nom]I can't think of any reason why you would need so many cores, anything that needs that many cores for visual, animation and physics is better off using a GPU for anyway.Please, tell me what use of so many cores really is.On a side complaint, $4600 for a CPU when ordered in batches of 1000+?Intel would be making a BUCKET load off each CPU.Business as usual.[/citation]

Wow...
 