AMD giving away $8000 worth of Magny Cours (4P rig, 4x12 cores)

More like running programs to utilise cores to actually finish the job faster. Every time I double the number of cores I halve the time it takes to render (roughly). If it's a 1 minute render on one core then that's not really worth it, but if it's a one hour render on one core then 48 cores (assuming the same architecture) would take 1-2 minutes. A 24 hour render would take about 30 minutes.
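That near-linear scaling claim is easy to sanity-check with a quick back-of-envelope calculation. This is a toy sketch assuming ideal scaling, which real renderers only approximate:

```python
def render_time(single_core_minutes, cores, efficiency=1.0):
    """Estimate wall-clock render time assuming near-linear scaling.

    efficiency < 1.0 models overhead from scheduling and memory
    contention that keeps real scaling below the ideal.
    """
    return single_core_minutes / (cores * efficiency)

# A 60-minute single-core render on 48 identical cores:
print(render_time(60, 48))        # 1.25 minutes, i.e. "1-2 minutes"

# A 24-hour render on 48 cores:
print(render_time(24 * 60, 48))   # 30.0 minutes
```

With efficiency dialed down to, say, 0.5, the same 48-core render takes 2.5 minutes instead of 1.25, which is still a huge win over an hour.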
 



I just thought that the /drool comments over the giveaway machine were... rather humorous... given your stance on multi core computing. 😉 Figured a little bit of friendly ball busting was in order. 😀
 

Just to utilize the cores? Nope. I actually would run the program to get results. I use SolidWorks all the time, including its flow simulation capability, and the extra cores would significantly speed up the computation.
 



Well... I suppose one could point out that the circular logic works both ways. Someone could also make a blanket statement that "nobody" runs programs that can effectively utilize such an environment, and then use that flawed logic to justify a theory that a highly multicore/multithreaded PC is a waste. Of course, that person would then quickly find themselves in the position of having to tiptoe around and downplay the examples which *do*.

So - in the case of mom/pop/gramma surfing the Interwebz and emailing videos of a monkey scratching its butt, sniffing its finger, then doing a back flip... sure, a Quad is overkill. But in the case of an enthusiast pushing heavier-duty programs and/or heavy graphics/video/rendering, a Dual would needlessly prolong what is already a large task. So where more horsepower is freely available, why not take full advantage?

So neither position is entirely accurate, since the context in which the components are used matters greatly towards the outcome. Though if I may be permitted to opine: Since this is an enthusiast site, I note the general population here are skewed more towards the latter than the former. And also that - since this is an enthusiast site - the general population here should by and large be savvy enough to know the difference without being told.
 



An enthusiast != a person who does rendering. A person who does the job professionally? Then yes, as I have stated several times. Every time I make a suggestion regarding cores it always comes down to: future-proofing/rendering.

This same logic was pushed for the Q6600, and now it's an outdated CPU like all the rest. Future-proofing is a failed concept. As for more cores, absolutely there is a need: there are server markets out there, and there are rendering professionals who work for Pixar. Don't even bring up multitasking, because that is limited by RAM. Even on a 3.8 GHz E8500 + 2 GB of RAM I can run a game, have 10+ tabs open in Firefox, and play music while downloading whatever. It's so silly that people think the direction Intel and AMD have pushed CPUs is the RIGHT direction. More cores != better, especially given the software. Give me a higher-clocked dual now, and give me quads when software can utilize them. We were only just given Windows 7, which can utilize more than 2 cores, yet these silly arguments on behalf of quad-core users have been pushed since 2007.
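The "more cores != better given the software" argument is exactly what Amdahl's law formalizes: speedup is capped by the serial fraction of the work. A quick sketch, with illustrative numbers not taken from the thread:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup from `cores` cores when only
    `parallel_fraction` of the work can actually run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A program that's only 50% parallel barely benefits past 2 cores:
print(amdahl_speedup(0.5, 2))    # ~1.33x
print(amdahl_speedup(0.5, 48))   # ~1.96x -- never even reaches 2x

# A 95%-parallel renderer is a different story:
print(amdahl_speedup(0.95, 48))  # ~14.3x
```

Which side of that divide a given workload falls on is the whole disagreement in this thread.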
 


I know exactly what I would do with such a unit. In fact, I intend to get a unit like this in a year or so when I can pick it up for a reasonable price like $2000 for board + CPUs rather than >$8000. Basically everything CPU-intensive I do with my machine responds very very well to more cores, so getting a big multi-socketed SMP system just makes sense. What I would gain from having that many cores:

- Number One would be speeding up video encoding. Editing and encoding low-def DV video into something sane like 2 Mbps H.264 takes a huge amount of time. Doing that with 720p or 1080i MPEG-2 is even worse as it works at about 3 fps. I won't even do any more of the HD stuff until I get a new machine as it takes so ungodly long to work with. x264 is very well-threaded and a 48-core setup would chew through that like a beaver on crack.

- Number Two would be to speed up compilation. I run Gentoo Linux and even with my menagerie of junkyard systems all participating in a clustered fashion (via distcc), compiling any program of reasonable size such as OpenOffice takes for-fricking-ever. Compilation parallelizes very well (one compiler process per source file via make -j or distcc), and 48 local cores would help that out a ton.

- Number Three would be to speed up compression and decompression of large files. I do work with image-based backups and I run parallel BZIP2 to compress the raw hundreds-of-GB images down into something of a much more manageable size like 10-30 GB. That takes a ton of CPU power and is another highly-threaded application.

- Number Four would be to speed up the random scientific stuff that I run from time to time as part of research. The CFD stuff I did a couple of years ago was the worst as that sucked up RAM and CPU cycles with an insatiable appetite. Being able to use relatively inexpensive and fast single- or dual-rank 2 GB DIMMs to get a suitable amount of RAM because you have 32 memory slots versus 8 or 12 with a typical 2P server is a big advantage. Larger dual-rank modules are pricey, and quad-rank modules absolutely kill memory speed.
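The parallel-BZIP2 point above can be sketched with Python's standard library. This is a toy version of the block-parallel trick pbzip2 uses; the chunk size and worker count are arbitrary, and a real tool would write a proper multi-stream .bz2 file rather than return a list of blocks:

```python
import bz2
from multiprocessing import Pool

def compress_chunk(chunk: bytes) -> bytes:
    # Each chunk is an independent bz2 stream, so chunks can be
    # compressed on separate cores with no coordination.
    return bz2.compress(chunk)

def parallel_bzip2(data: bytes, chunk_size: int = 1 << 20, workers: int = 4):
    """Compress `data` in independent chunks across worker processes,
    the same idea pbzip2 uses to scale bzip2 across many cores."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(workers) as pool:
        return pool.map(compress_chunk, chunks)

if __name__ == "__main__":
    raw = b"backup image data " * 100_000   # ~1.8 MB of redundant bytes
    blocks = parallel_bzip2(raw)
    restored = b"".join(bz2.decompress(b) for b in blocks)
    assert restored == raw
    print(len(raw), "->", sum(len(b) for b in blocks), "bytes")
```

Because bzip2 already compresses in independent 100-900 kB blocks, this chunked approach loses almost nothing in ratio while scaling nearly linearly with cores.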



No. These CPUs are standard-TDP parts as far as I can tell and will probably consume something around 80 watts apiece when they're fully loaded. The chipsets and VRMs will probably eat 30-50 watts and each stick of RAM will use about five watts. Then add about five watts per HDD for idle and twice that for full load. I'd guess a reasonably-outfitted 4P 12-core Magny-Cours server without a bunch of HDDs would take about 500-600 watts under full load. If it was in one of the $2000+ storage server cases with 32 HDDs, it still probably wouldn't draw 1000 watts. The only systems out there that can actually draw 1600 watts are some of the biggest, hottest eight-socket machines stuffed full of HDDs and RAM, with a few 40-watt hardware SAS RAID cards thrown in for good measure.

Your system probably doesn't pull 300 watts at idle unless you have several highly inefficient GPUs and a heavily overclocked CPU cooled by something other than air; if that were the case, it would pull way over 500 watts at full load. The 500 W full load I can believe, as a high-end GPU and an overclocked quad-core CPU can pull that from the wall pretty easily. But it would probably idle at more like 150 watts rather than 300.
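The power budget above is just component-wise addition. Here is a tiny calculator using the post's rough per-component guesses; every figure is an estimate, not a measurement:

```python
def system_power_estimate(cpus=4, cpu_w=80, chipset_vrm_w=40,
                          dimms=24, dimm_w=5, hdds=2, hdd_w=10):
    """Rough full-load wall-power estimate in watts, built from the
    per-component guesses in the post above."""
    return (cpus * cpu_w) + chipset_vrm_w + (dimms * dimm_w) + (hdds * hdd_w)

# 4P Magny-Cours, 24 DIMMs, a couple of HDDs:
print(system_power_estimate())         # 500 W, within the 500-600 W guess

# Storage-server build with 32 drives:
print(system_power_estimate(hdds=32))  # 800 W -- still under 1000 W
```

Even padding every estimate upward, it's hard to get a 4P box anywhere near the 1600 W that the biggest eight-socket machines can draw.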



If you have 48 cores, run one instance of Folding@Home with the -bigadv and -smp flags. You'll use all CPU cores to work on the same WU and you will get a humongous bonus in the process that will far exceed the production of 48 uniprocessor clients. I'd guess a 48-core Magny-Cours could probably hit 75,000+ PPD if it were running 24/7. A system with eight Opteron 8356s gets about 55,000 ppd and that's the old Barcelona cores with less than 1/3 of the HT speed (a big issue with an eight-die machine) and half of the memory bandwidth.
 




Some of the research I have done has been in a cancer lab. That was actually some of the most computationally intensive stuff I've done, as I ran much of it on a several-million-dollar, 512-CPU cluster. It would have been a godsend to have a 48-core local machine to myself at that point, as it took forever to get anything queued up and run on the cluster.

<pulls out soapbox> That being said, computers are only a tiny bit of what's needed to find effective treatments for various cancers. Probably the best thing we can do to hasten that is to get more children interested in the sciences and medicine so they go into those fields and do good work when they grow up. The sciences are unfairly portrayed as overly difficult, and people who are interested in them are viewed negatively (nerd, dweeb, etc.) by the media. Schools sort of perpetuate that, as elementary school teachers are almost never science majors and sometimes do a poor job of teaching it, so students aren't interested. Junior high and high school science teachers may be science majors, but by that time, students have learned that what's really important is sports and social life rather than anything academic.

College doesn't help either, as college is increasingly a place for 18-to-22-year-olds to party, drink, and screw around to "get that out of their system before they get a real job" rather than a place to learn. It's hard to screw around and go out drinking six nights a week when you are taking difficult science classes, so a lot of people major in something else that doesn't require as much work. It's no wonder few choose to go into science fields today, particularly when you can often make more money with less work in other jobs. Go look in upper-level college science classes and labs, and who do you see? Almost always, it's people who come from somewhere else where the sciences are promoted, not Western countries. The bad part for us is that they're starting to go back home rather than stay here and help us do things like try to find cures for cancers. All right, rant over. <puts away soapbox>
 

It depends on how long you plan to keep the system. Someone who upgrades every 1.5 years doesn't need to future-proof. Someone who upgrades every 3-4 years should probably get something a bit faster, subject to their primary uses of the system, of course. The Q6600 is still well ahead of an E8400 in multi-threaded tasks, so someone who bought it at launch and doesn't upgrade often got an awesome chip, provided they can use all 4 cores. If you plan on using only two cores, don't buy four.


It is absolutely the right direction with current technology. Bigger, hotter, faster cores are far less efficient than multiple smaller chips doing smaller amounts of paralleled work. It's impractical to expect frequencies to get much higher because the MOSFET is simply too power-hungry. What is really needed is to drop the 40-year-old transistors that we're using and move to something that uses far less power. Unfortunately the market demands backwards compatibility.
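The efficiency argument above follows from the standard first-order CMOS power model: dynamic power scales with C·V²·f, and pushing frequency up usually requires raising voltage too. A sketch with illustrative numbers (the 1.3x voltage bump is an assumption, not a measured figure):

```python
def dynamic_power(cap, volts, freq):
    """First-order CMOS dynamic power: P = C * V^2 * f."""
    return cap * volts ** 2 * freq

# Baseline: one core at voltage V, frequency f (normalized to 1).
base = dynamic_power(1.0, 1.0, 1.0)

# Doubling frequency typically needs a voltage bump too (say 1.3x):
one_fast_core = dynamic_power(1.0, 1.3, 2.0)       # ~3.38x the power

# Two baseline cores give the same ideal throughput for only 2x power:
two_slow_cores = 2 * dynamic_power(1.0, 1.0, 1.0)  # 2.0x the power

print(one_fast_core, two_slow_cores)
```

That gap is why, for parallelizable work, vendors went wide instead of fast: the slow-and-many option delivers the same throughput for noticeably less power.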



I don't know where you got the idea that Vista couldn't use more than 2 cores. The thread schedulers in Win 7 and Vista are pretty close. XP is pretty far behind, but being 8 years old, that's expected. You could always use Linux if you wanted thread scalability in the OS itself.

On another note, expecting all software to be multi-threaded is a bit narrow-minded. Some tasks don't lend themselves to parallelism, and others wouldn't benefit enough from it to justify the high complexity of multi-threading a program. It's not a simple task.
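To illustrate why some tasks don't lend themselves to parallelism: any computation with a loop-carried dependency, where step n+1 needs step n's result, is inherently sequential no matter how many cores are available. A toy example, not from the thread:

```python
def iterate(x0: float, steps: int) -> float:
    """Logistic-map iteration: each value depends on the previous one,
    so the loop cannot be split across cores -- there is no way to
    compute iteration 1000 without first computing iteration 999."""
    x = x0
    for _ in range(steps):
        x = 3.7 * x * (1.0 - x)   # step n+1 consumes step n's output
    return x

print(iterate(0.5, 10))
```

Contrast that with rendering or chunked compression, where independent pieces of work exist naturally; the hard engineering is in workloads that sit somewhere in between.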

Anyway, this is a competition. If you don't want 48 cores then don't enter. I'd still want it even if I couldn't use it, but I'm not from the US.
 



+1 seems like such a lame response to this, yet I could not say any of it better.