Why not two CPUs?

Treeace

Jun 25, 2015
I'm still learning, so I'd take any information on this! I'm sure the businesses that build computers have already thought of this and aren't doing it for some reason I have yet to learn. That said, I'd think it would make high-end computers look like toys if they added two CPUs! Is there a reason this isn't done? Is it in future plans?

Thank You
 
Commercial servers and supercomputers use 2 or more CPUs all the time; it is simply not cost-effective for consumer-level PCs to use more than one CPU. Remember, programs have to be written to handle more than one thread at a time, and need even more programming to utilize multiple CPUs. Programming games and office software for it would not be worth the cost or trouble. The OS also has to be written to spread the workload across multiple CPUs. And like GPUs, adding CPUs does not scale linearly, so 2 CPUs is not 200% as strong.

Back in the day I had a system with 2 CPUs (Pentium 1s, I believe) and through benchmarking and comparing I think we saw something like a 50% increase in computing power. This was using Windows NT, if I recall correctly (it was over 20 years ago, so it's possible the scaling has changed since).
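The "not 200% as strong" point above is usually explained with Amdahl's law: if only part of a program's work can run in parallel, a second CPU gives far less than a 2x speedup. A rough sketch (the 60% parallel fraction is just an assumed figure for illustration, but it lands close to the ~50% gain described above):

```python
def amdahl_speedup(parallel_fraction, n_cpus):
    """Amdahl's law: overall speedup when only parallel_fraction
    of the work can be spread across n_cpus; the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cpus)

# If 60% of the work is parallelizable (an assumed figure),
# two CPUs give roughly a 1.43x speedup -- not 2x.
print(round(amdahl_speedup(0.6, 2), 2))   # -> 1.43

# Even with unlimited CPUs, the serial 40% caps the speedup at 2.5x.
print(round(amdahl_speedup(0.6, 1000), 2))
```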
 
Solution
There are already dual and quad CPU systems for businesses. Some run more than this, though that's very specialized, for large servers or rack units. Most software is coded to use only one CPU with multiple cores, and even then not all software is well threaded. To use a multiprocessor system (not multi-core like the FX 8xxx or i7s), the program has to be written to take advantage of it. That adds a lot of complexity and wouldn't be that beneficial for most mainstream programs people use.

https://en.wikipedia.org/wiki/Multiprocessing

http://www.technologyreview.com/article/418580/multicore-processors-create-software-headaches/

http://www.grammatech.com/blog/multi-core-processors-headache-for-multithreaded-code
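To make the "the program has to be written to take advantage of it" point concrete, here is a minimal Python sketch: the same work only spreads across CPUs/cores when the programmer explicitly farms it out (the function and numbers are just for illustration):

```python
from multiprocessing import Pool

def heavy_task(n):
    """Stand-in for some CPU-bound work (illustrative only)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [200_000] * 8

    # Serial version: runs on one core, no matter how many
    # CPUs the machine has.
    serial_results = [heavy_task(n) for n in jobs]

    # Parallel version: only because the code was explicitly
    # written this way does the OS get to spread the jobs
    # across multiple CPUs/cores.
    with Pool(processes=4) as pool:
        parallel_results = pool.map(heavy_task, jobs)

    assert serial_results == parallel_results
```

Same results either way; the hardware can't parallelize the serial version for you, which is why extra sockets sat idle for most consumer software.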

For most tasks the majority of mainstream users do, even quad-core CPUs are typically enough (or more than enough), aside from maybe rendering video and similar jobs, which can take hours. Rendering, even on an 'enthusiast' PC, is still pretty slow; it's more or less dragging a professional task onto a home PC. Professional environments that need to render a lot (heavy workloads, CGI, etc.) use a different type of computing: cluster computing (a render farm).

https://en.wikipedia.org/wiki/Render_farm
 
Servers back in the '90s used 2-4 CPUs. I also had a dual Pentium Pro machine for 12 years, plus a dual Celeron setup and that failed AMD MP platform, which didn't last long.

CPUs don't need to be paired up anymore since they have multiple cores of their own; however, top-end servers and supercomputers still run multiple CPUs.