Sharing the load with three computers/ motherboards

Leonell12

Honorable
Apr 8, 2013
Hi guys, I have a bit of a project on my hands. I'm not terribly experienced in this sort of thing, but it's interesting. I recently put together two desktop systems from very cheap used parts, and both of them are running Ubuntu Linux. I also have a laptop running Windows 10. What I'd like to do, if it's possible, is share the load of one program (say Minecraft with tonnes of mods) across the three systems. Perhaps by hooking them up in some way, or by using the two desktops as some sort of server (master/slave?) while using the laptop as the interface device. I don't really know what the possibilities are; all I want is the ability to share the processing (CPU) load across the three systems. Any ideas, suggestions, or pointers in the right direction are welcome!
 

Eximo

Titan
Ambassador
Nothing is really possible here. Software has to be designed for clustering or load sharing. That involves a master system sending packets of calculations to other systems, having them processed, then getting them back and compiling them into a completed piece of work. It doesn't really work with game clients and servers in general.

Large MMOs use this type of technology. Other examples would be render farms for 3D work, and supercomputers.

Two of those three are 100% custom software, and only jobs that split up easily can be handled that way.

Internally it is similar to GPU computing, where each core on a GPU is more or less treated as a single system doing part of the work.
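
To make the idea concrete, here is a toy sketch in Python (my own illustration, single machine, using the standard multiprocessing module) of the master/worker pattern: the master splits a job into independent chunks, workers crunch them, and the master compiles the results. Across real machines you would replace the pool with sockets or MPI, but the structure is the same, and it only works because the chunks don't depend on each other, which a live game simulation does not give you.

```python
# Toy master/worker load sharing on one machine.
# The "master" splits the job into independent chunks, the "workers"
# process them in parallel, and the master compiles the results.
from multiprocessing import Pool

def worker(chunk):
    # Stand-in for a "packet of calculations" sent to a worker system.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # master splits the work
    with Pool(processes=4) as pool:
        partials = pool.map(worker, chunks)   # workers process their pieces
    print(sum(partials))                      # master compiles the finished work
```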
 
Solution

Leonell12

Honorable
Apr 8, 2013


Hmm, I see. What about services like OnLive? Do they use similar custom-designed software with a master system and so on? Linux has Ubuntu Server as an OS as well; can that be used somehow?
 

Eximo

Titan
Ambassador
OnLive is a little different. It is similar to an enterprise-class VM structure: they have powerful computers running spawnable guest OSes with games pre-loaded, which I presume are deleted at log-off. In the enterprise, more and more servers themselves are being virtualized; a single machine with a heavy-duty multi-core processor and lots of RAM might host all the servers in a company. Dumb terminals take advantage of this too: all the processing is done server-side, and the 'client' is just a basic computer running a single client application.

I assume OnLive also takes advantage of the recent introduction of GPU virtualization, which allows GPU cores to be split up based on demand, similar to the way CPU cores have been virtualized for a while now.

A server OS is not much different from a 'client' OS; they really do the same things. Server OSes are geared toward more automated workloads, though, such as web hosting, file hosting, acting as a firewall, game hosting, etc. Security is often at a much higher level as well.

Effectively you already have limited load sharing with any online game. The server tracks all player data and player input, as well as environmental information, and routes it to all the other players. Generally servers do not run the game's graphics engine; they only serve the data required for the game clients to function together.
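
If it helps to see it, here is roughly what that "just serve the data" role boils down to, stripped down to a toy UDP relay in Python. The port number and the player_id:x,y message format are made up for illustration and have nothing to do with how a real game protocol looks.

```python
# Toy authoritative game server: collect player state and relay it to
# everyone else. No graphics engine, no rendering -- just data routing.
# (Hypothetical protocol: each datagram is "player_id:x,y".)
import socket

HOST, PORT = "0.0.0.0", 5005            # made-up port for illustration
players = {}                            # player_id -> (address, last position)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))

while True:
    data, addr = sock.recvfrom(1024)
    player_id, pos = data.decode().split(":", 1)
    players[player_id] = (addr, pos)    # track player data and input

    # Route the updated world snapshot to every known client.
    snapshot = ";".join(f"{pid}:{p}" for pid, (_, p) in players.items()).encode()
    for client_addr, _ in players.values():
        sock.sendto(snapshot, client_addr)
```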

Haven't had a good technical topic on here in a while. These are always fun.
 

Leonell12

Honorable
Apr 8, 2013


Hmm, I came across a little something here: https://en.wikipedia.org/wiki/Beowulf_cluster . Apparently something like this can be set up using OSCAR ( http://svn.oscar.openclustergroup.org/trac/oscar ), with the machines all connected together over the LAN. Thoughts?

 

Eximo

Titan
Ambassador
A Beowulf cluster is an old concept, and more or less the basis for large-scale computing on the cheap. (One of my local universities built a Linux cluster out of PS2 gaming consoles.)

You would still need custom programming to actually handle the data to be processed. The cluster provides the communication layer, but not any of the actual useful programming. You aren't going to be able to take the Minecraft client and server and distribute anything meaningful across it; any added latency would likely hurt performance, if anything.
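
To give a sense of what that custom programming looks like in practice, Beowulf-style jobs are usually written against MPI. A minimal scatter/gather sketch using the mpi4py library (my example, not something OSCAR hands you) looks like this; the hostfile and node count are placeholders:

```python
# Minimal MPI scatter/gather: the root node splits the work, every node
# processes its share, and the root gathers the partial results back.
# Run across the cluster with something like:
#   mpirun -n 3 --hostfile hosts python job.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()      # this node's id within the cluster
size = comm.Get_size()      # number of nodes taking part

if rank == 0:
    data = list(range(90))
    chunks = [data[i::size] for i in range(size)]   # root splits the work
else:
    chunks = None

chunk = comm.scatter(chunks, root=0)     # each node receives its piece
partial = sum(x * x for x in chunk)      # the part you have to write yourself
results = comm.gather(partial, root=0)   # root collects the answers

if rank == 0:
    print("total:", sum(results))
```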

I could vaguely see a scheme where you distribute the memory required for a large-scale Minecraft world, but it would be more efficient to simply add memory to the host system.
 

Leonell12

Honorable
Apr 8, 2013


Hmm, alright, seems like nothing can be done then. Anyways, thanks for all your help! I really appreciate it.