Desktop vs Server - What's the difference?

Servers are generally built for durability and redundancy (redundant storage and/or power supplies, sometimes hot-swappable), and some servers also have a tool-less modular design for ease of service and upgrades.

In a nutshell, servers are designed for reliability, throughput, and memory performance. A typical server will have more CPUs/CPU cores than a desktop, a lot more memory, and more peripheral I/O, but the CPUs usually run at a lower clock speed than a typical desktop CPU, and the server will cost a LOT more. You also can't overclock servers, for the most part.
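
As a rough illustration of that tradeoff, here's a minimal sketch (assuming Linux, since it reads /proc/cpuinfo) that just prints the logical core count and the first core's reported clock. On a typical server you'd see a high core count next to a roughly 2-3 GHz clock, where a desktop shows fewer cores at 4 GHz or more:

```python
#!/usr/bin/env python3
# Minimal sketch (Linux-only): print the two numbers the core-count vs.
# clock-speed tradeoff is about. Output varies by kernel and CPU.
import os

print(f"Logical CPUs: {os.cpu_count()}")

with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("cpu MHz"):
            # Current clock of the first core; server parts often sit
            # around 2-3 GHz here, desktop parts at 4 GHz and up.
            print(f"Clock: {float(line.split(':')[1]):.0f} MHz")
            break
```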

You can play games on a server and do all the same things you can do on a desktop, as long as the OS is the same and you have a suitable graphics card in the server. Most dedicated servers do not have a graphics card; usually they aren't even connected to a monitor. My "desktop" consists of a quad-CPU server board and it plays games and such just fine, but you could buy a reasonable desktop system for just what my motherboard cost. However, my system is very, very good at highly multithreaded tasks (it can have up to 64 cores, 16 memory channels, and 512 GB of RAM...it should be fast!) and would run circles around a typical desktop in those tasks, which is why I got the server gear.
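
If you want to see what "highly multithreaded" buys you, here's a minimal sketch of the kind of embarrassingly parallel job where core count beats clock speed: the same CPU-bound task timed with one worker and then with every core. The workload (summing squares) is a made-up stand-in for any parallel batch job:

```python
#!/usr/bin/env python3
# Minimal sketch: time an embarrassingly parallel, CPU-bound workload
# with 1 worker vs. all cores. On a many-core server the speedup figure
# is where it runs circles around a desktop.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> int:
    # Made-up CPU-bound task; stands in for any parallel batch job.
    return sum(i * i for i in range(n))

def timed(workers: int, jobs: int = 64, n: int = 2_000_000) -> float:
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [n] * jobs))
    return time.perf_counter() - start

if __name__ == "__main__":
    cores = os.cpu_count()
    t1, tn = timed(1), timed(cores)
    print(f"1 worker: {t1:.1f}s | {cores} workers: {tn:.1f}s "
          f"| {t1 / tn:.1f}x speedup")
```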