Personally, I would use a hosting service for 95% of the websites that I deal with. The costs, reliability, and maintenance concerns all push in that direction. The only real advantage of on-site web hosting is better control. kanewolf's comment is spot-on regarding how to decide between the two.
If you're able to tell us, more information regarding the type of website you want to host would help us make better recommendations.
If you just want to understand the thought process well enough to make the judgement yourself, I'll try to condense it into something digestible. There are three pieces to the puzzle. First, you have to gauge the cost of downtime: what an hour of the site being unreachable actually costs you in lost sales or productivity. That number tells you how much reliability is worth paying for, so it has a massive impact on the decisions made in the other parts of the puzzle.
The second piece of the puzzle is determining the requirements of the server: the reliability, performance, and TCO you need from it. For reliability, you normally want something between three 9's (99.9% uptime) and five 9's (99.999% uptime). Sometimes you need more, but only in very specific cases. Three 9's of reliability allows the server to be down for roughly 8.8 hours per year. If it's a single server, that budget generally leaves room for about one short reboot per week, plus one chance per year to make a hardware change. When working toward that number, you have to account for hardware failure, the reliability of your internet connection, the occasional need to update the server, the impact of individual components failing, and the power quality in your area. In my experience, hot-swap hard drives (which means you need a RAID card) and a redundant PSU are requirements for achieving three 9's of reliability. Achieving any more than that is nearly impossible for most small businesses, as it would require backup power for both the server and the internet connection, and possibly a redundant server.
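To make those availability targets concrete, here's the straight arithmetic behind the "nines" figures (these are just math on the percentages, not vendor SLA numbers):

```python
# Annual downtime budget implied by each availability level.
HOURS_PER_YEAR = 365 * 24  # 8760

for label, availability in [("three 9's", 0.999),
                            ("four 9's", 0.9999),
                            ("five 9's", 0.99999)]:
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{label}: {downtime_hours:.2f} hours/year "
          f"({downtime_hours * 60:.1f} minutes/year)")
```

Note how steep the curve gets: three 9's gives you about 8.8 hours of downtime per year, while five 9's leaves barely 5 minutes, which is why anything past three 9's demands redundant everything.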
The amount of processing power you need in a server depends on exactly what you're trying to do with it. If you're running a basic website with no database functions and little server-side scripting, you can get perfectly acceptable performance from Celerons two generations old. In fact, those are capable enough to run a Minecraft server for several players. The picture changes when you start adding in database functionality, heavy server-side scripting, and other features. Aside from compute servers and virtualization, you normally only need a Xeon E3-1240 at most for a small-business website (I'm assuming the server will only have one function, as is best practice), and even that is a rare thing indeed.
For TCO, you have to consider the lifetime and cost of each hardware component, the cost of electricity (including cooling), the cost of a suitable internet connection, and the cost of the time spent maintaining the system. Honestly, I've never really dealt with this specific metric much, with the exception of time spent maintaining the system and the cost/lifetime of components.
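A back-of-the-envelope TCO calculation might look like the sketch below. Every dollar figure here is a made-up placeholder; substitute your own hardware quotes, electricity rate, and hourly rate:

```python
# Rough lifetime TCO for one on-site server. All inputs are
# illustrative placeholders, not real quotes.
LIFETIME_YEARS = 4

hardware_cost = 1500.0       # server, RAID card, redundant PSU, drives
power_watts = 150            # average draw, including cooling overhead
electricity_rate = 0.12      # $/kWh
internet_monthly = 100.0     # business line with a static IP
admin_hours_monthly = 2      # patching, monitoring, backups
admin_hourly_rate = 50.0

electricity = power_watts / 1000 * 24 * 365 * LIFETIME_YEARS * electricity_rate
internet = internet_monthly * 12 * LIFETIME_YEARS
labor = admin_hours_monthly * admin_hourly_rate * 12 * LIFETIME_YEARS

tco = hardware_cost + electricity + internet + labor
print(f"Total: ${tco:,.2f}  (${tco / (LIFETIME_YEARS * 12):,.2f}/month)")
```

With these placeholder numbers, the hardware itself is a small slice of the total; the internet line and admin time dominate, which matches my experience of where the money actually goes.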
At this point, you're ready for the third piece: pricing out suitable components for your server and comparing that total against cloud alternatives. I've found that in general, cloud services are the better deal (often by a wide margin). If you need the control of on-site hosting, then be prepared to pay a pretty penny up front, as well as for the internet connection.
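The comparison itself is simple once you amortize the upfront hardware cost over its expected lifetime. Again, the dollar amounts below are placeholders for illustration, not real quotes:

```python
# Amortize the on-site server's upfront cost and compare against a
# hosted plan. All figures are illustrative placeholders.
upfront = 1500.0            # server hardware bought outright
onsite_recurring = 210.0    # $/month: power, internet line, admin time
hosted_monthly = 30.0       # small-business hosting plan
lifetime_months = 48        # 4-year hardware lifetime

onsite_monthly = upfront / lifetime_months + onsite_recurring
premium = onsite_monthly - hosted_monthly
print(f"on-site: ${onsite_monthly:.2f}/mo vs hosted: ${hosted_monthly:.2f}/mo")
print(f"premium paid for control: ${premium:.2f}/mo")
```

If the control on-site hosting buys you isn't worth that monthly premium, the decision makes itself.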