I don't really see the difference between grid computing and cloud computing. The only real difference is that one is a term created by academia and the other is a marketing term popularized by industry, IBM among others. Beyond that, they are pretty similar.
The definition I would use for both cloud and grid computing is: software technology that lets a user harness the power of networked computers without taking on the complexity those systems create. These are the so-called non-functional requirements that also come up in autonomic computing, another term coined by IBM. In the cloud, a programmer can implement a solution with no concern for the environment; the environment scales to the needs of the program based on a "performance contract". For the user, the application looks almost the same as the programs they used before, because all the technology is under the hood.
Good examples of these technologies are MapReduce, a library for distributing work across a cluster, and Google App Engine, an API that lets you build a website and will also scale your web app automatically depending on traffic.
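To make the MapReduce idea concrete, here is a toy single-process sketch of the programming model in Python (this is my own illustration, not Google's actual library): the programmer writes only a map step and a reduce step, and the framework, here collapsed into one function, handles distributing the work.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in one document.
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(documents):
    # On a real cluster the map calls would run in parallel on many
    # machines; here they run sequentially just to show the data flow.
    mapped = chain.from_iterable(map_phase(doc) for doc in documents)
    return reduce_phase(mapped)
```

The point is that `map_phase` and `reduce_phase` contain no networking or scheduling code at all; in a real deployment the cluster could grow from one machine to thousands without changing them, which is exactly the "complexity under the hood" property described above.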
So yes, these terms are being used for marketing purposes. But since the benefits are not obvious to the end user, how else are you going to sell this and become profitable in this lifetime?