Mike Friesen :
$500 + $60 per year = $980 over 8 years.
$700 for an i5 and a 7870/R9 270 leaves $280 for a future graphics card; the CPU will still probably be fine. Maybe a little more, but who likes supporting pay-per-month?
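For what it's worth, the arithmetic in the quoted plan does come out even. A quick sketch with the numbers taken straight from the quote above (the $60/year is the assumed online-subscription cost):

```python
# Rough cost comparison over one 8-year console generation.
# All figures are the assumptions from the quoted post, not measured data:
# $500 console + $60/year subscription vs. a $700 PC build + $280 GPU upgrade.
YEARS = 8

console_total = 500 + 60 * YEARS   # 500 + 480
pc_total = 700 + 280               # initial build + mid-cycle graphics card

print(console_total)  # 980
print(pc_total)       # 980
```

So on paper the two budgets land on exactly the same $980; the disagreement below is about whether that PC budget actually survives 8 years of hardware progress.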
I kinda follow your logic, but I think you are not being realistic about the progress technology makes within 8 years.
Let's say you had done what you suggest 7 years ago. Back in 2006, CPUs were far more expensive, but you could get a Pentium D 950 for about $250, since the first Core CPUs had just been released. You could also get a GeForce 7900 GT for about $300; it came with a whopping 256 MB of video RAM. That is already $550, but let us be generous and say you could get all the remaining components (HDD, RAM, mainboard, optical drive, case, PSU, etc.) for $150. (All of those components were significantly pricier back then.)
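Adding up those 2006 prices (all approximate, as stated above) shows the point: the hypothetical 2006 build already eats the entire $700 budget of the suggested plan before any upgrade money is spent.

```python
# Approximate 2006 build from the example above (prices in USD, rough estimates).
parts_2006 = {
    "Pentium D 950": 250,
    "GeForce 7900 GT (256 MB)": 300,
    "remaining components (generous estimate)": 150,
}

build_total = sum(parts_2006.values())
print(build_total)  # 700
```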
So, after 4 years you buy a new graphics card for $280, as per your suggestion. Around 2010, you could have gotten a GeForce GTX 560 Ti for about that price.
Now you have a 7-year-old processor on a cheap mainboard that only supports up to PCI Express 1.1 and DDR2 RAM, paired with a GeForce GTX 560 Ti. You will also have maybe 2 GB of RAM (being generous). And you are saying that gaming on this is fine for another year? That you can play all the games still being released for the PS3 nowadays? (The PS3 launched on November 11th, 2006, slightly more than 7 years ago.)
I get the argument that PCs can be kept up to date more easily, but let's be realistic here: to keep one even remotely up to date over the lifetime of a console generation, you will have to spend more money than you are suggesting.
So the only way your suggestion works is if technological progress in the game industry and PC market slows down SIGNIFICANTLY over the next 8 years compared to the previous 8. Even if it does slow down, I cannot believe it will slow down as much as it would need to for your plan to work.