I apologize if this question has been done to death: AMD FX-8350 vs. Intel i5-3570K. I know that currently the Intel chip thumps AMD pretty much across the board in gaming, but my question is this: in the next 3 to 4 years, will games eventually take advantage of more cores than they do now?
My last major PC upgrade was when I bought my AMD Phenom II 940. I'm currently using that CPU OC'ed to 3.4GHz and I've had it 4+ years. I think for the same price it has weathered advancements in games and programs better than the equivalent Intel option at the time.
I'll be mating the CPU/motherboard/RAM to an HIS 6870 video card and a Thermaltake Frio cooler (overclocking *is* an option regardless of the CPU). Once again, I know the performance differences between the two CPUs right now; anyone can Google the benchmarks and draw their own conclusions. What I want thoughts on is how well either of these CPUs will age over the next 3-4 years.

