yomamafor1
The way I see it, the actual time gained using an i7 for encoding or graphics work is completely overstated.
Even if you work at that day after day, chances are you spend very little time on the part that actually matters for benchmarks. You know, hit F5 or whatever and render the actual scene. You spend a *lot* more time idling or working on the scene than rendering it. And even when you do render it, chances are you go have a smoke or a cup of coffee - i7 or not.
By the end of the day you might have saved a whole 3 minutes. The only market where performance actually counts on a regular basis is gaming - everything else is totally synthetic in comparison.
Or so you think.
For those who actually render and encode for a living, the Core i7 usually shows a 20~30% improvement over its competition. Given that it was originally designed as a server-oriented CPU, it easily justifies its cost for CPU-intensive workloads (HPC, for instance); it has been shown eating two Shanghai Opterons alive with room to spare.
But of course, for someone who thinks gaming is the only real market, there's not much to argue about.