Nvidia Says Core i7 Isn't Worth It

[citation][nom]DjEaZy[/nom]... and tha tests show... if tha resolution gets higher, then tha difference between i7 and AMD Phenom 940 iz not so noticable or not at all... tha GPU do tha physics now, video compression... http://www.guru3d.com/article/amd- [...] iew-test/1 http://www.tomshardware.com/review [...] ,2278.html[/citation]
PhysX is done on the GPU, Havok is done on the CPU - that is how both engines have worked from the start (with a GPU version of Havok being written right now as well). It depends on which physics engine a particular game uses.

btw, tha = the
 
All I can say is that nVidia is lying through their teeth. Don't believe me? Check out how performance scales in Crysis when you change the number of CPU cores available to Windows:

http://www.tomshardware.com/reviews/multi-core-cpu,2280-10.html

While using only one core, the performance of the test system drops by 60%. This clearly indicates that a faster CPU should be prioritized over a high-end graphics card.
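
For anyone who wants to try a similar experiment at home: Windows lets you limit cores at boot (msconfig's "Number of processors" option) or per-process via Task Manager's "Set affinity". Here is a rough sketch of the same idea scripted with Python's psutil library; the process name is only an example:

[code]
import psutil

# Rough sketch: restrict a running game to a single CPU core (core 0),
# then compare in-game FPS against an unrestricted run.
# "Crysis.exe" is just an example process name.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Crysis.exe":
        proc.cpu_affinity([0])  # the game may now only run on core 0
        print(f"Pinned PID {proc.pid} to core 0")
[/code]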

I have been building PCs for over 10 years, and this has been my real-life experience as well.

With an E2160 (1.8GHz, 2MB L2 cache) CPU / 4GB 800MHz CL4 / MSI P45 / 4870 1GB graphics card, you get around 28 average fps (8!! min / 48 max) in Crysis @ 1680x1050 / high graphics settings, no AA, 16x AF.

When we slap on an E8600 (3.3GHz, 6MB L2 cache), the frame rate rockets to ~38 fps. With a Q9550 OC'd to 3.4GHz (2.8GHz stock, 12MB cache), the average frame rate grows to 42 fps.
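
To put those figures side by side (a back-of-the-envelope check using only the FPS numbers quoted above, same 4870 1GB in every case):

[code]
# Average Crysis FPS from the runs described above; only the CPU changes.
fps = {"E2160 @ 1.8GHz": 28, "E8600 @ 3.3GHz": 38, "Q9550 @ 3.4GHz": 42}

base = fps["E2160 @ 1.8GHz"]
for cpu, rate in fps.items():
    print(f"{cpu}: {rate} fps ({(rate / base - 1) * 100:+.0f}% vs. E2160)")

# E8600: +36%, overclocked Q9550: +50%, a big jump from the CPU alone.
[/code]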

In my opinion, the CPU carries more of the load than the GPU here. Mostly because games like Crysis are CONSOLE PORTS, which run emulation components in the background.

Of course, if you have a low-end card, it doesn't matter if you have a P4 or an i7, your gaming experience will suck.

My advice is to buy a mid-high or high-end CPU (like Intel's E8400, Q9550, or even i7 920, or AMD's Phenom II 720 or Phenom 940) and a mid or mid-high end GPU (like AMD's 4830, 4850, or 4870, or nVidia's 9800 GTX, GTS 250, or GTX 260).

I guarantee that if you run an expensive high-end GTX 285 with a low-end CPU like a Pentium Dual-Core E2160, you will not be able to take full advantage of your GPU: it will run at ~30-40% of its potential, and you will be wasting your money.
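
One way to check whether your GPU is actually being held back like that is to sample its utilization while the game runs. A minimal sketch with the pynvml Python bindings for NVIDIA's management library (assuming an nVidia card and a driver that exposes NVML):

[code]
import time
import pynvml  # Python bindings for NVIDIA's management library (NVML)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Sample GPU load once per second for 30 seconds while the game runs.
# A GPU sitting far below 100% at playable settings usually means the
# CPU, not the graphics card, is the bottleneck.
for _ in range(30):
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"GPU load: {util.gpu}%")
    time.sleep(1)

pynvml.nvmlShutdown()
[/code]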

Another factor to take into consideration is that CPUs evolve considerably slower than GPUs. Just look at the reign of the Core 2 Duo: the Core i7 launched roughly two and a half years after the Core 2 Duo did. As for GPUs, just count how many models launched in that same window.
 
 
I think first of all you have to keep it in context; this is about *gaming performance*, not multitasking or Photoshop, etc.

And for gaming I think we can all agree that for mid-range and high-end systems alike, performance depends most on the GPU.

What I'd like to see: an article titled "How low can you go?" with benchmarks comparing systems with different CPUs and the same GPU (e.g. Core 2 Duo w/ GTX 260 vs. Pentium 4 w/ GTX 260, etc.). I think that would really paint a good picture of how and when a bottleneck actually occurs in the real world, and educate gamers on a budget about what they can expect from low-cost PC parts.
 
I own AMD myself, but by bringing out this monster of a CPU, the i7, isn't Intel looking further into the future of gaming/computing? With nVidia coming out with a new card every couple of months, they are just chasing what people will buy in the short term, not a purchase for the future that will keep performing nicely. Notice how there still isn't a stock CPU that can beat the i7?
 
Nvidia:
We have no use for your GPUs, so stop overcharging us for the performance we get, or else ATI will be your punishment!
 