Nvidia Says Core i7 Isn't Worth It


Harby

Distinguished
Dec 11, 2008
[citation][nom]balthazar2k4[/nom]My i7 920 @ 4ghz will easily outperform a core 2 duo at the same clock speed using the same GPU. I have a C2Q and C2D and neither come close in any game I have tried. While I agree that most games are GPU reliant, the chipset/processor/memory combination of the i7 is superior. This is simply more skewed propaganda from nVidia. I own a GTX 260 and think it is great, but I wouldn't give up my i7 for a faster GPU....[/citation]

If your PC's major role is playing games rather than CPU-intensive work like encoding, then yes: C2D + GTX 285 > i7 + GTX 260.

That's the essence of the article, and it's dead right.
 

wikiwikiwhat

Distinguished
Dec 4, 2008
I was actually thinking about getting the i7... have been for a year and a half now. I'm running a quad core with two GPUs on XP and two 1920x1080 monitors. I have about 12 apps running right now, and neither memory nor the CPU is really taking a hit. I think I'm good. Hell, I don't need an SSD because I can wait a few more seconds, and I don't need DDR3 because there is absolutely no need for it at all. When people buy this stuff they're buying server parts without even knowing it, or without considering that it's overkill. If you're a graphic artist, animator, or anything like that, you shouldn't be buying this either; your employer should.
 

Caffeinecarl

Distinguished
Jun 9, 2008
The CPU does have some impact on general gaming performance. If it's not capable enough, the GPU's performance might not be completely realized. That was the case when I ran a GeForce 8600 GT with a Pentium D 940. Once I upgraded the CPU to a Core 2 Duo E8400 (with a suitable motherboard), I was getting about 50% more juice out of the 8600 GT.

HOWEVER! The upgrade I made prior to that was going from a Radeon X550 to the GeForce 8600 GT. From that upgrade alone my performance skyrocketed by about 700%, despite the GPU not being fully utilized with the slower processor. I've seen similar results upgrading from the 8600 GT to my new GeForce GTS 250, especially at the higher resolutions where I really prefer to play. The conclusion I draw from my scenario is that unless your CPU is drastically underpowered (as mine was), you're really better off buying a faster GPU (or an additional one, for you SLI/CrossFire folks).
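
To put numbers on that, here's a toy "slowest stage wins" model in Python. The millisecond costs are invented to mirror the upgrades described above; they're illustrative, not benchmarks:

[code]
# Toy frame-time model: a frame needs both the CPU (game logic, draw
# submission) and the GPU (rendering) to finish, so the slower stage
# sets the frame rate. All millisecond costs below are invented to
# mirror the upgrade story above, not real measurements.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

base    = fps(cpu_ms=45.0, gpu_ms=30.0)   # slow CPU + 8600 GT: CPU-bound
new_cpu = fps(cpu_ms=20.0, gpu_ms=30.0)   # E8400 + 8600 GT: GPU-bound now
print(f"CPU upgrade: +{new_cpu / base - 1:.0%} fps")     # ~+50%, like the post

old_gpu = fps(cpu_ms=45.0, gpu_ms=360.0)  # slow CPU + X550-class GPU
new_gpu = fps(cpu_ms=45.0, gpu_ms=30.0)   # slow CPU + 8600 GT
print(f"GPU upgrade: +{new_gpu / old_gpu - 1:.0%} fps")  # ~+700%, like the post
[/code]

The point the model makes: the GPU upgrade pays off hugely even behind a weak CPU, and the CPU upgrade only unlocks the part the GPU was already capable of.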
 

Caffeinecarl

Distinguished
Jun 9, 2008
[citation][nom]eddieroolz[/nom]Hey, that's like my config! Good to know I can get 42fps from Far Cry 2 if I install it But on topic though, it's not a surprise coming from nVidia.[/citation]

*high five* Great minds think alike! :)
 
Guest
Nvidia is making video cards that bottleneck systems, then they turn around and talk sh*t about the CPU when their 8800/9800/250 is to blame.

A real test would be to put a top-of-the-line card in dual/tri/quad SLI and then run the benchmark.

Nvidia talks too much sh*t.
 

zodiacfml

Distinguished
Oct 2, 2008
[citation][nom]toalan[/nom]Nvidia is making video cards that bottleneck systems, they turn it around and talk sh*t about the CPU when their 8800/9800/250 is to blame.A real test would be to put a top of the line card in dual/tri/quad sli and then do the benchmark.Nvidia talks too much sh*t,[/citation]

Amen, brother, +1.
But they're shooting themselves in the foot, since it's only the i7 that lets the most powerful SLI setups reach their full potential.
Or... that technical marketing director isn't doing his homework.
 

cielmerlion

Distinguished
Oct 22, 2008
[citation][nom]blackpanther26[/nom]one question why do they compare the i7 965EE $1000 chip to the E8400 $167 chip. When they could have used the i7 965 at $280? Is it Nvidia wants to scare people away from the i7 becuase of the way Intel is acting latly?[/citation]

Obviously because if they had used the cheaper chip, the difference would have been even smaller.
 

cielmerlion

Distinguished
Oct 22, 2008
[citation][nom]zodiacfml[/nom]amen brother, +1.but they're shooting themselves on the foot, since it's only in i7 that allow utilize/enable the potential of most powerful cards in SLI.or........that technical marketing director is not doing his homework.[/citation]

Eh, but how many people can afford that? How many people actually bottleneck their computers with multi-GPU platforms? It's a minority; most gamers nowadays game on a budget.
 

antic84

Distinguished
Mar 22, 2009
Didn't Nvidia itself market SLI as double the performance, when the real-life boost was far from it?

It's all marketing hype and very specific scenarios. Although it's true that most games are GPU-bound, most likely Nvidia is just trashing the Core i7 because Radeons work better with it and they don't like that.
 
Guest
@toalan: Nvidia and ATI aren't bottlenecking games; the laws of physics and the limitations of a standard wall outlet are. With cards already running a 200 W TDP, they can't draw much more juice than they already do. If you think you know how to make a graphics card that won't bottleneck games, you're welcome to design your own GPU.

PS: I hate Intel and Nvidia, but Nvidia is right this time. Credit where credit is due...
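
The wall-outlet point is easy to sanity-check with back-of-the-envelope arithmetic. This assumes a North American 15 A / 120 V circuit and ballpark component figures for a 2009-era build, not measurements of any specific card:

[code]
# Rough power budget vs. a standard wall outlet. Component figures
# are ballpark assumptions, not measurements.
outlet_watts = 15 * 120   # 1800 W theoretical limit of a 15 A / 120 V circuit

gpu_tdp = 200             # high-end single GPU, per the post above
cpu_tdp = 130             # high-end quad-core CPU
rest    = 100             # motherboard, drives, fans, RAM
psu_eff = 0.80            # ~80% efficient power supply

for n_gpus in (1, 2, 4):
    wall_draw = (n_gpus * gpu_tdp + cpu_tdp + rest) / psu_eff
    print(f"{n_gpus} GPU(s): ~{wall_draw:.0f} W at the wall "
          f"({wall_draw / outlet_watts:.0%} of the circuit)")
[/code]

A single 200 W card has plenty of outlet headroom, but the quad-SLI test toalan asks for starts crowding the ~80% continuous-load rating electricians plan around.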
 

sonofliberty08

Distinguished
Mar 17, 2009
Everyone who isn't a stupid, brainwashed Intel fanboy or a noob knows that Intel is BSing. The benchmarks run well on Intel CPUs because Intel paid the software makers to make them run well on Intel's products.
 

radiowars

Distinguished
Feb 15, 2009
Right NOW, yes, the article is right. But because the GPU is the bottleneck today, the i7 will stay relevant a lot longer than a GTX 285 will. Just because it makes sense now doesn't mean it will once GPUs catch up to CPUs.
Just my take.
 

DjEaZy

Distinguished
Apr 3, 2008
... at last... someone talks sense!!! I don't just use the CPU to transcode videos for the iPhone... I use my HD 4870 for that... a 1h45min MPEG-2 video took iTunes 34 minutes on the CPU... with my HD 4870 the compression was done in 8 minutes...
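
For what it's worth, the arithmetic on those reported times works out like this (a quick sketch; the times are taken from the post above):

[code]
# Speedup implied by the transcode times reported above: a 1h45min
# MPEG-2 video, CPU-only in iTunes vs. GPU-assisted on the HD 4870.
video_minutes = 105   # 1h45min source
cpu_minutes   = 34
gpu_minutes   = 8

print(f"GPU speedup: {cpu_minutes / gpu_minutes:.2f}x")           # 4.25x
print(f"CPU rate: {video_minutes / cpu_minutes:.1f}x real time")  # ~3.1x
print(f"GPU rate: {video_minutes / gpu_minutes:.1f}x real time")  # ~13.1x
[/code]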
 

lashton

Distinguished
Mar 5, 2006
Intel got caught talking BS and got ripped a new a$$hole by Nvidia. Intel is very worried because I think they know their video card effort is heading in the wrong direction; Larrabee is more like a CPU on a card than a GPU.
 
[citation][nom]antic84[/nom]Didn't Nvidia itself market sli as douple the performance when real life boost was far from it...It's all about marketing hype and very unique scenarios. Although true most games are gpu bound, it's most likely Nvidia is just trashing core i7 because Radeons work better with it and they don't like it.[/citation]

Even if nVidia marketed SLI as double the performance and it didn't deliver double, they're not alone. Radeons don't offer double performance either when they're CrossFired.

Same with dual-core vs. single-core: no doubled performance there either.
 

SchizoFrog

Distinguished
Apr 9, 2009
I would like to have seen what benefit the GTX 260 brought on the E8400. I wonder if the leap forward would have been as dramatic with a lesser CPU.
 

ondigo

Distinguished
Apr 8, 2009
[citation][nom]sot010174[/nom]Oh yeah... so I should rebuild my pentium 2 machine and install a Geforce GTX 295 2 million gigs of ram, so it'll load vista in 3 nanoseconds.Cmon Nvidia, just because you can't even yet build a x86 processor you go banging about how a fast CPU is uninmportant. This maybe is a hint about their upcoming CPU performance. Pathetic.[/citation]

Please!!! Read the article carefully before posting. They're talking about comparing game performance, not OS load times. Don't waste our time with stupid posts!!!
 

SpadeM

Distinguished
Apr 13, 2009
Frames per second in a 3D game scale strongly with a faster GPU, but swapping the CPU buys you linear gains at best. It's not called a Graphics Processing Unit for nothing, and since games are raster-based, it's only natural for them to scale much better with an improved GPU than with an improved CPU. Sure, buy an i7 if you do something besides gaming (anything multitasking-related); if not, there's no reason to upgrade expensive parts (CPU + motherboard + RAM) just to get 2 or 3 more fps. The "eye candy" and the fps all gamers talk about aren't produced by a faster CPU. And even if games do become multithreaded... that won't change the number of fps you get, since it's not the CPU doing the heavy lifting in today's raster rendering.
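
That scaling argument can be illustrated with a toy bottleneck model, like so (plain Python; every number is invented purely for illustration):

[code]
# In a GPU-bound game, each GPU step pays off directly until the CPU
# becomes the ceiling, while CPU upgrades barely move the needle.
# fps = 1000 / max(cpu_ms, gpu_ms); all numbers are illustrative.
CPU_MS = 12.0                              # fixed per-frame CPU cost

print("GPU sweep (fixed CPU):")
for gpu_ms in (40.0, 25.0, 15.0, 10.0, 6.0):
    print(f"  gpu={gpu_ms:>4} ms -> {1000 / max(CPU_MS, gpu_ms):5.1f} fps")
# fps climbs steadily, then flatlines near 83 fps once the 12 ms CPU
# cost becomes the bottleneck.

GPU_MS = 25.0                              # fixed per-frame GPU cost
print("CPU sweep (fixed GPU):")
for cpu_ms in (20.0, 12.0, 8.0, 5.0):
    print(f"  cpu={cpu_ms:>4} ms -> {1000 / max(cpu_ms, GPU_MS):5.1f} fps")
# stuck at 40.0 fps the whole way: the GPU sets the pace.
[/code]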
 
Guest
Your HD 4870 X2 will always perform great unless you have something like a Pentium II at 333 MHz.
 

Platypus

Distinguished
Apr 22, 2009
I think a few of you may have skimmed this article a little too quickly. Nvidia isn't coming out of nowhere to strike at Intel; it is simply refuting something Intel never should have said. Turning an 80% boost in 3DMark into a statement like "80% boost in gaming performance" is outright false advertising. I'm thinking he probably misspoke and didn't really mean to mislead.

However, the fact that Intel called it a boost in gaming performance makes the Nvidia scenario above (you know, the one where he tests performance in actual games) valid.
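
To see how both statements can be simultaneously true, here's a small sketch (invented numbers): a composite score that includes a CPU sub-test can jump 80% while a GPU-bound game doesn't move at all.

[code]
# How an "80% boost" in a synthetic composite score can coexist with
# flat in-game fps. A composite that folds in a CPU sub-test rewards
# the faster CPU directly; a GPU-bound game ignores it. All numbers
# are invented for illustration.

def game_fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

def composite_score(cpu_pts, gpu_pts, cpu_weight=0.5):
    return cpu_weight * cpu_pts + (1 - cpu_weight) * gpu_pts

old_cpu_pts, new_cpu_pts = 100, 260   # new CPU ~2.6x in the CPU sub-test
gpu_pts = 100                         # same GPU either way

gain = composite_score(new_cpu_pts, gpu_pts) / composite_score(old_cpu_pts, gpu_pts) - 1
print(f"synthetic score: +{gain:.0%}")            # +80%

gain = game_fps(8.0, 25.0) / game_fps(14.0, 25.0) - 1
print(f"GPU-bound game:  +{gain:.0%}")            # +0%
[/code]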

 