Nvidia Says Core i7 Isn't Worth It

Page 3
Status
Not open for further replies.

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
[citation][nom]toalan[/nom]Nvidia is making video cards that bottleneck systems, they turn it around and talk sh*t about the CPU when their 8800/9800/250 is to blame. A real test would be to put a top of the line card in dual/tri/quad SLI and then do the benchmark. Nvidia talks too much sh*t.[/citation]

I can attest to two GTX 285s not being bottlenecked by a 3.16GHz Core 2 Duo, and that's a pretty beefy system IMO. I never bothered with tri/quad SLI, but I'd imagine that with a Core 2 Duo setup you would still see a bigger FPS increase from going tri/quad SLI than you would from moving from a Core 2 Duo to a Core i7. I'm pretty positive of that.
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
[citation][nom]resonance451[/nom]Wow, they countered skewed information with skewed information. Congrats, Nvidia.[/citation]

You think the CPU really makes a huge difference in gaming performance? If you do, then you don't know much about gaming rigs.

A computer with a $100 CPU and a $300 GPU will almost always run games better than one with a $300 CPU and a $100 GPU.

The info that Nvidia put out was not "skewed", nor is it any sort of surprise to people who know how to build gaming rigs. You almost always get a much better FPS increase from a GPU upgrade than from a CPU upgrade.

Now, the great thing about PCs is that they are NOT just for gaming, and there are tons of things a PC can do that show huge improvements from CPU upgrades. But Intel was not talking about general CPU performance; they clearly said "gaming performance goes up by 80 percent when you use a Core i7 chip." That is flat-out false advertising.

Stop being such a damn fanboy.
 

Master Exon

Distinguished
Jul 20, 2008
292
0
18,780
I had an Athlon 1600+ and a GeForce 5600 128MB. They were great. I upgraded to an Athlon 64 3500+ and a 7600GT 256MB. My performance skyrocketed. Then I bought an Athlon X2 4400+ and noticed quite an improvement in my games' framerates. Later I bought a Radeon HD 3850, and my framerates went crazy again. Then I bought a 45nm 2.5GHz Core 2 Duo and a GeForce 260. The only thing I noticed was that Windows and WinRAR run faster.
 
I guess you could say it's another case in favor of the value of the Phenom II :D. Anyway, it does depend on the game and settings. After all, Nvidia's own GPUs have shown significant improvement on an i7 platform versus a Core 2 Duo/Quad platform when multiple GPUs are involved. This is really just a case of both companies trying to justify users spending more money on their products.
 

lire210

Distinguished
Mar 23, 2009
39
0
18,530
As much as it pains me to say it, I agree with the Nvidia camp on this one, but I don't think I'll buy either of these guys' products; those companies are shifty. I plan to upgrade my CPU to a Phenom II because my Phenom I is like a Pentium, and Pentiums suck.
 

MarkG

Distinguished
Oct 13, 2004
841
0
19,010
So basically, what Nvidia is saying is that the i7 CPU _is so super fast that Nvidia GPUs can't keep up with it_.

Seems a bit of a dumb-ass thing for a GPU company to say publicly, but then ATI seems to have been making all the smart decisions in the GPU world lately.
 

zendax

Distinguished
Apr 15, 2009
87
0
18,640
Counterpoint: a Core i7 920 and a decent X58 motherboard are not THAT expensive. If you're spending $1,200 or more on a gaming rig, it makes sense in that you can throw in a new GPU a couple of years down the line and still be on a processor and platform with more than enough power.

LGA 775 is going the way of the dodo. If you were to build a new computer on that platform now, upgrade options would be limited. Even if my shiny new i7 ends up leaving me wanting in a couple of years, I'm on a platform that's going to stick around for a while.
 

rambo117

Distinguished
Jun 25, 2008
1,157
0
19,290
[citation][nom]1971Rhino[/nom]I had no idea I had a "BMW" in my case.......awesome![/citation]
Too bad I can't just put my 323i in my case...
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
I fully agree with Nvidia on this, but not everyone is using their PC for games (I could be wrong, no sarcasm here). You benefit more from the i7 for graphics work, music encoding, and compression tools, among other applications. Benchmarks on this very website show that all you need for games right now is an E8400 with a decent video card. If you want the computer for gaming, save a load of cash and get a Core 2 Duo. I wonder if part of this is to try to sell more 790i chipsets, but whatever. I bought one of those, though mainly to get away from Asus, a horrible company that now likes to send defective RMA motherboards back to my business. I'll never buy Asus again for home use.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
[citation][nom]antic84[/nom]Didn't Nvidia itself market sli as douple the performance when real life boost was far from it...It's all about marketing hype and very unique scenarios. Although true most games are gpu bound, it's most likely Nvidia is just trashing core i7 because Radeons work better with it and they don't like it.[/citation]

I've seen benchmarks showing double the frames per second from using two GeForce 9800 GTX+ cards in SLI in games like Crysis. I'm not sure where you're getting that from; maybe some cards don't scale as well, I don't know. I may go SLI some day, but hell, a single Nvidia card with a Core 2 Duo is plenty for games these days.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
You don't need the most expensive GPU. In fact, I've heard a lot of people complaining about the 285s and such. When you get roughly the same performance out of a 250 or 260 without all those problems, why spend $500+ on a video card that will cause you grief?
 

norbs

Distinguished
Feb 23, 2009
229
0
18,680
Wait, which Hummer are we talking about? The full-on army one, or the one you buy because you're a douchebag? Or are you talking about the kind your girlfriend gives? Now I'm really confused...
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
norbs, you can only buy an H1 used now, as GM no longer manufactures it for sale to the public. That leaves only the H2 and H3 to buy new at a Hummer dealer.
 

thenetavenger

Distinguished
Jun 1, 2006
10
0
18,510
OK, this is a GPU/CPU fan war, and it really isn't that simple; Nvidia is both really right and really wrong.

GPU people: PCs are not consoles. People run other applications at the same time, and with high-end MMOs they often run multiple copies of the game at once. That core multitasking is where the CPU IS IMPORTANT.

Up until now, games have been designed to push one or two threads through the CPU for basic AI and processing, while the rest, thanks to the brilliance of OpenGL and DirectX, gets shoved through a multi-pipe GPU with amazing processing power.

The trouble with Nvidia's statement is that it leaves no room for the future, where CPUs, and multi-core CPUs especially, can start taking load off the GPU for some things, or give the game more headroom when the user is running Outlook, burning a DVD, and listening to music at the same time.

The next generation of games is moving toward heavier CPU reliance, and with that you will get better AI and the other things that tend to get left behind so that people can show how many pixels they can throw on the screen at once and how "pretty" it is.

Also, GPU people: there is not as massive a leap in video card performance as Nvidia wants you to believe.

Go back to the Nvidia GeForce 7900 GTX; the card is five years old. Yet on many tests it will outperform an 8600 or a GeForce 9600, or even an 8800 on older games.

Nvidia would be more correct if they were talking about higher-end cards in their examples; the GeForce 250 is not a high-end card compared to the 260 or 280. (And again, on some tests of games from the 3DMark 2001/2003 timeframe, a GeForce 7900 GTX will outperform it at higher resolutions.)

If you really want to see how slow GPU advances have been over the last five years, look at the notebook market. Go to a site like notebookcheck.com and look at the top-performing mobile GPUs: the 7950M is faster than most of the 8xxx- and 9xxx-series cards in every test, and only the "exclusive" high-end 9800M or 8800M cards can compete with a four-year-old video card. (Sadly, the GeForce 7950M is also faster than most of the desktop GPUs Nvidia is still producing, including a desktop 8600.)

----

Now for the CPU people...

If someone wants a quick bang for their buck, a $600 investment in an i7 is NOT GOING TO BE WORTH IT, at least not for a few years, when multi-core code in games is better and CPUs carry more of the burden.

If you have a P4 3.4GHz with H/T, moving to an upper-range newer GPU is literally the better investment. Spending $150 on a GeForce 260 would be five times the improvement over the GeForce 6600 or 6800 your P4 probably came with.

This also buys you time until the i7s come down in price; then you can leap to one when you can afford it and get the best of both worlds.

CPUs are a weak spot, but game designers don't design around CPUs; they tend to build their games with one or two threads (which H/T can usually handle), and until they put better-threaded code in games, the i7 is not a great choice.

However, moving to a Core 2 Duo is not a great choice either. If you look at single- or even dual-thread performance comparisons, a P4 3.4GHz is only about 20% slower than the upper-end Core 2 Duos, and that 20% increase costs a lot of money you could instead put into a good GPU (and a PSU to power it) while you wait for i7s to drop in price or for a deal on a Core 2 Duo.
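To put a rough number on that one-to-two-thread point, here's a back-of-envelope Amdahl's-law sketch. The 50%-parallel figure is my own illustrative assumption for a game that only uses about two threads, not a measured value:

```python
# Amdahl's law: if only a fraction p of the work can run in parallel,
# the ideal speedup on n cores is 1 / ((1 - p) + p / n).

def amdahl_speedup(p, n):
    """Ideal speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# A game with ~2 usable threads (assume 50% parallel work) on a quad core:
print(round(amdahl_speedup(0.5, 4), 2))   # 1.6 -- nowhere near 4x
# Even 8 hyperthreads barely move the needle:
print(round(amdahl_speedup(0.5, 8), 2))   # 1.78
```

Which is the whole point: until games raise that parallel fraction, extra i7 cores mostly sit idle.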

-----------

Nvidia is right that the GPU is the better bang for the buck for single-application gaming right now.

Nvidia is wrong that this statement has long-standing truth or is universal to all users, especially uber-geeks who are already running several copies of LotRO on screen at once, or playing a game while encoding a DVD in the background. The GPU is only going to help a bit with the multi-game example, and again this depends more on OS architecture: under Vista or Win7 this is easy to do with GPU RAM virtualization, but under XP, OS X, or Linux, running multiple games with a compositor fronting the actual drawing just doesn't work well. And Vista needs some CPU power and threading to handle this, in addition to its AGP/PCI GPU memory tricks.


So for now, yes, OK, Nvidia, we get you: upgrade the video card and wait for CPUs to drop in price. That is, unless the GPU requires a new mainboard and a new PSU to make it work.

However, the future is just around the corner, and for gaming that needs and uses more advanced AI and CPU processing, the kind that makes the "pretty" things on screen become alive and real, a CPU is pretty damn important. (And no, GPU PhysX processing is not a complete answer either: right now many companies aren't even using DirectX 10 or other standards, and with Nvidia locking PhysX support to their cards, games are not going to spend a lot of time or money making it as grand as it could be.)

- Nvidia would be smarter to work inside the DirectX 10 or DirectX 11 models that support non-visual and physics calculations than to keep running with their own, then shove it into the DirectX 10 standard after gamers continue to give them the finger.


(PS: Tom's, do you think you could fix your site so that IE8 works properly? If I fake the IE8 user-agent header so you think it's Firefox, it works just fine. Just saying...)
 
G

Guest

Guest
The title comes across a bit strong, saying "Core i7 isn't worth it".
I didn't see anyone quoted actually saying that; it's kind of a press twist on Tom Petersen's version.
I'm sure he would prefer a Core i7 over a dual/quad-core. It's just that you can reap more benefit from a better graphics card.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
Pentium 4s with Hyper-Threading suck for gaming; I know, I own two. 3.4GHz, did they even make one like that? The highest I've built is 3.2GHz. A Core 2 Duo is above and beyond fast compared to a Pentium 4 HT in everything.
 

SuperCruise

Distinguished
Apr 12, 2009
4
0
18,510
There are many reasons why GPUs provide far better bang for the buck than the Core i7. One overlooked reason is that not much general-purpose software has been parallelized, so you're not really taking advantage of four cores or eight hyperthreads in the Core i7; you're just running one core slightly faster, and paying absurd prices for the privilege when you go from a 920 to an Extreme. Yeah, right, four times the price (in round numbers, from $250 to $1,000) for what, 25% more performance? Spend only two-thirds of that range, a mere $500, on a GTX 295 and you get more than a teraflop of computing power.
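Just to spell out that math (a quick sketch; the 25% gain and round-number prices are the rough estimates above, not measured benchmarks):

```python
# Back-of-envelope price/performance check using the figures quoted above.
i7_920_price, i7_extreme_price = 250.0, 1000.0
extreme_gain = 1.25            # assume the Extreme is ~25% faster than the 920

price_ratio = i7_extreme_price / i7_920_price
perf_per_dollar_ratio = extreme_gain / price_ratio

print(price_ratio)             # 4.0 -- four times the price...
print(perf_per_dollar_ratio)   # 0.3125 -- ...for under a third the perf per dollar
```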

More importantly, games that utilize Nvidia GPUs can take advantage of the 480 cores in that GTX 295; they get the benefits of parallelization. To an increasing degree, general-purpose programs like Photoshop, Mathematica, and many others are utilizing Nvidia's CUDA to run *in the GPU* at supercomputer speeds. There are hundreds of programs now that use CUDA and can execute in the GPU. If you've ever used one of them (I have), you instantly realize that even a really inexpensive $150 Nvidia card makes the fastest Core i7 look like it is standing still. If you are coding one of those applications, you also know that the Core i7 is so incredibly slow in comparison that it mostly just gets in the way. G200-series GPUs with 240 stream processors are *so* fast that for serious, high-performance computing they have relegated even the fastest Core i7 to little more than a keyboard and disk controller.

And yeah, sure, considering that the key computation for gaming goes on in the GPU, you bet an AMD Phenom II X4 is a much smarter buy than a Core i7. Instead of wasting your money on a Core i7, get the newest Phenom II X4 and spend the difference on dual GTX 295s running quad SLI.
 

SuperCruise

Distinguished
Apr 12, 2009
4
0
18,510
thenetavenger's comment that "people run other applications at the same time" has a point, but a limited one, in that it almost makes a virtue out of the failure of most applications to run parallelized. Let's see... are we really going to spend $1,000 on a Core i7 Extreme so some dumb background activity gets the same funding as our gaming? Don't think so. Whatever background stuff you have going will do just fine with whatever timeslice Windows chooses to give it out of a Core 2 Duo or, better still, an AMD Phenom II.

We all understand you're going to have more than one thing going on at once. But the folks in this forum don't normally give a hoot about whatever non-interactive thing is going on in the background. They want their game, their graphic arts software, whatever is happening in the foreground, to run like a cosmic skyrocket. That's what speed is about. The folks in this forum don't spend $1,000 on a Core i7 Extreme and then overclock because they want lots of background processes to run a hair faster. They do it because they want their gaming action to be faster than anyone else's. The best way to do that is exactly what Nvidia said: put the money into the GPU, where it has cosmic effect, not into overspending more and more for some one-fourth share of a totally dumb and slow CPU.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
I think that if you haven't upgraded from your HT CPU yet, going to an i7 920 is a safe bet; prices have come down on some motherboards. However, I spent very little money upgrading to a Core 2 Duo E8400, and this system runs a lot better than I expected. Probably partly because I didn't opt for a crummy Asus motherboard this time.
 

SuperCruise

Distinguished
Apr 12, 2009
4
0
18,510
One last thing... although raw computation speed in floating-point operations per second (FLOPS) is not everything, raw FLOPS power has a way of driving all else before it in applications like gaming that do a lot of computation. A $1,000 Core i7 Extreme is about 70 gigaFLOPS. A $500 GTX 295 is over 1,000 gigaFLOPS. Big difference in bang for the buck.
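Spelling out those numbers (note these are vendor peak figures quoted above, so treat them as rough marketing numbers, not sustained throughput):

```python
# FLOPS-per-dollar from the figures quoted above.
cpu_gflops, cpu_price = 70.0, 1000.0      # Core i7 Extreme (peak)
gpu_gflops, gpu_price = 1000.0, 500.0     # GTX 295 (peak)

cpu_value = cpu_gflops / cpu_price        # 0.07 GFLOPS per dollar
gpu_value = gpu_gflops / gpu_price        # 2.0 GFLOPS per dollar

print(round(gpu_value / cpu_value, 1))    # roughly 28.6x the raw compute per dollar
```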

 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
[citation][nom]burnley14[/nom]What a shock: a GPU manufacturing company tells you to spend more on GPUs and less on CPUs. That sounds completely unbiased . . .[/citation]
True, but it isn't anything the educated gamer doesn't already know. I dislike Nvidia for many of the stunts they've pulled over the past couple of years, but they've got this one right!
 

Fail Complex

Distinguished
Apr 17, 2009
12
0
18,510
What about when OpenCL comes out? Won't multiple cores matter then? Especially when DirectX 11 comes out, games can gain some benefit even if they were made for DirectX 9. What about closer CPU-GPU cooperation, GPGPU? Isn't the "DX11 hardware only" part mostly a feature called "compute shaders"?

What about the "Infernal Engine", which can benefit from fully utilizing the CPU?

Source:

Infernal Engine: The video game physics engine & gameplay Testing Video Stress Test, 2500 Box Drop:
http://www.youtube.com/watch?v=H9boF-JZKcU

OPENCL with multi-cores:
http://www.youtube.com/watch?v=mcU89Td53Gg
 