Nvidia Says More CPU Cores are Better (and Why)

Page 3
Status
Not open for further replies.
[citation][nom]rpgplayer[/nom]Hate to say it, but Windows XP does in fact use more than one core. Windows XP is based on Windows 2000, which in turn is based on Windows NT, all of which support multiple processors. Now, whether they handle them efficiently is another matter. None of them are as efficient at handling threads as Windows 7 is, but they still handle threads on multiple cores better than on a single core alone. A simple test: open up 2 different programs and open Task Manager; you'll see both cores are active. The main downside in the days of Windows XP, 2000, or NT was that back then the market penetration of multithreaded programs was around 5%, meaning that in most cases running a single program was just as fast with one core as with 2 or 4. Even today, less than half of the programs released are multithreaded, but multi-core CPUs are the norm, so it's not as if you are paying a huge premium for a 2-core processor over a single core.[/citation]

I'm talking about the OS itself, not other programs running in it like games and applications that are coded for multi-threading.
Its use of a second core is minimal at best.
Windows 7 is a huge improvement over XP in its use of multiple cores.
 
@Jerseyfirefighter

Part of being a tech is experimenting with different things. The pendulum sways both ways between AMD and Intel. How early was your tech career? A decade ago? A lot has changed since then.
 
[citation][nom]allenpan[/nom]I don't think the "power" consumption number is wrong, assuming Power = Voltage * Current (P=VI, or P=VI*cos(φ) for AC), or P=V^2/R, or P=I^2*R[/citation]Ohm's law does not apply to transistors, since their primary function is breaking Ohm's law :)
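For what it's worth, the three DC power formulas in the quote only agree once written correctly (P = V²/R and P = I²R), and only for an ohmic device where V = IR actually holds, which is the reply's point about transistors. A quick numeric check with made-up values:

```python
# Check that the three DC power formulas agree for an ohmic (linear)
# resistor; the values here are arbitrary illustration, not real hardware.
V = 12.0           # volts
R = 4.0            # ohms
I = V / R          # Ohm's law: I = V/R -> 3.0 A

p_vi = V * I       # P = V * I
p_v2r = V**2 / R   # P = V^2 / R
p_i2r = I**2 * R   # P = I^2 * R

print(p_vi, p_v2r, p_i2r)  # all three print 36.0 (watts)
```

For a transistor there is no single R relating V and I, so the last two forms simply don't apply.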
 
I find it hilarious how some posters try to attack specifics and assume Nvidia is blatantly wrong. Nvidia is simply maximizing efficiency by using hardware in its most efficient range.
 
Isn't it amazing that Nvidia is just barfing out what every tech website said about dual cores years ago? I'd love to see how a dual-core smartphone behaves at idle compared to a single-core one, since they're dormant most of the time. Even though some people have gone off the deep end about how this will consume more power, we've all seen it in power consumption charts: even a quad core can consume less power than a dual or even a single core, simply because only one or two cores need to be active while the rest sit in a suspended state. And when a workload does come, the cores needed only have to blip up to max power state for a moment, instead of a single or dual core having to hold that high power state two to four times longer. A fast blip of power is much easier on a battery than a constant load.
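The "fast blip vs. constant load" argument is just energy arithmetic (energy = power × time). A back-of-the-envelope sketch with entirely made-up numbers, assuming the single core must hold a high power state four times longer than the quad's blip:

```python
# Hypothetical "race to idle" comparison; all wattages and durations
# are illustrative assumptions, not measurements of any real chip.
IDLE_W = 0.05  # assumed per-core power in a suspended state

# Single core: must grind at a high 2.0 W power state for 4 s
single_energy = 2.0 * 4.0                          # 8.0 J

# Quad core: four cores blip to 1.0 W each for 1 s, then sleep for 3 s
quad_energy = 4 * 1.0 * 1.0 + 4 * IDLE_W * 3.0     # 4.0 J + 0.6 J = 4.6 J

print(single_energy, quad_energy)  # 8.0 vs 4.6 joules
```

With these (assumed) numbers the quad finishes the same work on roughly half the energy, which is the effect the power charts show.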


[citation][nom]bejabbers[/nom]The Intel chips, most of them anyway, have Hyper-Threading. This means, in the case of the i5, you have 4 physical cores and 4 virtual cores. So at that price point AMD offers you 6 cores while Intel offers you 8.[/citation]

Yes, AMD offers you 6 real cores while Intel offers you 8 imaginary ones. Also, the scaling from Hyper-Threading outside of benchmarks is laughable, while in benchmarks it's tangible, accounting for as much as 30% of the performance versus HT disabled.

Having been given an i3 540 rig, I tried using it as an ad hoc server for some friends of mine at a colocation for gaming. With the Hyper-Threading, I figured we might be able to run 3 separate servers (a 32-slot TF2 pub server, an L4D server set up for up to 16-player VS if people wanted, and a fast-DL host so people could get maps etc. in a timely manner). Wrong! Any attempt at more than 2 caused instability, even though the server wasn't at 50% load total. Lesson learned, we bought a quad core from AMD, and without all the fake-core BS we can run 12 total servers on it and it's still stable!

Hyper-Threading - Intel's GODMODE for benchmarks.
 
[citation][nom]thearm[/nom]I had to read your post three times to understand what you were trying to say. Got it now! I'll still take Intel and Nvidia over AMD any day. I think they simply make better products. I used both at the beginning of my tech career; now I only use Intel and Nvidia. I'm not opposed to an ATI video card, but I've had good luck with Nvidia so I'm sticking with them. I'm brand loyal until I have a good reason not to be.[/citation]

My post was an answer to this.
 
[citation][nom]techguy911[/nom]I'm talking about the OS itself, not other programs running in it like games and applications that are coded for multi-threading. Its use of a second core is minimal at best. Windows 7 is a huge improvement over XP in its use of multiple cores.[/citation]

I disagree. Windows XP will run multiple cores just fine and distributes work between cores, even for single-threaded applications. The difference is that Windows 7 is more effective at distributing work among the cores, but your statement "its use of a second core is minimal at best" is incorrect.

I know this because I run World Community Grid on my laptop (you can look it up on Google; basically, it's a grid computing application, much like SETI or Folding@home). A few years ago, when I had a single-core CPU in an old desktop, it would run one task at a time. Nowadays, with my dual-core CPU, it runs two tasks simultaneously, on Windows XP. I installed Windows 7 on the same laptop and ran WCG again, and the performance was identical to Windows XP.

Of course, this is the ideal multi-threaded program, because each task occupies one thread and all tasks run in parallel, completely independently of each other. For normal programs like games that won't be the case, you won't see the same performance benefit, and the operating system will matter more because it has to distribute the work. But my results show that XP does in fact support multiple cores very well.
 
[citation][nom]alyoshka[/nom]So Nvidia finally sees the light???? If that's the case they really need to beat the world's fastest dual-processor card by making a single dual-core GPU.......[/citation]
Doesn't a single GPU already have hundreds of cores? Maybe you mean putting 2 separate GPU chips in the same package? Not sure what you mean there.
 
Show me a dual-core desktop x86 CPU, or even any x86 SBC (single-board computer), that can run a full x264 encode workload and use at most 5 watts of total power!

The dual-core (and quad-core) ARM Cortex-A9 SBCs, with their 128-bit NEON SIMD, can do that running x264 encoding today.
 
[citation][nom]TechU@invcouk[/nom]Show me a dual-core desktop x86 CPU, or even any x86 SBC (single-board computer), that can run a full x264 encode workload and use at most 5 watts of total power! The dual-core (and quad-core) ARM Cortex-A9 SBCs, with their 128-bit NEON SIMD, can do that running x264 encoding today.[/citation]

There's a lot of difference between 3 GHz and 1 GHz in power draw. I'm not arguing architectures, just simple math: the more gates you switch, and the higher the frequency you switch them at, the higher your power draw must be.
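That "simple math" is usually summarized as the CMOS dynamic-power relation P ≈ α·C·V²·f (activity factor, switched capacitance, supply voltage, clock frequency). A sketch with illustrative values only, assuming the 3 GHz part also needs a higher supply voltage, as faster parts typically do:

```python
# Dynamic (switching) power of CMOS logic: P ~ alpha * C * V^2 * f.
# All numbers below are illustrative assumptions, not real chip specs.
def dynamic_power(alpha, c_farads, v_volts, f_hz):
    return alpha * c_farads * v_volts**2 * f_hz

slow = dynamic_power(0.2, 1e-9, 1.0, 1e9)  # 1 GHz part at an assumed 1.0 V
fast = dynamic_power(0.2, 1e-9, 1.3, 3e9)  # 3 GHz part at an assumed 1.3 V

print(fast / slow)  # ~5.07x the switching power, not just 3x
```

Frequency alone would give a 3x gap; the V² term is why the real gap between low-clocked ARM parts and high-clocked desktop x86 parts is larger still.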
 



lol, see my rig choice, I know about that. But given the broad nature of the NV announcement, it would seem their PR department made a boo-boo big enough that we can say NV endorses AMD's way of CPU development (i.e. more cores, worse IPC).

No true performance enthusiast will buy an AMD CPU at this time, unless of course a budget limits you to the crappy lower-end Intel platform.

And before you ask, I got the 5870 on launch day; now, if they had released the real Fermi (aka the 580) when they promised it in November of '09....

Oh well, when I buy again in a year or two the market landscape will have changed, but PR fscking up technical details is not going to.
 