Nvidia CEO Shares Company's CPU Strategy


porksmuggler

Great timing for a "strategy" announcement. Marcus linked the article, but didn't mention why Huang was being interviewed. From reading Yam's posting, you would think all Nvidia's woes were Intel's doing. Now why oh why would Nvidia be pushed out of the chipset business...

http://www.fudzilla.com/graphics/graphics/graphics/nvidia-posts-141-million-loss

I wouldn't want my CPU platform marred by another company's quality issues either.
 

ta152h

[citation][nom]demonhorde665[/nom]actualy fi you knew what you were talking about you would not have sounded so stupid here. GPU's are acutaly far more complex than general cpu's . it's afact that the entire comptuer industry knows , the more specilized a chip is the more complex it gets . this is especially true with gpu's since thier die size has to be broken up into a larger number of task[/citation]

I guess you know absolutely nothing about processor design, and, obviously, spelling.

It's one thing to be ignorant, and another to be insulting and ignorant at the same time.

GPUs are MUCH simpler than CPUs, especially x86 CPUs. GPUs are good at very simple, parallel workloads, and the design tends to be the same simple units repeated over and over again.

GPUs don't have to worry about translating x86 instructions into easily executable internal operations. They don't have to worry about branch prediction, out-of-order execution (and particularly memory disambiguation), or any of the incredible complexity of instruction-level parallelism.
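To make that concrete, here's a minimal sketch (my own toy example, nothing official; the kernel name and sizes are made up) of the kind of program a GPU is built to run. Every lane executes the same trivial arithmetic on its own element, which is exactly why each core can skip the decoder, branch predictor, and out-of-order machinery a CPU needs:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    // One thread per element; the only "control flow" is a bounds check.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                  // 1M elements
    const size_t bytes = n * sizeof(float);

    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 4096 blocks x 256 threads: the same tiny program, replicated per element.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);           // 3*1 + 2 = 5
    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

A CPU spends most of its transistors making one unpredictable instruction stream fast; a GPU spends them replicating a simple lane like this thousands of times.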

Even a normal RISC CPU is much more complicated than a GPU, although Niagara probably isn't that much more complex.

Please, get some knowledge before you post your nonsense. Someone might actually think you know what you're talking about, and repeat it.
 
Guest
"so x86 chips ran too cool for them?"

...ever heard of a Pentium D?
 

victorintelr

[citation][nom]superblahman123[/nom]If I see any processor manufactured at 5.0 GHz within the next 10 years, I will eat my shoe. It just won't happen as long as everyone thinks that more cores are what will speed up their computers.[/citation]

Actually, Intel and AMD pretty much stopped their "fastest processor" competition when they got just above 3 GHz and were able to put two or more cores on a die; after that they focused more on the efficiency of the processor than on raw clock speed.
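A rough back-of-the-envelope sketch of why (my own aside, assuming dynamic switching power dominates): CMOS power scales roughly as

P \approx C\,V^{2}f,

and higher clocks generally demand higher voltage, so power climbs much faster than frequency does. Adding a second core at the same clock roughly doubles peak throughput for roughly double the power, whereas buying the same headroom through frequency alone costs

\frac{P_\text{fast}}{P_\text{slow}} \approx \left(\frac{V_\text{fast}}{V_\text{slow}}\right)^{2}\frac{f_\text{fast}}{f_\text{slow}} \gg \frac{f_\text{fast}}{f_\text{slow}}.

That power wall is what made two cores at ~3 GHz a better deal than one core at 5 GHz.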
 
Guest
Also, I hate that Intel was forcing us to buy their bad, useless graphics chip along with an expensive chipset.
 

somata

[citation][nom]superblahman123[/nom]If I see any processor manufactured at 5.0 GHz within the next 10 years, I will eat my shoe. It just won't happen as long as everyone thinks that more cores are what will speed up their computers.[/citation]
Actually, IBM had POWER6 CPUs shipping at 5 GHz a couple of years ago. It may have been a relatively simple 2-way in-order design, but it's still an impressive feat, especially at 65 nm.
 

ohim

[citation][nom]ta152h[/nom]I guess you know absolutely nothing about processor design, and, obviously, spelling. It's one thing to be ignorant, and another to be insulting and ignorant at the same time. GPUs are MUCH simpler than CPUs, especially x86 CPUs. GPUs are good at very simple, parallel workloads, and the design tends to be the same simple units repeated over and over again. GPUs don't have to worry about translating x86 instructions into easily executable internal operations. They don't have to worry about branch prediction, out-of-order execution (and particularly memory disambiguation), or any of the incredible complexity of instruction-level parallelism. Even a normal RISC CPU is much more complicated than a GPU, although Niagara probably isn't that much more complex. Please, get some knowledge before you post your nonsense. Someone might actually think you know what you're talking about, and repeat it.[/citation]By your reasoning GPUs are so simple to make that it makes no sense that Intel failed with Larrabee, not to mention that Intel sucks at graphics in general... so it makes no sense that they can produce the most advanced, complex CPUs yet are somehow unable to compete against ATI/Nvidia with a much simpler GPU design. Get real.
 

mlopinto2k1

Man, I wish I knew the WHOLE picture about who is screwing whom with these licenses and trademarks and whatnot... it's BS. Total BS. If you knew Intel was slicing the heads off of some creature so that they could make their chips, would you still buy them? Ehhh..
 

kansur0

I think the whole Larrabee thing was an example of a company whose greed and ego got in the way of what they thought they could do. They probably even knew they were failing at the time, but were too proud to admit it. They also wanted to use scare tactics, showing staged demos of an unfinished product to see if they could generate any interest, and maybe even shake up the stock market a bit to make Nvidia's investors worry about the competition and jump ship. Luckily, enough questions were asked to lift the lid on that can of sh!t.

I am really glad Intel failed with Larrabee. The way they do business, it would have only been a matter of time before they used that war chest to send briefcases full of money to system makers: here's a little incentive to put our cards in your systems; forget about those other guys; don't tell anyone where you got this money; just tell everyone that our new product is the best you have ever seen. Sell the turd... we will cover your losses.

Yep. Good old-fashioned capitalism. Don't you just love it?
 
Guest
Intel failed with Larrabee because they tried to make a GPU out of a mess of old Pentium-class cores crammed onto one die. They took something relatively simple and made it complex, and it made sense financially because they already make CPUs. But the hardest part of that project was the software, which was met with delay after delay.
 

mikem_90

[citation][nom]meat81[/nom]i know i am beating a dead horse but i would have loved to see Nvidia's X58 chipset offering.... F-ing Intel[/citation]

I'm not sure Nvidia ever got the whole book on how to work with Intel's CPUs. I recall Microsoft getting that sort of access for Windows, and I wonder how hard it would be to do the same for a chipset. Part of me wonders if there were some "but first, before passing stack byte Y, do X" instructions that got left out of the documentation.

Is designing a chipset to work with a CPU as crazily complex as it's made out to be? It does kind of sound like it. VIA made some in the past and was often very hit-and-miss with theirs. SiS did as well, but was generally sub-par compared with everyone else. Even Intel had its own chipset problems around the time of the Pentium II/III.

I'm just curious whether Intel laid some booby traps in the CPU design or documentation as a roadblock for Nvidia or others. Considering their past and the tricks they've pulled, I would not be surprised.
 
Guest
Nvidia is smart; they're going after the mobile electronics market. Consumers are eating up every device that plays their media, and having enough processing and GPU power is going to become increasingly important with streaming HD and 3D content. Depending on how AMD does with Fusion, I think Nvidia is going to corner the market.
 

rohitbaran

The end of nVidia's chipset business was a really bad thing for the company and consumers alike. While it is good that they are attempting new things like x86 CPUs, they will most likely be pulled into legal battles with Intel over patents.
 