Nvidia CEO Shares Company's CPU Strategy

Status
Not open for further replies.

proxy711

Distinguished
Jun 5, 2009
366
0
18,790
I'm saddened by the fact that technology is slowed down because a company just doesn't like another enough (or is too greedy) to sell them a license (at least at a fair price) giving them the right to produce something.

It's time to fix the copyright system.

Oh, and ta152h, ever hear of Larrabee? I've never designed a GPU or CPU, but if Intel, the king of CPUs, failed at making a GPU, then I would have to come to the conclusion that GPUs are harder to make than a CPU.
 
To be honest, most of Nvidia's chipsets sucked and still do, while Intel charges a king's ransom for its overpriced and limited X58. A 12GB RAM limit? Oh, please. And the i5's PCIe lanes are a joke; some of us would like to run tri-fire or tri-SLI.

Hotter than Hell Edition
 
G

Guest

Guest
Regarding Ohim's comment "Same about CPUs the entry cheap CPUs makes the profit where AMD is better than Intel": what's AMD's answer to Intel's Atom/Atom2 etc., power- and performance-wise? If anything, the net-top market has very little left for AMD's Geode (or similar). Intel has been the king there right from the start. I remember AMD was hesitant about going into the net-top market, possibly due to fear of cannibalization (which Intel must have worried about too, since their Atom chipsets are quite dumbed down).
 

decrypted

Distinguished
Apr 16, 2010
59
0
18,630
x86 tech is getting pretty long in the tooth. Could Nvidia bring ARM to the desktop? Meaning, as more and more work can be offloaded to the GPGPU, I would think the CPU would become less essential to the system.
 

awood28211

Distinguished
Aug 1, 2007
204
0
18,680
Four words that make me feel like this is going south... "Does anyone remember Cyrix?"

More competition, but I don't think they'll compete in the market without starting at the base. Changing tech to compete seems like a stretch.
 
G

Guest

Guest
@decrypted - That's what I think too. I think ARM will make some inroads into desktops and maybe even servers soon. In fact, servers may come sooner. A 6- or 8-core ARM processor at 1.5 GHz taking up minuscule amounts of power could be quite attractive in that market.
 

jkflipflop98

Distinguished
[citation][nom]superblahman123[/nom]That should be today, but they gotta stretch their profits as far as they can with as old of technology as they can. If I see any processor manufactured to 5.0Ghz within the next 10 years, I will eat my shoe, it just won't happen as long as everyone thinks that more cores is what will speed their computers.[/citation]

You better get ready to eat that shoe. It's not that far away. . .
 

drutort

Distinguished
Aug 30, 2007
162
0
18,690
I have to say this is a very smart move on Nvidia's part to go into the ARM area. The tablets are kind of useless, but the phone market is huge. I would love to see ATI and Nvidia make their way into the phone space, and something like DX10 that can accelerate the basic OS, say on Android, would rock (on a bigger scale than at present)! Along with awesome power-saving tech, this could look really good for the average consumer (us).
 

f-14

Distinguished
Intel throwing out a CPU with on-die graphics is a pretty great idea; however, Intel graphics are going to be eight-balled in comparison with ATI or Nvidia. The business market won't care unless they need the graphics for CAD or animation, and businesses have already made that investment in ATI.
My guess is Nvidia can license out their GPU chip to Intel and milk it like Rambus does, and/or look into a partnership with VIA for the chipset and PINE Technologies or Sun Microsystems for the CPU (I have yet to read a review of Sun's SPARC64 VII quad-core 2.4 GHz 5MB-cache CPUs, but at a price of $2,100 it might be a while).

So until Intel takes a three-generation leap in GPU technology from today's level, they are going to have to suffer handing over a big portion of the market share (to AMD/ATI) before they realize they need to do something like license Nvidia's chips to throw on-die. At least the DOJ made Intel stick with keeping PCIe on the chipsets, but I bet Intel planned to tank Nvidia and then pick them up on the cheap before putting their chips on-die. The only thing hurting Intel's on-die CPU/GPU chips in the i3/i5 segment is AMD's better power consumption and pricing, along with AMD's faster CPUs and faster low-end GPUs.
And as I have stated before, if AMD gets its act together and throws Radeons on-die with its CPUs, it will put serious hurt on Intel's entire i3/i5 series.
I don't know how feasible it would be to also get a universal chipset on-die for full integration, but whoever gets there first and best will probably rule them all. It would probably be an extremely large, Pentium Pro-sized chip, extremely difficult and very complex, but it can be done; it's just probably not cost-effective in today's market.
 

tethoma

Distinguished
Jul 20, 2009
60
0
18,630
Is this supposed to be good news? My favorite video card manufacturer, which I have been loyal to for over a decade, is starting to let me down.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
[citation][nom]politoni[/nom]Regarding Ohim's comment "Same about CPUs the entry cheap CPUs makes the profit where AMD is better than Intel", whats AMD's answer to Intel Atom/Atom2 etc. power and performance-wise? If anything, the net-top market has very little left for AMD's Geode (or similar). Intel has been the king there right from the start. I remember AMD was hesitant in going into the net-top market, possibly due to fear of cannibalization (which Intel must have worried about too since their Atom chipsets are quite dumbed down).[/citation]
When I say entry level, I don't mean Atom. And Atom doesn't outsell the casual computer user's CPUs; Atom is an ultramobile CPU, not an entry-level part for a consumer who has a desktop PC at home.
 
[citation][nom]meat81[/nom]i know i am beating a dead horse but i would have loved to see Nvidia's X58 chipset offering.... F-ing Intel[/citation]

I wouldn't have. nVidia's past few chipsets for Intel's Core 2 series were just plain bad compared to Intel's chipsets, and the only reason to get an nVidia chipset was if you wanted SLI. If not, it was better to go with Intel's.

But as for their CPU, I think it will be good to have Intel and nVidia pushing the mobile market now. It will really make VIA and Motorola push for better stuff.
 

dertechie

Distinguished
Jan 23, 2010
123
0
18,690
[citation][nom]sirdilznik[/nom]@decrypted - That's what I think too. I think ARm will make some inroads into desktops and maybe even servers soon. In fact servers may be sooner. A 6 or 8 core ARM processor at 1.5 GHz taking up miniscule amounts of power could be quite attractive in that market.[/citation]

Servers I can see; desktops would be harder. They have to deal with x86's biggest advantage: a shedload of legacy code that won't run without emulation, 90% of which will never be ported. Equivalents will of course emerge quickly, but until nice, comfortable Windows and Office get ported it may not make much headway. Thing is, if an ARM and an Atom system stand next to each other at the same price, where the Atom runs Windows and the ARM takes 3W less and runs Android modified for a desktop, the Atom is going to win with the general public, because there's no learning curve.

In servers, they'll just throw some descendant of Unix on it. So long as what it's processing doesn't require great speed it'll work.

However, all of this is contingent on ARM actually having better performance per watt than an equivalent Xeon/Opteron setup. If it takes half the power per server but twice as long per task, you haven't gained anything (same total energy to do the task), and you need twice as many servers (which means more support staff as well). It's the same problem Atom-based servers face: optimization for minimum power use is not the same as optimization for maximum power efficiency.
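That energy argument works out to simple arithmetic. A minimal sketch, with entirely hypothetical wattages and task times (not measured figures for any real ARM or Xeon part):

```python
# Energy per task = power draw x time to finish the task.
# All numbers below are made up purely to illustrate the argument.

def energy_per_task(power_watts, seconds_per_task):
    """Joules consumed by one server to finish one task."""
    return power_watts * seconds_per_task

# Hypothetical Xeon box: 100 W, finishes a task in 10 s.
xeon_joules = energy_per_task(100, 10)

# Hypothetical ARM box: half the power, but twice as slow per task.
arm_joules = energy_per_task(50, 20)

# Same energy per task: no efficiency win, and you'd need twice
# as many ARM servers to match the Xeon's throughput.
print(xeon_joules, arm_joules)  # both 1000 J
```

Lower power only wins if the slowdown is less than proportional, i.e. if joules per task actually drops.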
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
[citation][nom]ohim[/nom]Slow down intel fanboy , the big profits are not in the high end market as you might think, it`s on the entry part but as usual ppl always go buy from the company that has the fastest producs even though they will not that buy that in particual. AMD/Nvidia > Intel GPU but still Intel outsells each of them in IGP sales. Same about CPUs the entry cheap CPUs makes the profit where AMD is better than Intel but they only lack marketing and huge founds for Advertising.[/citation]

I guess you don't really understand much about microprocessors. AMD's processors are about the same size as Intel's but perform much worse, and they're actually bigger than Intel's 32nm ones.

Therefore, AMD can only compete by charging very little money for their processors, despite them being expensive to make, because they are large.

So, you're wrong. They are the catfish in the pond, sucking up the muck that Intel doesn't go after. Because Intel chips at the same performance level are much cheaper to make, they could destroy AMD's processor division anytime they wanted to, or at least seriously impact it.

So, right now, AMD's processors are not a great source of profit, they might even be losing money on them. Selling a big processor for low cost isn't the way to make money.
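The "big die = expensive chip" point can be sketched with the classic dies-per-wafer approximation. The wafer size, die areas, and wafer cost below are hypothetical round numbers, not real AMD or Intel figures:

```python
# Rough sketch: halving die area more than doubles dies per wafer,
# so per-die cost falls by more than half. Numbers are hypothetical.
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic approximation: gross wafer area over die area,
    minus an edge-loss term for partial dies at the rim."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(gross - edge_loss)

wafer_cost = 5000.0            # hypothetical cost of one 300 mm wafer
for area in (100.0, 200.0):    # hypothetical small vs large die, mm^2
    n = dies_per_wafer(300, area)
    print(f"{area:.0f} mm^2 die: {n} dies/wafer, ${wafer_cost / n:.2f} each")
```

This ignores yield, which drops further as dies get bigger, so the real cost gap between a large die and a small one is even wider than this sketch suggests.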

Also, despite your inane rantings about where the money is, Intel still commands over 80% of the CPU market. So, they are selling the volume, and at a good profit.

AMD is butt-up right now, and NVIDIA doesn't want to join them. More than that, they'd be fighting for the scraps Intel throws AMD. Intel is too formidable right now for NVIDIA to want to compete head-on. Not only are their designs superior, their fabs are the best in the world. NVIDIA will try areas where Intel isn't present, or where it is but sucks (e.g. video, or chipsets for Atom).
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
[citation][nom]ta152h[/nom]That assumes a company is a person, and isn't more capable of doing more than one thing at a time. That's always the fallacy with that type of thinking. They were separate divisions. Also, NVIDIA does not set the standard for GPUs, ATI does. Intel forcing NVIDIA out of the chipset market didn't make ATI make better GPUs, therefore your conclusion that it had a negative effect on Intel's GPUs is fallacious. Also, Intel still sells more GPUs than NVIDIA or AMD. IGPs still dominate the market and probably will more and more.[/citation]

Actually, if you knew what you were talking about you would not have sounded so stupid here. GPUs are actually far more complex than general CPUs. It's a fact the entire computer industry knows: the more specialized a chip is, the more complex it gets. This is especially true with GPUs, since their die area has to be broken up into a larger number of task units.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
Imo some other type of CPU should rise, since x86 is kind of an Intel exclusive... AMD got its break through licensing, though Intel tried to push AMD out of it. It's been made clear over the years that Intel doesn't like competition at all; look at their pricing in areas where they have none... $1000 CPUs... seriously. Who knows, maybe in time another CPU architecture will rise and pose a threat to the current x86, which is reaching its limits anyway...
 

sykozis

Distinguished
Dec 17, 2008
1,759
5
19,865
[citation][nom]jplarson[/nom]It's a real shame that Intel is able to hold onto the x86 licensing... that effectively creates a monopoly. Granted they own the intellectual property of it, but it costs the market competition.[/citation]

Intel controls licensing for x86...and AMD licensing for x86-64. Neither will license to nVidia....nor would I.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Yeah, I kind of miss those nForce 680 boards that were monsters back in the day. They were the true SLI x16/x16 boards. Frankly, Intel and all of the newer LGA1366 boards owe it to Nvidia for bringing out this innovation of full PCIe lane support. Even the P55 can't live up to those standards.
 

milktea

Distinguished
Dec 16, 2009
599
0
18,980
The right moves for Nvidia. They simply can't compete without the CPU market, and pretty soon they'll try to integrate their GPU into the CPU. No offense, but that is just where the future is heading. Discrete GPUs will soon be a thing of the past. *haha*
 
G

Guest

Guest
Will they write an emulator for x86 code? Because every piece of desktop software is written for x86 and will not run on ARM.
 