Nvidia: Expect x86 Processor in 2-3 Years

[citation][nom]jkflipflop98[/nom]NICE! I just got a virus off one of the banner ads on this page. What the hell?[/citation]

Yea my AV went crazy on here a while ago. I would never expect anything like that here.
 
[citation][nom]roofus[/nom]Let me guess..Will it be yet another way to recycle the G92? lol[/citation]

Only if they could. At least it would make the G92 chip dirt cheap, what with mass production and all.

[citation][nom]roofus[/nom]Yea my AV went crazy on here a while ago. I would never expect anything like that here.[/citation]
That's why I changed my hosts file to block most ad domains. It catches most of them, at least.
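For anyone curious, the hosts trick just points known ad domains at a dead address, something like this (the domains below are placeholders, not a real blocklist):

0.0.0.0    ads.example.com
0.0.0.0    banners.example.net
0.0.0.0    tracker.example.org

It's crude compared to a browser extension, but it works for every browser on the machine.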
 
[citation][nom]roofus[/nom]Yea my AV went crazy on here a while ago. I would never expect anything like that here.[/citation]

Get Firefox with the NoScript and Adblock extensions. Works wonders for me.
 
Even though Intel is more than 10x the size of AMD and Nvidia combined, Intel's measure of success is its high profit margin. Any extra competition in any segment of the x86 market could spell disaster for Intel and mean some very unhappy shareholders. Since Intel and Nvidia hold more of a grudge against each other than ever, this makes perfect sense; if nothing else, it pisses off Intel.
 
So what happened to "the GPU will power the PC"?
No matter which way you look at it, most PC programs are written for an x86 CPU, and to enter the CPU market your processor will have to support that instruction set. I would be pissed off if I bought a new CPU and most of my software didn't work because it wasn't x86.
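To illustrate the point with a toy C snippet (purely hypothetical, nothing to do with whatever Nvidia is actually building): code that relies on x86-only instructions won't even build, let alone run, on a chip without that instruction set.

/* Toy example: CPUID is an x86-only instruction, so the x86 path below
   won't even compile for ARM, PowerPC, etc. */
#include <stdio.h>

int main(void) {
#if defined(__i386__) || defined(__x86_64__)
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
    /* Leaf 0: the highest supported CPUID leaf comes back in eax */
    __asm__ volatile ("cpuid"
                      : "=a"(eax), "=b"(ebx), "=c"(ecx), "=d"(edx)
                      : "a"(0));
    printf("x86 CPU detected, max CPUID leaf: %u\n", eax);
#else
    printf("Not an x86 CPU - existing x86 binaries would need emulation here.\n");
#endif
    return 0;
}

That's the hurdle: Nvidia either supports x86 natively (which needs a license) or emulates it, and emulation comes with a big performance hit.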
 
Intel is investing more in a single US Fab than NVidia makes in a year. I love NVidia... they've got some solid GPU designs and they're by no means slouches when it comes to pumping pixels, but it'll take them MANY years and BILLIONS in investment to be able to field a processor that could even compete on the low end of the market - they're not going to be giving AMD and Intel "A run for their money" anytime soon.

 
@curnel_d
You can thank AMD for that massive screwup. If AMD hadn't jumped the gun on 64-bit support when it didn't matter one bit, we would likely have true 64-bit solutions by now.

C'mon... really? And I suppose with Intel at the helm of the 64-bit revolution, we'd have a consumer-affordable "true" 64-bit solution at this very moment--one not based on the under-performing, infrastructurally nightmarish IA-64--and we'd have completely avoided the same (or worse) problems currently facing broader 64-bit implementation and adoption?

Wait...

You know, you're completely right! We'd be SO much better off than we are right now with this rushed, hack-job, x86-64 screw-up designed by AMD--the very extended architecture, mind you, that Intel is licensing instead of pushing its own 64-bit scheme...

Yeah...points, mate... truer words ne'er spoken...
 
I'd love to see those low-power processors being used in a Larrabee-like (parallel) environment!

It'd be cool to have an all-in-one chip on a board the size of a keyboard's number pad, with 5 sockets on it.
Just plug in another CPU chip to work in parallel (up to 3), and plug in up to 2 of the same chips to help with graphics processing!
I'm sure a system like that (drawing around 12W) would have plenty more processing power than current 2GHz Core 2 Duos.
 
So eventually there will be an all-Intel system (Intel CPU + Larrabee), an all-AMD system (AMD CPU + ATI GPU), and an all-Nvidia system (Nvidia GPU + Nvidia x86 CPU).

It's sad that Nvidia doesn't have the cash to just buy out VIA. But then again, most of us already know that this will be a remix of the G92 or G94 core. Nvidia has finally barked up the wrong tree, and since Intel holds the x86 licenses, they'll have to learn to kiss arse to get this x86 design out the door.

But if and when this happens, we'll have competition on a whole new level, which means better prices on complete systems for us consumers.
 
I'm sitting here imagining an Nvidia gaming computer!
3x dual-GPU cards on a 3-way SLI board running an Nvidia CPU, and attached to a local nuclear reactor!
3x dual-GPU = 6x 75W from the auxiliary power connectors + 3x 75W from the PCIe slots = 675W
Nvidia board TDP = dunno, but I suppose 40W or so
Nvidia CPU = 400W probably, and sporting like 5 billion transistors
After all, Nvidia's way of making stuff faster is just adding more of the same stuff.
 
[citation][nom]neiroatopelcc[/nom]I'm sitting here imagining an Nvidia gaming computer! 3x dual-GPU cards on a 3-way SLI board running an Nvidia CPU, and attached to a local nuclear reactor! 3x dual-GPU = 6x 75W from the auxiliary power connectors + 3x 75W from the PCIe slots = 675W. Nvidia board TDP = dunno, but I suppose 40W or so. Nvidia CPU = 400W probably, and sporting like 5 billion transistors. After all, Nvidia's way of making stuff faster is just adding more of the same stuff.[/citation]

That's so true
 