Nvidia CEO On Intel's GPU, AMD Partnership, And Raja Koduri


arielmansur

Reputable
Apr 27, 2015
It's not bad to keep calling them GPUs; it's just that the G doesn't stand for Graphics anymore, it's more like General, depending on the workload.
 


You are right. I remember buying Matrox video cards for my company up until 10 years ago. They were quite competitive with nVidia and ATi at the time.
 

ledhead11

Reputable
Oct 10, 2014
I agree with his statement about how the "G" is misleading at this point. I also agree that these processors have evolved to unprecedented complexity compared to CPUs. Let's face it, Intel and AMD CPUs haven't truly advanced nearly as fast as graphics cards over the last 10 years, Moore's law in full effect.

I think if Intel really manages to produce something with quality performance and pricing, it should be a win-win for us consumers. NV's dominance is kind of scary (even though I love my Ti).
 

termathor

Distinguished
Jan 16, 2015
"I'm probably showing my age, but i do remember companies such as 3DFX, S3, Matrox and even Intel that all produced cards less than 20 years ago. http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-9.html"

3DFX was acquired by Nvidia in the early 2000s, I think. It was probably part of Nvidia's foundation in terms of IP....
Matrox seems to still be in business, though.
S3? Was that not a 3DFX product....?
 
Huang prominently repeated the statement that "We are a one architecture company."

While that has nice-sounding implications, it also means a few other things:
* We are almost a one-size-fits-all company. If our tech doesn't work for you, you should reconsider what you're doing.
* We are in deep trouble if something else comes out and it can beat everything we have in the pipeline for the next 5 years, and what we have in the early stages for years beyond.


More than one architecture can mean the following:
* If one tech becomes obsolete overnight, we can weather the change easier.
* If one tech doesn't fit your application, another one might.
* We can have various techs more focused on a specific task instead of trying to be a jack-of-all-trades.


{...} stating, "We support our software for as long as we shall live." {...} Huang also pressed the point that investing in five different architectures dilutes focus and makes it impossible to support them forever, which has long-term implications for customers.
And as pointed out earlier.... this is a bit of a lie... unless you decide to redefine it as software only and exclude the underlying hardware entirely.... Otherwise they had best still be supporting their old Riva series, which we know they are not. (It's still good enough for basic desktop use in offices and such where there are no heavy graphics or compute needs.)
 

kinggremlin

Distinguished
Jul 14, 2009


Maxwell is hardware, not software, chief. And DirectX is not Nvidia's software.

This entire article is about non-graphical workloads for Nvidia's hardware and the software ecosystem Nvidia has developed to utilize it. Saying they will support the software indefinitely doesn't imply hardware you bought 5 years ago will work with whatever version of Windows exists in 2030.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015


Oops, glaring omission on my part. Thanks for the heads up! Fixed.
 


The last time I checked, Nvidia still has Windows 10 driver releases for GPUs dating back to the GeForce 400 series (388.13, released 10/30/2017). Whether or not they make enhancements to those older GPUs for the latest games is another matter. But it's a moot point anyway, since many GPUs several generations old - especially what were considered mid-range at the time - don't even meet the minimum gaming specs of today's AAA titles.

But besides that, I know of no serious PC/gaming enthusiast who keeps a GPU for more than three years at the most in a primary gaming system. The exception is adding a second GPU for SLI or CrossFire to keep an older GPU alive in the latest games (and then there's the trend of decreasing multi-GPU support in games). The four-and-a-half-year-old GTX 770? It's now only as fast as a 1050 Ti yet uses three times the power. Same with a four-year-old R9 280X.

Faster performance, lower cost in future generations, more VRAM (which games always demand more of), and lower power consumption all eventually outdate previous-generation GPUs, whether or not drivers and game-specific enhancements are kept up to date.
 

andrewsmagnoni

Prominent
Nov 10, 2017
Intel and Microsoft are famous for abandoning technology.
If the revenue isn't good, they abandon the whole user base and create another one.
Apple and Nvidia think before doing something.
The SDK in macOS is the same one from the first version of OS X, and Nvidia's SDK is the same way.
Developers are always having to migrate SDKs... The only thing Intel hasn't abandoned is the x86 architecture.
 

Mathias_8

Commendable
Apr 8, 2016
The term GPU was first introduced for graphics chips with Transform and Lighting capability (for polygons/3D, until then the CPU's job), then later Vertex Shaders, and finally simply "Shaders" (which do all kinds of stuff). Nvidia's GeForce was the first GPU; until then graphics cards basically just did textures, filtering, lighting (I think) and AA.
 

andrewsmagnoni

Prominent
Nov 10, 2017
They will abandon this sector within five years.
Intel and Microsoft are famous for abandoning things.
They don't think before doing something.
Apple and Nvidia are still using the same SDK, updated since the first version of the architecture.
I've lost count of how many times developers have needed to migrate to another SDK to develop for Windows, or of how many divisions Intel has closed.
 


I have no idea what you are talking about. I've been a PC builder with Intel/Microsoft builds for 20 years. I can only recall one time where I got abandoned in a build by Intel: when I went with a RAMBUS-memory Pentium 4 build. Within six months they introduced DDR memory support with a new chipset, and I was left without a memory upgrade path.

And as one who has had every Windows OS starting with 3.11, having bought a Compaq 486SX PC in 1992, I have never felt "abandoned" by them. Of course, I'm one of those who expects software to advance and accepts that an 8-year-old OS is outdated.
 

tedstoy

Prominent
Nov 10, 2017
SoftBank owns ARM and a large chunk of Nvidia, both direct competitors to AMD and Intel. Is it any wonder the two x86 makers are getting together to fight a common competitor?
 

cinergy

Distinguished
May 22, 2009
That douche is just advertising his own GPUs whenever he gets the chance. Hey Mr. Huang, I won't be buying your chips, ever. I'd rather get Intel's GPUs if AMD dies.
 


Examples of "bashing", please. AMD fanboys accuse Anandtech of that too whenever they see a review of an AMD product that isn't jumped up and down for in sky-high praise :pfff:. My definition of "bashing" does not include declining to give false praise or to omit negatives just to avoid offending the fanboys of AMD, Nvidia, or Intel for that matter.

The fact that you don't like a fair and balanced review, and are so blinded by fanboyism that you want others to think as you do, is not what a professional website review is about. We expect honest, impartial judgement of a new product, and we get it. Don't like it? Then you can always start your own website and write your own AMD-biased, Intel- and Nvidia-bashing reviews.

I seem to recall both websites singing the praises of Ryzen and the RX Vega 56. On the other hand, both websites were less enthused about AMD's RX Vega 64 GPU (and rightfully so). <---that triggered the fanboys
 

sykozis

Distinguished
Dec 17, 2008


S3 had no relation to 3DFX. S3 was bought out by VIA and later became known as SonicBlue before becoming S3 again. S3 is known for the Savage and ViRGE lines of graphics processors and, more recently, the DeltaChrome and Chrome series graphics cards. The last thing they were doing was enterprise embedded graphics chips.
 

bit_user

Polypheme
Ambassador
they are the most complex processors built by anybody on the planet today.
This is nonsense. GPUs are built by replicating large numbers of identical, fairly simple computing elements. Sure, there are a lot of clever tweaks in how those elements are designed and in the bits of hardware that tie them all together, but making a fast x86 processor is still much more challenging and requires more unique design elements.


It's precisely because process-driven gains have petered out that x86 CPUs have had to become ever more sophisticated to eke out the meager gains we've seen. Some things x86 CPUs do that GPUs don't:

* out-of-order execution
* x86 -> micro-op translation
* branch prediction
* speculative execution
* register alias analysis & renaming
* multiple privilege levels
* memory encryption
* instruction-accurate traps
* full backwards compatibility, in hardware
Intel CPUs also do optimization at the micro-op level, as well as lock optimization (hardware lock elision) that can even roll back the architectural state of the CPU in the case of a lock collision.

Meanwhile, GPUs have (mostly) just had to worry about how to efficiently scale up to ever greater numbers of execution units. More recently, they're also having to worry about power efficiency, due to ever greater leakage with smaller process nodes, and here Nvidia seems to have done a better job than AMD.
 