Nvidia's GPU Technology Conference Liveblog 10:30p/1:30e


EzioAs

Distinguished


Good one. They'll have to answer that at the conference.
 

andrewfy

Distinguished
Nov 9, 2010
14
0
18,510
GPUs are basically just processors with wide SIMD. Nvidia's GPUs use CUDA, which is easier to program in some ways (and harder in others) than the assembly-style intrinsics that CPU SIMD usually has to be programmed with. The Daily Circuitry did some analysis on this back in November -
http://www.dailycircuitry.com/2011/11/128-bit-simd-is-dead-long-live-128-bit.html
- the SIMD in CPUs stagnated for a long time at 128 bits but recently extended to 256 bits with AVX, and now 128-bit is really dead. The SIMD in GPUs was recently 512 bits wide (now 1024 bits wide), so the width advantage of GPUs is sometimes as low as 2x depending on where each side is in its product cycle - and this is where most of the advantage of GPUs comes from (along with the memory bandwidth needed to keep the SIMD units fed).
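To make the contrast concrete, here is a rough sketch (my own illustration; the function names and launch parameters are made up for the example) of the same trivial "multiply every element by 2" loop written both ways:

[code]
// CUDA: you write scalar-looking code per thread, and the hardware
// schedules it across the wide SIMD units for you.
__global__ void scale(float *a, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        a[i] *= 2.0f;
}
// host side: scale<<<(n + 255) / 256, 256>>>(dev_a, n);

// CPU SIMD: 128-bit SSE intrinsics, 4 floats per instruction. This is
// the "assembly-style" programming referred to above.
#include <xmmintrin.h>
void scale_sse(float *a, int n)
{
    __m128 two = _mm_set1_ps(2.0f);
    int i = 0;
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(a + i, _mm_mul_ps(_mm_loadu_ps(a + i), two));
    for (; i < n; ++i)  // scalar tail for the leftover elements
        a[i] *= 2.0f;
}
[/code]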
 

DRosencraft

Distinguished
Aug 26, 2011
743
0
19,010


Simple answer: most people don't need it, and those who do will buy a professional-series GPU that doesn't need all the other stuff in a consumer card. They will likely be announcing new Quadros. I know a lot of people who work in 3D applications and in graphic design who are very interested to see what a new series of Quadro cards can do.
 
[citation][nom]DRosencraft[/nom]Simple answer: most people don't need it, and those who do will buy a professional-series GPU that doesn't need all the other stuff in a consumer card. They will likely be announcing new Quadros. I know a lot of people who work in 3D applications and in graphic design who are very interested to see what a new series of Quadro cards can do.[/citation]
While this is true, and most professionals ought to be on Quadros instead of GeForce cards, there is still a market for people like me who opted for a relatively cheap $380 GTX 570 (on sale when I got it, currently $285) instead of the much more expensive $750 Quadro (again, at the time, while that card now goes for $430). I am on that border between 'extreme hobbyist' and 'entry-level professional', where I do production work on my machine to the level where it is helpful to have the realtime CUDA features while editing, but I cannot simply spend money on whatever parts I want (besides, I do a fair amount of gaming and such as well, which is better on the GeForce side). I am not complaining a whole lot, as the 570 meets my needs at the moment, but next time I upgrade I would like to know that there is an in-between card that is still consumer focused but has a few pro features.
 
@DRosencraft
Put it another way: what about a product for the growing number of people doing video reviews on YouTube and other such sites? They do a fair bit more editing than I do, yet as most of them are unpaid (or not well paid when getting started), something like a 570 with both gaming and editing features would be most helpful, and would open up a lot more options for them without having to spend an arm and a leg. Once they get going and start raking in a bit of money, then absolutely, Quadro is the way to go. But to start out on a budget the 570 was an excellent option at the time, and it does not look like it will be replaced on the nVidia side (though it looks like AMD is picking up the slack).
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
Why try to sell something when you don't even have a product? I have been waiting for these cards for a long time now, and yet they are still out of stock everywhere....
 
[citation][nom]caedenv[/nom]@DRosencraft Put it another way: what about a product for the growing number of people doing video reviews on YouTube and other such sites? They do a fair bit more editing than I do, yet as most of them are unpaid (or not well paid when getting started), something like a 570 with both gaming and editing features would be most helpful, and would open up a lot more options for them without having to spend an arm and a leg. Once they get going and start raking in a bit of money, then absolutely, Quadro is the way to go. But to start out on a budget the 570 was an excellent option at the time, and it does not look like it will be replaced on the nVidia side (though it looks like AMD is picking up the slack).[/citation]

If you can do it with OpenCL instead of CUDA, then AMD has excellent options for that market in the Tahiti-based cards.
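Just to illustrate (my own sketch, and the kernel name is made up): the same kind of kernel in vendor-neutral OpenCL C, which runs on Tahiti just as well as on Nvidia. The host setup (clCreateProgramWithSource, clBuildProgram, clEnqueueNDRangeKernel and so on) is omitted for brevity:

[code]
// A trivial "scale by 2" kernel in OpenCL C. Unlike CUDA, the identical
// source runs on AMD (Tahiti) and Nvidia alike. It is normally embedded
// as a string and compiled at run time by the OpenCL driver.
const char *scale_src =
    "__kernel void scale(__global float *a, const int n) \n"
    "{                                                   \n"
    "    int i = get_global_id(0);                       \n"
    "    if (i < n)                                      \n"
    "        a[i] *= 2.0f;                               \n"
    "}                                                   \n";
[/code]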
 

dennisburke

Distinguished
May 12, 2008
100
0
18,680
I was able to watch the live keynote at Nvidia, and all I have to say is wow. I'm still happy with my Fermi for my PC needs at the moment, but if this cloud gaming takes off, I'll probably put my old 8600GT back in my computer. Not sure if Nvidia is shooting themselves in the foot here. Other than that, I have to admire Nvidia's vision for the future. Wish I had bought stock three years ago.
 

vitornob

Distinguished
Jun 15, 2008
988
1
19,060
[citation][nom]computernerdforlife[/nom]Diablo 3 hardcores will never read this article today. Guess where they'll be living for the next few days/weeks/months/years?[/citation]

I'm a Diablo 3 hardcore and I'm reading this! (Of course, I'll not comment on the server maintenance running through the next half hour...)

:)
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
hmmm... and to think I got thumbed down just a couple of days ago for even suggesting that Nvidia might officially announce gk110 at their GPU Tech Conference. I mean, seriously, right? Nvidia announcing their Kepler-derived, compute-oriented GPU at a conference targeted specifically at GPGPU computing? How ridiculous.

http://www.tomshardware.com/forum/15606-55-nvidia-announces-quarterly-results-profits-dropped

... and the guy who claimed that "there is no gk110" got thumbed up 19+. Some of the users in the Tom's Hardware community just never cease to amaze me. But I just have to ask, where are you guys now?

http://www.nvidia.com/content/PDF/kepler/NV_DS_Tesla_KCompute_Arch_May_2012_LR.pdf
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]caedenv[/nom]If they are such firm believers in these other uses, then why did they take out a lot of that functionality from the new Kepler series?[/citation]
As DRosencraft already suggested, Nvidia didn't remove compute functionality from Kepler; in fact, they've expanded it. It's all still there, and the Kepler architecture is designed for compute performance and efficiency. Some compute functionality is just severely limited in gk104.
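For what it's worth, you can see this from CUDA itself: gk104 still reports compute capability 3.0, so the compute feature set is present; it's mainly the FP64 throughput that's cut down to a small fraction of the FP32 rate. A minimal sketch using the standard runtime API (nothing here is gk104-specific):

[code]
// Minimal CUDA runtime query: prints the compute capability that the
// installed GPU reports. A GTX 680 (gk104) shows up as 3.0.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    cudaDeviceProp p;
    cudaGetDeviceProperties(&p, 0);  // properties of device 0
    printf("%s: compute capability %d.%d, %d multiprocessors\n",
           p.name, p.major, p.minor, p.multiProcessorCount);
    return 0;
}
[/code]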
 
[citation][nom]dragonsqrrl[/nom]hmmm... and to think I got thumbed down just a couple days ago for even suggesting that Nvidia might officially announce gk110 at their GPU Tech Conference. I mean seriously right? Nvidia announcing their Kepler derived compute oriented GPU at a conference targeted specifically at GPGPU computing? How ridiculous.http://www.tomshardware.com/forum/ [...] ts-dropped... and the guy who claimed that "there is no gk110" got thumbed up 19+. Some of the users in the Tom's Hardware community just never cease to amaze me. But I just have to ask, where are you guys now?http://www.nvidia.com/content/PDF/ [...] 012_LR.pdf[/citation]

[citation]I think there is no gk110. It looks like releasing the gtx680 as gk106 is a marketing trick. It looks to me like Nvidia is doing paper launches and selling rumors about a powerful gk110 to convince as many people as possible to wait. They are trying to buy some time so that they can resolve supply issues without losing customers. I just wonder why they would hold off a gk110 gpu that could literally kill AMD? Some say that there is no point... gk106 is powerful enough. But that's BS. GTX680 is great but is nothing more than what gtx580 was compared to 6970 and what gtx480 was compared to 5970...

What I am trying to say is that it would be stupid not to use your advantages and let competitors catch up. Unless your competitors are 2 or 3 generations behind like AMD is to intel, but that's not the case in the AMD-Nvidia head-to-head race.[/citation]

That guy clearly said he thinks there is no GK110, not that there will never be a GK110. Beyond that, he is clearly talking about there not being a consumer card with GK110, i.e. no GTX 680 TI, 685, or 685 TI with it, not about there not being a compute card based on it. You sure are quick to criticize someone when what they said is not even in the same context as what you're talking about! Besides, he was wrong about pretty much everything else in his comment; I don't understand why he was voted up like this...

For example, the difference between the GTX 680 and the Radeon 7970 (at 4MP resolutions, the resolutions where you would most likely use these cards) is much smaller than the difference between the GTX 580 and the Radeon 6970 at any resolution. The GTX 680 uses GK104, not GK106. And the GTX 480 was slower than the Radeon 5970; the two were not directly comparable anyway, because the 5970 is a dual-GPU card whereas the 480 is a single-GPU card.

It's funny that he was voted up, despite there being at least three or four people who called him out on his many mistakes. Regardless, he was still not even talking about the compute market, just the consumer gaming market, so you were still wrong about what he said.
 
[citation][nom]dragonsqrrl[/nom]As DRosencraft already suggested, Nvidia didn't remove compute functionality from Kepler; in fact, they've expanded it. It's all still there, and the Kepler architecture is designed for compute performance and efficiency. Some compute functionality is just severely limited in gk104.[/citation]

Nvidia's GK104-equipped GTX 680 and GTX 670, despite being more than 50% faster than the GTX 580 in gaming, are only a little more than half as fast as the 580 for DP compute. The regular Kepler cores used in the consumer GPUs aren't even capable of DP compute, just SP. The consumer GPUs only have DP functionality at all because they include a few dedicated DP cores that do nothing but DP math, so they don't help gaming performance or other SP performance at all.

The architecture of the consumer Kepler cores is not designed well for compute; GCN beats it greatly. For example, the 7970 is about 50% faster than the 680 for SP math, almost six times faster than the 680 for DP performance, and about three times faster than the 580 for DP performance. GCN is designed for compute and does it better too. I can't say the same for the compute-oriented version of Kepler because I have yet to see benchmarks for it, but it's obviously better at compute than the consumer version. Regardless, to say that Kepler is good for compute when GCN beats it so badly just seems wrong. We'll have to see how the Pro versions of Kepler and GCN do against each other, but I have to say, I'm not seeing Kepler beat GCN.
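To put rough numbers on that, here is a back-of-the-envelope sketch from the published specs. These are theoretical peaks, not measured results, so the exact multiples will differ (if anything, the theoretical DP gap between the 7970 and the 680 comes out even larger than six times):

[code]
/* Back-of-the-envelope theoretical peaks (2 FLOPs per core per clock
   via fused multiply-add, DP capped at a fixed fraction of SP rate).
   Clocks and core counts are the published reference specs. */
#include <stdio.h>

int main(void)
{
    struct { const char *name; int cores; double ghz; int dp_div; } g[] = {
        { "GTX 580", 512,  1.544, 8  },  /* Fermi GF110: DP = 1/8 SP  */
        { "GTX 680", 1536, 1.006, 24 },  /* GK104:       DP = 1/24 SP */
        { "HD 7970", 2048, 0.925, 4  },  /* Tahiti GCN:  DP = 1/4 SP  */
    };
    for (int i = 0; i < 3; ++i) {
        double sp = g[i].cores * g[i].ghz * 2.0;  /* GFLOPS, single precision */
        printf("%s: ~%.0f GFLOPS SP, ~%.0f GFLOPS DP\n",
               g[i].name, sp, sp / g[i].dp_div);
    }
    return 0;
}
[/code]

That works out to roughly 198 GFLOPS DP for the 580 versus 129 GFLOPS DP for the 680, which is where the "little more than half as fast" comes from.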

Nvidia will need to beat AMD with the software instead of the hardware, and really, that's not unlikely, but AMD still seems to have the better hardware, at least with what little info we have now.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]blazorthon[/nom]Regardless, he was still not even talking about the compute market, just the consumer gaming market, so you were still wrong about what he said.[/citation]
I'm sorry, but I have to disagree. I don't think that's what he was suggesting. Yes, he does speak within the context of gaming, but not because he thinks a gk110-based gaming card, in particular, doesn't exist. He implied that there probably isn't a gk110 (he makes no mention of a GTX680TI or GTX685), and questions the existence of the GPU itself because "gk106" (...) already provides adequate high-end performance in games. Basically, Nvidia doesn't have a gaming-oriented incentive to produce a higher-end Kepler derivative, so why would they? He even manages to toss in a conspiracy theory for good measure, explaining the very existence of the gk110 rumors.

And I'm not sure I understand the distinction in your first sentence. At least to me, they both seem to imply the same thing. Although I suppose you could make the argument that because gk110 is still in development, it therefore does not yet exist. But again, I don't think that's what he was trying to say.
 
[citation][nom]dragonsqrrl[/nom]I'm sorry, but I have to disagree. I don't think that's what he was suggesting. He implied that there probably isn't a gk110 (he makes no mention of a GTX680TI or GTX685), and questions the existence of the GPU itself because "gk106" (...) already provides adequate high-end performance in games. Basically, Nvidia doesn't have a gaming-oriented incentive to produce a higher-end Kepler derivative, so why would they? He even tosses in a conspiracy theory explaining the gk110 rumors. And I'm not sure I understand the distinction in your first sentence. At least to me, they both seem to imply the same thing. Although I suppose you could make the argument that because gk110 is still in development, it therefore does not yet exist. But again, I don't think that's what he was trying to say.[/citation]

Everything that he said was in the context of gaming cards. Every card mentioned in his post is a gaming card, and the comparisons were all of gaming performance. Disagree if you want, but everything he said had everything to do with gaming and nothing to do with compute. Beyond that, saying that he thinks there will be no GK110 is very different from flatly stating that there will be no GK110. By saying that he thinks there won't be one, rather than asserting it outright, he admits that it is possible. He basically said that it is unlikely, not that it is impossible.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290

oh good, just revised my comment...
 


I see your point, but I still think that he was only referring to gaming. Remember, a lot of gamers don't even realize that GPUs are used for purposes other than gaming, let alone that those are often similar to, or more or less the same as, the GPUs used in some consumer cards. Considering the many things wrong with his post, I'm going to give him the benefit of the doubt on this one and assume that he didn't even realize his post could be interpreted as saying that there won't be GK110s used in compute cards. See what I'm getting at?
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]blazorthon[/nom]Nvidia's GK104-equipped GTX 680 and GTX 670, despite being more than 50% faster than the GTX 580 in gaming, are only a little more than half as fast as the 580 for DP compute. [...] Regardless, to say that Kepler is good for compute when GCN beats it so badly just seems wrong. [...] Nvidia will need to beat AMD with the software instead of the hardware, and really, that's not unlikely, but AMD still seems to have the better hardware, at least with what little info we have now.[/citation]
Wow... we really like to argue semantics. True, true, and true, but again, I don't think that's what he was saying. Unless I'm mistaken, "Kepler" doesn't refer to any specific GPU in the lineup; both gk110 and gk104 are derivatives of the Kepler architecture. So I don't think that simply saying a lot of compute functionality was removed from the "Kepler series" is accurate.
 