Nvidia Announces Quarterly Results, Profits Drop 55%

[citation][nom]dragonsqrrl[/nom]So let me get this straight... first, you're assuming that Nvidia has gk110 ready to go, and is simply delaying its launch in order to milk as much out of gk104's current high-end position for as long as possible. And because it would somehow be more beneficial for Nvidia to simply churn the rumor mill about a phantom high-end compute GPU than to actually develop one, gk110 therefore doesn't exist. I have to agree with Marcus52; your reasoning just doesn't make sense. There are so many reasons for Nvidia needing a GPU like gk110. There are also far more likely explanations than a conspiracy for why it hasn't been officially announced yet. For instance, the difficulty of manufacturing such a large and complex GPU on a new fabrication process. It probably simply isn't ready yet. There are also so many leaks and rumors circulating about this GPU that I think most people would find it more difficult not to believe in its existence.[/citation]

You misunderstood me. Please read the first sentence of my post: "I think there is no gk110," and also the line "Nvidia is selling rumors about gk110."

I totally agree with you: either GK110 is not ready yet, or there is no GK110 at all.

Also, I'm sorry for my spelling and grammar, but I'm not a native English speaker.
 
This kind of news is so misleading... note the keyword PROFITS. It means they made only about half the profit they made last quarter, but the end result is that they still made a profit. Keep in mind that profit is any money you make above the cost of operation, so even after a 55% drop, whatever is left of 10, 20, maybe 30 million or more is still millions of dollars made this quarter. I wonder how many people they will lay off over this. That is usually what companies do; they consider it a loss if their profits don't grow every quarter. It's kind of ridiculous and often used as an excuse to lay employees off.
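
To put that arithmetic in perspective, here is a minimal sketch with purely hypothetical numbers (the post doesn't cite the actual figures, so these are assumed for illustration only):

```python
# Purely hypothetical figures: what a 55% profit drop actually leaves behind.
last_quarter_profit = 130_000_000  # assumed prior profit in USD, not a reported number
drop = 0.55                        # the 55% decline from the headline

this_quarter_profit = last_quarter_profit * (1 - drop)
print(f"Remaining profit: ${this_quarter_profit:,.0f}")  # ~$58,500,000 -- still a profit
```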
 
I was willing to buy the best Kepler GeForce, and I have the money, but Nvidia gave us a crippled 660 and named it the 680.

I'll keep waiting for the next "generation." That's money Nvidia will miss out on this year.
 
I highly doubt the expensive desktop cards keep Nvidia in business. That's probably why they are turning their focus to mobile and not desktops. PC sales are not where the growth is; the mobile and tablet markets will be the main focus.
 
[citation][nom]tmk221[/nom]I think there is no gk110. It looks like releasing the gtx680 as gk106 is a marketing trick. It looks to me like Nvidia is doing paper launches and selling rumors about a powerful gk110 to convince as many people as possible to wait. They are trying to buy some time so that they can resolve supply issues without losing customers. I just wonder why they would hold off a gk110 GPU that could literally kill AMD? Some say that there is no point... gk106 is powerful enough. But that's BS. The GTX680 is great, but it is nothing more than what the gtx580 was compared to the 6970 and what the gtx480 was compared to the 5970... What I am trying to say is that it would be stupid not to use your advantages and let competitors catch up, unless your competitors are 2 or 3 generations behind, like AMD is to Intel, but that's not the case in the AMD-Nvidia head-to-head race.[/citation]
The 5970 is a dual-GPU card.
 
Supply is Nvidia's problem; their products are good and their prices are competitive. As a user of both card manufacturers' products, I also have to say that Nvidia provides better drivers. They have quicker updates when new titles are released, as well as ambient occlusion that adds some depth to the picture.
 
[citation][nom]rantoc[/nom]Supply is Nvidia's problem; their products are good and their prices are competitive. As a user of both card manufacturers' products, I also have to say that Nvidia provides better drivers. They have quicker updates when new titles are released, as well as ambient occlusion that adds some depth to the picture.[/citation]
Not sure I would go that far. I had to wait over nine months after Vista SP1 (I waited until SP1 to try Vista) before Nvidia got a stable 8800GTX driver (the beta drivers would fix one thing and break another). At first I thought, damn you, Vista, but for the hell of it I tried my old X1900XT, and guess what: all the problems went away (but I sure missed the 8800GTX performance).

Ambient occlusion works on cards from both companies, just so you know; ATI just did not advertise it as a feature. It is kind of like saying ATI invented using multiple monitors as one large display for games (Eyefinity), when really, back in the day, both Nvidia and ATI had that (though with two screens only) before it went away for a while.

Both companies have about the same driver quality these days (it may not have been the same in the early days, but it is now); some issues just seem to take longer to get fixed.
 
[citation][nom]tmk221[/nom]I think there is no gk110. It looks like releasing the gtx680 as gk106 is a marketing trick. It looks to me like Nvidia is doing paper launches and selling rumors about a powerful gk110 to convince as many people as possible to wait. They are trying to buy some time so that they can resolve supply issues without losing customers. I just wonder why they would hold off a gk110 GPU that could literally kill AMD? Some say that there is no point... gk106 is powerful enough. But that's BS. The GTX680 is great, but it is nothing more than what the gtx580 was compared to the 6970 and what the gtx480 was compared to the 5970... What I am trying to say is that it would be stupid not to use your advantages and let competitors catch up, unless your competitors are 2 or 3 generations behind, like AMD is to Intel, but that's not the case in the AMD-Nvidia head-to-head race.[/citation]

There are so many things wrong with this post... First of all, GK110 is a second-generation Kepler GPU, so it would be in a GTX 7xx or GTX 8xx card (if the 7xx ends up being just an OEM series). Second, the GTX 680 has GK104, not GK106. GK106 is in the GTX 660, or maybe the GTX 660 Ti, and one or two other cards below them. Third, the GTX 480 was beaten by the 5970, but the 5970 was a dual-GPU card and the 480 was a single-GPU card. The 480 and 470 competed against the 5870, with the 480 a little ahead of the 5870.
 
[citation][nom]ojas[/nom]Well, the 680 was likely going to be released as a 670 Ti; I'm pretty much convinced of this now (after considering the evidence), which is also why the performance difference between the two cards is so small. There's a good chance they have the "true" 680 in the works, but they probably don't have it ready yet, and they wanted to look good at launch. Also, seeing that the 680/670 are based on GK104, I have a strong feeling that they'll re-release the 680 as the 760 Ti and the 670 as the 760, just with the chips renamed GK114. I somehow doubt that GK110 will come out of hiding before the 780 is launched. Just like what they did with Fermi.[/citation]

The 680 and 670 are so close because of their VRAM bandwidth bottleneck. The difference in GPU power is far greater than the difference in performance, so there is a memory bottleneck. The same phenomenon can be demonstrated by taking Llano and overclocking the GPU. No matter how far you overclock that GPU, there is hardly any difference in performance unless you have faster than 2133MHz memory, simply because of the memory bandwidth bottleneck. Not enough bandwidth means that data can't be sent to and from the GPU fast enough, so the GPU isn't being efficiently utilized.

For example, overclocking the 6550D of an A8 from 600MHz to 960MHz (a 60% overclock) only turns in a 15% or so performance improvement with 1600MHz dual-channel memory (1866MHz is a little better, but not by much; increasing the memory bandwidth increases the performance improvement you get from the overclock).
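
Here is a minimal roofline-style sketch of that bottleneck. The scaling constants are made up, chosen only to land in the same ballpark as the Llano numbers above; they are not measured values:

```python
# Toy roofline model: achieved performance is capped by whichever is lower,
# raw shader throughput or what the memory bus can feed the GPU.
# perf_per_mhz and perf_per_gbs are illustrative constants, not measurements.

def achieved_perf(gpu_clock_mhz, mem_bandwidth_gbs,
                  perf_per_mhz=1.0, perf_per_gbs=27.0):
    compute_limit = gpu_clock_mhz * perf_per_mhz        # what the shaders could do
    bandwidth_limit = mem_bandwidth_gbs * perf_per_gbs  # what memory can feed them
    return min(compute_limit, bandwidth_limit)

base = achieved_perf(600, 25.6)         # 600MHz GPU, dual-channel DDR3-1600 (~25.6 GB/s)
overclocked = achieved_perf(960, 25.6)  # 60% GPU overclock, same memory
faster_mem = achieved_perf(960, 34.1)   # same overclock with DDR3-2133 (~34.1 GB/s)

print(f"60% GPU overclock alone:   {overclocked / base - 1:+.0%}")  # roughly +15%
print(f"overclock + faster memory: {faster_mem / base - 1:+.0%}")   # much larger gain
```

The point is just that once the bandwidth limit binds, extra GPU clock is wasted; raising memory bandwidth is what lets the overclock show up in performance.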

Also, with Fermi, they completely redesigned GF104 when they made GF114. The two chips have similar hardware, just redesigned, so it would not be a re-release; they are in fact two distinct chips. GF100 and GF110 are more similar, but still not the same. GF110 had some big power-efficiency improvements that allowed it to enable its full complement of hardware while still using less power than GF100 did.
 


Nvidia is the only company to have released a driver that could kill cards (196.75, or something like that). They don't have better drivers; their drivers simply have different problems. For example, I've had artifacts caused by a driver update on some of my Nvidia cards while the AMD/ATI cards had no such problem at the time. Other times, AMD/ATI drivers had such problems whereas the Nvidia drivers did not. Both companies have problems every now and then.
 
It's unfortunate that NV is bottlenecked by production of its new GPUs. I can understand why they offered an olive branch to Intel for use of its fabs, but NV burned that bridge long ago. Hope things turn up so we can keep getting amazing cards for our gaming! :)
 
1. There are too many AMD fanboys on Tom's Hardware.
2. Their cards get hot, which is the reason I switched to AMD for my current desktop. (I've noticed AMD gets hot too, but not as bad.) My main laptop came with AMD, and it was the only option at the time.
3. What hurts for the future is that all three consoles will use AMD, as the PS4 will be AMD as well.

I might go back to Nvidia next time I buy, since I like CUDA and use Adobe software a lot, so those features can be enhanced with Nvidia.
 
Maybe they should make NEW cards with a real performance increase, instead of everything being +10% or so. I mean, the difference between the 590 and 690 is not worth $1000. Even if it were, who the hell is buying video cards now, when the Xbox 720 and PS4 must be coming soon? I hope they are, because we're stuck where we are on graphics due to consoles.
 
[citation][nom]USHypertraxx[/nom]Maybe they should make NEW cards with a real performance increase, instead of everything being +10% or so. I mean, the difference between the 590 and 690 is not worth $1000. Even if it were, who the hell is buying video cards now, when the Xbox 720 and PS4 must be coming soon? I hope they are, because we're stuck where we are on graphics due to consoles.[/citation]

The 690 is about twice as fast as the 590, which is generally about as fast as, or a little faster than, the 680, a $500 card. The 690 is also much more energy-efficient than any other card, so if its level of performance is what you're looking for, it saves you money compared to any other such setup.

Besides, with the next generation of consoles supposedly having only a modified 6670 for graphics, they aren't a big enough upgrade over the current generation to say that they will change everything. PCs will still be years ahead of the consoles, so unless some game developers get off their asses and make native PC games that are truly well coded and optimized for the platform, we will still be stuck with most games either being indie games (good games, but generally with less-than-incredible graphics) or console ports (sometimes decent graphics, but horribly inefficient coding, and many just aren't good games anyway).
 
[citation][nom]xophaser[/nom]1. There are too many AMD fanboys on Tom's Hardware. 2. Their cards get hot, which is the reason I switched to AMD for my current desktop. (I've noticed AMD gets hot too, but not as bad.) My main laptop came with AMD, and it was the only option at the time. 3. What hurts for the future is that all three consoles will use AMD, as the PS4 will be AMD as well. I might go back to Nvidia next time I buy, since I like CUDA and use Adobe software a lot, so those features can be enhanced with Nvidia.[/citation]

Photoshop also takes advantage of AMD cards through OpenCL. The 7950 or 7970 would probably speed that up more than any consumer Nvidia card would.
 
The Kepler cards came too late in the quarter to make a tangible difference, and the lack of availability also hurt sales. Nvidia probably lost a good percentage of its sales as enthusiasts just couldn't wait two months and went for the 7970 instead.
 
Nvidia has baffling naming conventions for their GPUs: GK104 is faster than their GK106, and their supposed GK110 is faster than the GK104...
 
[citation][nom]slabbo[/nom]Nvidia has baffling naming conventions for their GPUs: GK104 is faster than their GK106, and their supposed GK110 is faster than the GK104...[/citation]

GK100, not GK110. A 1 between the leading 1 and the defining number (0, 4, 6, 7, etc.) means it is a second-generation chip for that architecture (e.g., GF110 in the GTX 580 was the second-generation Big Fermi chip, with GF100 in the GTX 480 being the first-generation Big Fermi GPU). GK100 is the first-generation Big Kepler GPU, and GK110 is the second generation of it. Also, as the third number in Nvidia's GPU codenames increases, the performance drops. It's not skewed, just in reverse: GK104 beats GK106, GK106 beats GK107, etc.
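
As a rough illustration of that decoding scheme, here is a toy sketch; the mapping follows this post's description, not any official Nvidia documentation:

```python
import re

# Toy decoder for codenames like GF110 or GK104, following the scheme
# described above: middle digit = generation revision, last digit = tier
# (lower tier digit = bigger, faster chip). Illustrative only.

ARCHITECTURES = {"F": "Fermi", "K": "Kepler"}

def decode(codename):
    m = re.fullmatch(r"G([A-Z])1(\d)(\d)", codename.upper())
    if not m:
        raise ValueError(f"unexpected codename format: {codename}")
    arch, gen, tier = m.groups()
    return {
        "architecture": ARCHITECTURES.get(arch, arch),
        "generation": int(gen) + 1,  # GF100 = 1st gen, GF110 = 2nd gen
        "tier": int(tier),           # GK104 outperforms GK106, which outperforms GK107
    }

for name in ("GF100", "GF110", "GK104", "GK106", "GK110"):
    print(name, decode(name))
```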
 
[citation][nom]tmk221[/nom]I think there is no gk110. It looks like releasing the gtx680 as gk106 is a marketing trick. It looks to me like Nvidia is doing paper launches and selling rumors about a powerful gk110 to convince as many people as possible to wait. They are trying to buy some time so that they can resolve supply issues without losing customers. I just wonder why they would hold off a gk110 GPU that could literally kill AMD? Some say that there is no point... gk106 is powerful enough. But that's BS. The GTX680 is great, but it is nothing more than what the gtx580 was compared to the 6970 and what the gtx480 was compared to the 5970... What I am trying to say is that it would be stupid not to use your advantages and let competitors catch up, unless your competitors are 2 or 3 generations behind, like AMD is to Intel, but that's not the case in the AMD-Nvidia head-to-head race.[/citation]

That may not be entirely true, with rumors suggesting the 680 isn't the original Kepler flagship that was supposed to be released. The current 680 is based on GK104, when originally it was supposed to be based on GK110.

If that turns out to be true, Nvidia essentially sold a cheaper mid-range chip at a high-end price point, which means it may carry a far greater markup and greater profit.

However, the mobile industry they've entered may be costing them a great deal of money. With Apple not using Nvidia's mobile chips, and several manufacturers going with their own silicon (Samsung's Exynos) or with Qualcomm's Snapdragon, they just might not have the sales to make this market profitable. That'd be my guess.

Their Quadro and GTX cards, I'd imagine, sell quite well, as the performance and drivers tend to knock ATI/AMD back quite a bit. I think AMD got lucky with their 7900 series cards; they never get to ride the top spot nearly as long as they did with this series.
 
[citation]I wonder how many people they will lay off over this. That is usually what companies do; they consider it a loss if their profits don't grow every quarter. It's kind of ridiculous and often used as an excuse to lay employees off.[/citation]

In some cases that is very true. Most of the time, you work as a contractor on a yearly basis, and then once you're no longer needed, or they can downsize and still maintain what you've set up, they lay you off. That is what happened to me when I worked at Apple's corporate office a few years back.

Really sucks when it happens...
 


That is a spam post that you replied to...
 


Sorry, one of those terrific blonde moments. After you pointed it out I was like "Whoops, my bad."
 


The profit on the higher-end cards is fairly minimal, but the mid-range cards tend to have greater profit margins, so even at the lower price they should be able to generate revenue. I doubt the desktop platform is the cause of the lost profit, though. I'd say it's the mobile platform.

Tegra isn't in very many Android devices, and the devices it is in aren't from the top companies. Motorola and HTC tend to lean toward Qualcomm, Samsung is trying to push its own Exynos, and Apple uses its own ARM Cortex-based chips. Those companies tend to have the highest volume and sales.

I'll say this, though: Tegra is a wonderful unit, and with that article released the other day about Kepler, it'll be interesting to see how far Nvidia can push it. They are essentially adding something like Intel's Hyper-Threading to the graphics processing unit, which should open up the supercomputer market and more mobile wins, even if it is simply licensing a patent for the technology. It's way more efficient than Fermi, which only handles one workload at a time, whereas Kepler essentially distributes its workload across multiple threads, or a single one, as needed.

Intel and Nvidia's partnership is starting to shine through.
 