Nvidia Tegra 3 Under Pressure After Tegra 2 Concerns

[citation][nom]dark_knight33[/nom]UBS's who exactly? Just because *some* unknown analyst sitting behind a desk playing paper football makes a random prediction[/citation]
http://www.ubs.com/1/e/about/ourprofile.html
After you visit the link you can remove your foot from your mouth. UBS doesn't make random predictions and is a very well-known name. Since you didn't seem to understand what was said in the article, here's the short translation: Nvidia's revenue forecast was lowered due to the high number of tablets already in the market. A basic understanding of supply and demand is all that's required to see why that would lower their revenue outlook.

This article didn't say anything negative about the Tegra 3 chip or the Asus Transformer Prime that is to be released in a few days. It's purely a business-side note to help investors decide how to invest their money.
 
[citation][nom]getreal[/nom]If you are really so desperate to make stupid comments in hopes to gain favor, try badmouthing Apple; you will have much more success.[/citation]
It can run the UT3 engine for sure.
An Asus Transformer with Tegra 3 will replace my current Asus Eee 1204N PC. But it would be nice to run Ubuntu on it.
 



It's such a strange thing to use as a metric. It's a manipulation for your agenda. I could just as easily say AMD uses many more shaders than Nvidia, therefore "that's not really a good sign of efficiency" as you put it, or "the cards are quite powerful but they're not exactly well rounded." They use 800 stream processors (5770) to match a mere 192 from Nvidia (550 Ti). I could go on and on manipulating numbers for an agenda, but it's useless beyond that. Why compare two radically different architectures this way? Your calling 384-bit a drawback is absolutely absurd. It's nothing to complain about; it's all good! Compare the value, and although your overall opinion of Nvidia isn't high, they are doing much better than their competitor in the dGPU market they currently sell in.

And to be clear, per individual sale Nvidia makes a huge profit on their Tesla products, but it's a much lower-volume item. They make twice the revenue from their GeForce line; that's their current bread and butter, and they are doing very, very well there. They make much more money than AMD's GPU division. Why is this? Does AMD sell their cards at a much lower profit? It's clear Nvidia has higher margins; how do they manage that? You may not like their prices, but they are extremely competitive. AMD lately is barely making money in their GPU division, and that's just a sad fact.

As far as lagging in the overall market? Nvidia is no longer in the chipset business and therefore has no IGP. In the overall market Intel is number one in graphics because of their IGPs, but that doesn't mean much of anything. Nvidia no longer sells IGPs; they only sell discrete (add-in) GPUs. In that market, Nvidia has actually gained share this quarter. They are number one in the segment they actually compete in.

So to sum it up: no matter what your opinion is, or how much you dislike Nvidia, they are outselling AMD while making more profit on each sale. Your distaste means little at the end of the day. With the facts at hand, it appears to me that the negative "opinion" you're force-feeding comes from a personal distaste based on emotion; it's unfounded in reality. You are entitled to your own opinion and that's fine, I don't care to change it. I am only here to make clear what is real and what's hogwash!
 
[citation][nom]getreal[/nom]If you are really so desperate to make stupid comments in hopes to gain favor, try badmouthing Apple; you will have much more success.[/citation]
Really? Let's try this then:
Apple really s*cks!!! Tegra3 rocks!!1!!one... There's no way apple can beat this tegra3 sh*t!!!
 
Nvidia's knight in shining armour is Windows 8 running on ARM processors. Couple that with an NVgaming zone as a distribution platform, and they could start making their own version of desktop PCs.
It'll come down to which is cheaper to manufacture.
It's not gonna be long until mobile/tablet GPUs surpass current-gen console hardware, and if Tegra devices get the same level of optimisation that consoles did, you could get successful ports with awesome performance. Nvidia saw the squeeze from AMD and Intel and has responded. The only problem is that mobile tech tends to be totally integrated, meaning it has to be replaced, in contrast to desktops, which can be upgraded as new tech comes along. If that happens they move into low-margin PC/laptop devices and possibly away from their core GPU business. I just can't see how Tegra relates back to GeForce/Quadro, beyond their GPU tech having helped them design Tegra. Unless it allows them to integrate ARM into GeForce, but that is no different from using an Atom to power a GTX 580.
 

i read at's tegra 3 article before replying. it's a good read.
i manipulated for my agenda, believe it or not, and so did you. everyone does.
looks like you didn't understand why i used memory bandwidth in my earlier comment, so i will elaborate on that...again. comparing memory bandwidth helps me compare two different products - ranging from two gfx cards at the same price point, to a gfx card vs the pcie bus or cpu memory bandwidth - strictly based on bandwidth numbers, only for the sake of comparison. a lot of architectural dissimilarities exist among computer components, but memory bandwidth is one of the few things they have in common, like gflops or gt/s measures. i compare because it's easy for me this way. and nvidia and amd do not have radically different architectures - they basically do the same things in slightly different ways, so i needed some way to compare them.
as i mentioned earlier, i do not compare shaders among gfx cards, similarly i don't compare rops, core clocks etc because amd and nvidia implement them in different ways and define them differently (afaik).
never called using a 384-bit memory bus width a drawback. i said that since amd performs comparably using a smaller bus width, maybe nvidia's is less efficient.
for example (only for example): a radeon hd 6990 has 2x 256-bit buses and 2x 160 GB/s bandwidth; a gtx 590 has 2x 384-bit buses and 2x 163.87 GB/s. iirc the 590 uses more power than the 6990 on load. there are other factors in play, like how nvidia adjusts their cards' performance by tweaking different clock rates and other components, but in the end the much higher bus width doesn't really translate into much better performance or power consumption.
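to show why the wider bus doesn't automatically mean more bandwidth: peak theoretical bandwidth is just bus width times effective memory clock, so a narrower bus with faster gddr5 can match a wider, slower one. a minimal sketch (the effective clock figures are approximate, back-calculated from the quoted bandwidths):

```python
def bandwidth_gbs(effective_clock_gtps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    bandwidth = effective clock (GT/s) * bus width (bits) / 8 bits-per-byte
    """
    return effective_clock_gtps * bus_width_bits / 8

# Per-GPU figures (approximate effective GDDR5 data rates):
# HD 6990: 256-bit bus at 5.0 GT/s  -> 160.0 GB/s
# GTX 590: 384-bit bus at ~3.414 GT/s -> ~163.9 GB/s
hd6990_per_gpu = bandwidth_gbs(5.0, 256)
gtx590_per_gpu = bandwidth_gbs(3.414, 384)
print(hd6990_per_gpu, round(gtx590_per_gpu, 2))
```

the 590's bus is 50% wider, but its memory runs at a much lower effective clock, so the two end up within a few GB/s of each other.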
take a look at this guru3d page. there, a 550 ti draws more power than an hd 6850, which is a better gfx card than the 550 ti, and the 6770 performs comparably to the 550 ti using even less power. i linked this as an example of nvidia's lower power efficiency at a certain level. but if you look at the fps figures you'll see that nvidia's cards deliver a lot of fps compared to amd's similar cards, and they have the fastest single-gpu card on the market today. despite that, amd's cards continue to deliver better value per frame.
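"value per frame" here just means price divided by average fps, so the lower the number, the better the deal. a quick sketch of the arithmetic - the prices and fps values below are made-up placeholders to show the calculation, not actual benchmark results:

```python
def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    """Price paid per frame of average performance; lower is better value."""
    return price_usd / avg_fps

# (price in USD, average fps in some benchmark) - illustrative numbers only
cards = {
    "card_a": (130.0, 40.0),
    "card_b": (150.0, 48.0),
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.3f} per frame")
```

with these placeholder numbers card_b costs more up front but is the better value per frame, which is the kind of comparison the guru3d fps charts let you make.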
i don't value nvidia any less than i value amd/ati. i've used both amd and nvidia products and am aware of their advantages and disadvantages. and tesla... profit margin... what? this article is about tegra 3 and how oems are being cautious about using it in their own products. i could only discuss so much without going way off topic.
tegra 3 (formerly kal-el) seems to be a better product on paper, but so far only asus has announced a shipping product based on the soc. nvidia's decisions not to grow the l2 cache and not to increase memory bandwidth (see how that comes into play?) might come back to bite nvidia in the ass.
you're quite wrong about my distaste for nvidia. i only criticise where i think criticism is warranted, based on verified facts.
 
[citation][nom]de5_roy[/nom]tesla... profit margin...what? [/citation]

GPUs.... memory bandwidth...What?

This article was about Tegra; you started off with all kinds of unrelated stuff, and I apologize for continuing it. Really, none of what you said is of any consequence. In case you don't already know, Nvidia again outperformed all analyst predictions. Quarter after quarter they are shining bright while the naysayers keep on with their grim wishes. Maybe you should rethink some of Nvidia's "negatives", because they sure know what they are doing. AMD, on the other hand, struggled to get any profit out of their GPU division. Nvidia is doing very well and making lots of money; imagine that. Whatever they are doing must be right, because they are doing very, very well in a not-so-great economy.

Your first post in this article was BS, and proven to be. The parts about how Nvidia screwed up with Fermi and how AMD now rules the most important, highest-profit segment? Bogus. Nvidia made tons more money than AMD in their GPU segments this quarter, which proves you dead wrong. Whatever beef you have with 384-bit means nothing; it's part of Nvidia's winning formula. Nvidia is not only competitive, they know how to make great cards and even greater profits.

Nvidia is in great shape.
 

lol, i guess you could take my comments calling nvidia's wider-bus discrete gfx cards less efficient (at providing the best performance for the money) than amd's out of context and call them unrelated. then again, you can do that with pretty much anything anyone says. in reality they were related to my last paragraph (well, since you seem quite capable of missing the point) about nvidia losing overall gfx market share to both intel and amd.
nvidia did beat the analysts' predictions and made lots of profit. i wonder if the magnitude of nvidia's success seemed amplified because the analysts lowered their forecasts in the first place 😛 . not that i am calling nvidia's achievements false (this is me trying to avoid another nvidia fanboy outcry :) ).
still, i wouldn't go so far as to say they're shining bright in the dark and whatnot. the unsold tegra 2 devices will most likely hurt tegra 3's adoption unless they clear out and make room for tegra 3. none of the non-apple tablet makers have been able to replicate the ipad's success so far. the samsung galaxy tab seems to be a worthy contender, but iirc it faced bans and didn't sell nearly as much as the ipad.
amd does rule the entry-level gfx segment - with their llano apus and 54xx, 55xx, 56xx, and the newer 64xx, 65xx, 66xx gpus. they have the 57xx and 67xx gpus in the lower mid-range. nvidia had the gt 220, gt 240 and gt 430 against amd's cards, but they don't have anything competitive (price/performance-wise) at the entry level from the 4xx or 5xx lines. besides, nvidia's lack of an x86 cpu accelerated their exit from the igp market (intel and amd don't have that weakness); maybe they can be competitive with arm when win 8 ships.
nvidia did screw up the fermi launch with low yields, high power consumption and loud noise problems. while it is a promising technology, its execution was, and imo still is, flawed. they turned that around with the 5xx cards; i hope they don't repeat it with kepler. i didn't see any facts, figures or credible reviews that prove my comments wrong - only enthusiastic, overzealous (slightly baseless concerning 384-bit cards being more efficient than amd's 256-bit ones 😀 ) replies. i believe it's good to be excited about something you love, just not so much that you become blind to facts. :)
neither nvidia nor amd are my best friends - they're giant corporations that sell products i buy and use. i am just posting comments and sharing my opinions in a forum. it doesn't mean much in the end, just me passing time.
 