Nvidia Lowers Q4 Revenue Guidance, Stock Drops

Brian_R170

Honorable
First, Intel announces lower-than-expected Q1 guidance, now Nvidia lowers Q4 guidance (which overlaps with Intel and AMD Q1). This doesn't bode well for AMD's Q1 guidance tomorrow.
 

redgarl

Distinguished


Yeah, you are right, they are not going to release a revolutionary platform on both CPU and GPU this year... >:/

Q1 is a given; it is not going to be anything special. But the full-year guidance is not going to be bad, unlike Intel's and Nvidia's.

 

redgarl

Distinguished
That letter from Jensen... what a damage-control spin. Basically, Nvidia only sells GPUs... and that is the reason why their revenue tanked: because Turing tanked.
 

InvalidError

Titan
Moderator

Intel missed targets because 10nm is on its way to being three years late to market and Intel can't keep up with orders anymore, which is good news for AMD, which doesn't have that problem. GPU-wise, it is going to be a bad couple of quarters for all GPU designers thanks to the crypto-crash making used 1070/1070 Ti cards go for $150-200.
 

jimmysmitty

Polypheme
Moderator


Revolutionary? Nah. There won't be a revolutionary CPU for a while.

And there is no real info on Navi yet to make any sort of judgment call. While the pricing was bad, I would say Turing is revolutionary, as it's the first GPU with hardware-based ray tracing.



But they didn't. The GTX 1080 FE was $699. The RTX 2080 was $799. $100 more. Retailers price-gouged like they always do, causing it to cost more than it should.
 

redgarl

Distinguished


Still drinking the Nvidia Kool-Aid? You probably wrote that review then...

https://wccftech.com/review/msi-geforce-rtx-2080-ti-lightning-z-11-gb-graphics-card-review-the-card-that-goes-shazam/

Zen 2 is revolutionary because nobody ever did something similar before. The uarch is totally unique. As for Navi, the goal is creating a GPU able to render 4K@60Hz for the consoles. We know that the PS5 is going to be around $500-600, giving us a good idea of what Navi is going to cost, which is at most half the console's price.

Turing is not a revolution, it is an abomination. A typical price-gouging scam that Nvidia is particularly fond of. The link I posted shows a review of a $1,600 2080 Ti scoring a 10/10 for value. There is no excuse for Turing; it is a rip-off, and customers agreed by not buying it.

RT and DLSS are nothing but GameWorks options that are hardware-accelerated. Real ray tracing apps still run on raw compute power and can be achieved with any GPU. The level of ray tracing is so gimped to make games able to run it that the effect on the rendering is barely noticeable.

3D artists are developing games for consoles. As long as AMD holds that market, Nvidia will not be able to get RT or DLSS adopted.
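The point that ray tracing is ultimately just math that any compute device can run can be sketched in a few lines. Everything below (function name, scene values) is illustrative, not from any real renderer; it is the core ray/sphere intersection test that a software ray tracer evaluates per ray, on any CPU or GPU:

```python
# Minimal sketch: ray tracing as plain arithmetic, no special hardware needed.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to nearest hit along the ray, or None on a miss."""
    # Vector from sphere center to ray origin
    oc = [o - c for o, c in zip(origin, center)]
    # Quadratic coefficients of |origin + t*direction - center|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired down -z hits a unit sphere centered 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # prints 4.0
```

Hardware RT cores accelerate exactly this kind of intersection test (against large acceleration structures); the math itself runs anywhere.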
 
It seems like more people are cheering Nvidia's losses than their new Turing tech. I think we can say that, so far, the 2000 series is a bust. The pricing is way more than people expected, and the hyped tech is too costly for the little it adds to a game's appearance, if there were games that used it, which there really aren't. Just one was modified to support Turing, and it showed that the tech is not ready and adds little to nothing, so it is just not needed. Yeah, "just buy it" doesn't seem so funny these days.
 

InvalidError

Titan
Moderator

Nothing really unique or new; most of Zen 2's core architecture is still the same as Zen/Zen+ with a few tweaks. The main difference is spinning IO/RAM off into its own separate die, which is a lot like how it used to be 10+ years ago, when the memory controller and VESA/AGP/PCIe bus were fed from the north bridge on the motherboard. Except this time the "north bridge" sits on the CPU substrate, to keep CPU-to-IO/RAM latency on the multi-die setup as low as possible and provide much higher bandwidth, minimizing the performance penalties of going back to segregated functions and multi-die CPUs. When CPUs integrated memory controllers and eliminated the worst FSB bottleneck, that provided a 40-60% performance increase from halving total memory latency.

Never forget that, at the end of the day, Zen 2 is a cost-optimization exercise, not a pursuit of absolute highest performance. For mainstream, AMD is sacrificing the lower latency of a monolithic die for the lower total manufacturing cost, higher yield, and process flexibility of MCM. TR3/EPYC2, however, should gain performance and ease of programming from eliminating NUMA concerns on 1S systems, on top of the previously mentioned cost-cutting.
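The memory-latency argument above can be put in back-of-the-envelope terms with the classic average-memory-access-time (AMAT) model. All numbers below are illustrative assumptions, not measurements from any of the chips discussed:

```python
# Sketch: how cutting DRAM latency (e.g. by moving the memory controller
# closer to the cores) moves average memory access time (AMAT).
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Classic model: AMAT = cache hit time + miss rate * miss penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Hypothetical figures: 2 ns cache hit, 5% miss rate, DRAM penalty halved.
behind_bridge = amat(2.0, 0.05, 120.0)  # memory behind a north-bridge hop
integrated = amat(2.0, 0.05, 60.0)      # on-die memory controller

print(f"North-bridge AMAT: {behind_bridge:.1f} ns")  # 8.0 ns
print(f"Integrated AMAT:   {integrated:.1f} ns")     # 5.0 ns
```

With these made-up inputs, halving the miss penalty cuts average access time by more than a third, which is the shape of the gain the post attributes to integrating the memory controller.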
 

InvalidError

Titan
Moderator

Short of going from conventional CMOS to spintronics/quantum/whatever, I wouldn't expect anything I would call a "revolution" any time soon either. Whatever the next big general-purpose compute thing might be, if it ever happens, chances are it will ultimately follow the same general architectural principles as modern RISC. An optical CPU would still need instructions, possibly instruction decoding, an instruction cache, instruction scheduling, data load/store, a data cache, CPU registers, ALUs, clocks of some sort to coordinate data movements, etc. Even quantum computers will likely need to follow similar patterns to run the sequential stuff that all non-quantum algorithms are made of.
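The fetch/decode/execute pattern described above can be sketched as a toy interpreter. The ISA here is entirely hypothetical (three made-up instructions, two registers), purely to illustrate the loop every stored-program machine shares:

```python
# Toy sketch of the fetch/decode/execute cycle: program counter,
# instruction decode, registers, and an ALU operation.
def run(program):
    regs = {"r0": 0, "r1": 0}   # CPU registers
    pc = 0                      # program counter
    while pc < len(program):
        op, *args = program[pc]             # fetch + decode
        if op == "load":                    # load immediate into a register
            regs[args[0]] = args[1]
        elif op == "add":                   # ALU: r_dst += r_src
            regs[args[0]] += regs[args[1]]
        elif op == "halt":
            break
        pc += 1                             # sequential control flow
    return regs

# r0 = 2; r1 = 3; r0 += r1  ->  r0 ends up as 5
print(run([("load", "r0", 2), ("load", "r1", 3), ("add", "r0", "r1"), ("halt",)]))
```

Swap the substrate (optical, whatever) and the loop stays; that is the point being made about architectural continuity.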
 

TJ Hooker

Illustrious
Herald

First off, Sony is just buying APU dies from AMD, not full discrete graphics cards, so I don't think the prices are directly comparable (you can't even really separate the cost of the GPU and CPU for consoles). More importantly, they could easily design the console GPU to have a lower CU count compared to the full fledged desktop card to lower the cost for consoles.

As far as the next-gen consoles doing consistent (real) 4K/60 FPS, without any major shortcuts or sacrifices to visual fidelity compared to normal PC 4K gaming, I'll believe it when I see it. IIRC the PS4 Pro and Xbox One X were both hyped as 4K gaming machines, but ended up using various tricks and sacrifices to get pseudo-4K, like checkerboard rendering and upscaling and whatnot.

I'm not saying that half the price of a PS5 (so $200-$300 based on the numbers you gave) isn't a realistic guess for Navi, but that's more because that's about the price range of mainstream graphics cards. Which is what Navi is purportedly targeting IIRC.
 
Looks like Trump's trickle-down economics are really benefiting everyone... LOL. Companies are doing terribly this quarter because of poor economic fundamentals from the orange turd in charge.
 

mellis

Distinguished
Their ray tracing tech is a joke after being benchmarked on a high-end RTX 2080. With this failure, I think they are now going to come out with GPUs minus the components necessary for ray tracing, to keep AMD at bay with their upcoming GPUs.
 
Adding ray tracing is like adding sesame seeds to my lunch. It's nice to have but not essential. Therefore you CANNOT charge for a whole new lunch for this add-on!!!
 

bigdragon

Distinguished
I think Nvidia is in for a world of hurt this year. Same thing is coming to AMD's graphics division. Graphics card prices are outrageous right now. The RTX cards are the worst. Nvidia and AMD seem to think they're competing against each other. They don't seem to realize that they're also competing with the game consoles! Why buy a super-expensive graphics component when you can buy an entire gaming system for half as much if not cheaper?

Nvidia's problem is that graphics-pushing games are mostly coming from the AAA industry. Most of the games are multi-platform, anti-modding, anti-user content, and tied to online services. Worse, most games are ports to the PC with poor optimization and limited PC-centric features (Anthem seriously doesn't have an FOV slider in 2019, among many other PC version defects). Most of the advantages PC gaming had over consoles are gone from mainstream gaming.

Nvidia needs to take a look outside its bubble and realize they've priced themselves out of the market. I prefer to game on PC, but I'd buy a console before I'd purchase an RTX card.
 

InvalidError

Titan
Moderator

The last couple of times I have seen AMD and Nvidia market segment breakdowns, big data, high performance computing, artificial intelligence, etc. were by far the largest growth markets for GPU sales after crypto. I bet part of Nvidia's pricing at the high-end (aside from having an effective monopoly) is due to the opportunity cost of offering a consumer derivative of their $3000-10000 Quadro GPUs.
 

marcelo_vidal

Prominent
Still using two Westmere-EP L5630s and 32 GB of ECC RAM ($10 for the CPUs + $30 for the RAM, and $101 for a new Supermicro motherboard for the system), and that's what people pay for just the RAM... lol. Why would I pay $1,000 for one graphics card??? People are nuts these days!
 

kinggremlin

Distinguished


The 1080 Ti was a tremendously popular card for Nvidia. The only performance upgrade for all those users was the 2080 Ti, which pushed the price from $700+ to $1200+. $1200 is way out of the price range for mainstream enthusiasts. Nvidia needed to keep the price of the 2080 Ti under $1000 if they wanted any real sales volume. Turing really needs a die shrink; the chips are too large. I would expect Nvidia to do much better on pricing for their next release on 7nm and legitimately bring ray tracing to the mainstream.

As for the 2080, the majority of people looking for a $700-800 card with 1080 Ti-level performance had already purchased a 1080 Ti. So the upgrade market for that card was pretty limited as well.
 

TJ Hooker

Illustrious
Herald

GTX 1080 dropped to $599 when the 1080 Ti came out (for $699). Given that the 2080 Ti is already out, the MSRP difference from 1080 to 2080 is effectively $200. Still, far from double, although that was obviously hyperbole.
 
How long (if ever) will it take Nvidia to realize they priced the majority of the gaming community out of affording, or even wanting, the 2000 series? With Tom's "just buy it" feedback articles patting them on the back, probably never. That is why having a "yes man" around is so bad: sometimes you need to hear the truth to keep you from making really bad decisions. Case in point: the whole Nvidia 2000 series, and the "Just Buy It" article from Tom's. Two very bad decisions that have cost both companies a lot. For Nvidia, profit and possibly market share. And for Tom's, trust and respect.
 
