News: Nvidia earned nearly as much as its next 9 fabless rivals combined last year

Yet they still gave us the RTX 5000 series. You'd think the last two years of excellent sales would have spurred a bigger push in R&D. Here's hoping the 6000 series takes advantage of this massive capital and we can have a repeat of the 3000 series - huge jump in performance and a drop in price compared to the 2000 series.
 
> Yet they still gave us the RTX 5000 series. You'd think the last two years of excellent sales would have spurred a bigger push in R&D. Here's hoping the 6000 series takes advantage of this massive capital and we can have a repeat of the 3000 series - huge jump in performance and a drop in price compared to the 2000 series.
I truly think their cards will only ever be priced against the competition, and since they have none, prices will stay at the moon unless either AMD or Intel can begin to vie for the high end.
 
> Yet they still gave us the RTX 5000 series. You'd think the last two years of excellent sales would have spurred a bigger push in R&D. Here's hoping the 6000 series takes advantage of this massive capital and we can have a repeat of the 3000 series - huge jump in performance and a drop in price compared to the 2000 series.
R&D??? That's not the problem; it's QA/QC and allocating supply to markets other than AI. And nVidia wasn't being generous in "giving us the RTX 5000 series." It's possibly the worst gaming dGPU launch nVidia has ever had, and if anything, AMD has closed the gap... even while "only going mid-tier" this generation.

There's nothing to learn -- it's always going to follow the money trail. nVidia wouldn't have the highest market cap in tech if it were some kind of "righteous" tech or gaming company; they maximized their potential as a core dGPU company and built a foundation with CUDA and the professional dev and enterprise community, so here we find ourselves today.

Sorry, bud, and no offense to nVidia fans, but I think a lot of mindsets here are far removed from practical reality.
 
> Yet they still gave us the RTX 5000 series. You'd think the last two years of excellent sales would have spurred a bigger push in R&D. Here's hoping the 6000 series takes advantage of this massive capital and we can have a repeat of the 3000 series - huge jump in performance and a drop in price compared to the 2000 series.
Process node. The 50 series is still on the same node as the 40 series. Cheaper? TSMC keeps raising the prices of its mature nodes. 5nm isn't even their most bleeding-edge node, and we still see price increases on it.
 
> Yet they still gave us the RTX 5000 series. You'd think the last two years of excellent sales would have spurred a bigger push in R&D. Here's hoping the 6000 series takes advantage of this massive capital and we can have a repeat of the 3000 series - huge jump in performance and a drop in price compared to the 2000 series.
We might be reaching a point where this type of rendering architecture just can't provide massive gains anymore without ridiculous power consumption; it might take some kind of revolutionary product.
 
> We might be reaching a point where this type of rendering architecture just can't provide massive gains anymore without ridiculous power consumption; it might take some kind of revolutionary product.
They are trying to do it via AI and neural rendering technologies. The technology is still getting its legs under it. DLSS is just the start.

It doesn't help that many people are using ridiculously high-res 4K+ displays, which is really more than you need for gaming. So it's no wonder Nvidia and AMD don't want to directly compute all of those pixels - it's not necessary and not an efficient use of silicon.
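To put rough numbers on that, here's a back-of-the-envelope sketch in Python (assuming an internal render at half the output resolution per axis, as DLSS does in performance mode):

```python
# Back-of-the-envelope: how many pixels per frame upscaling saves.
# Assumes an internal render at half the output resolution per axis
# (e.g. 1920x1080 internal for a 3840x2160 output), as in DLSS
# performance mode.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(width: int, height: int) -> int:
    """Total pixels in one frame."""
    return width * height

for name, (w, h) in RESOLUTIONS.items():
    native = pixels(w, h)              # pixels the display shows
    internal = pixels(w // 2, h // 2)  # pixels the GPU actually shades
    print(f"{name}: {native:,} native px, "
          f"{internal:,} rendered px ({internal / native:.0%})")
```

At 4K output the GPU only shades about a quarter of the pixels it displays; the upscaler reconstructs the rest, which is exactly the silicon saving being chased here.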
 
> They are trying to do it via AI and neural rendering technologies. The technology is still getting its legs under it. DLSS is just the start.
>
> It doesn't help that many people are using ridiculously high-res 4K+ displays, which is really more than you need for gaming. So it's no wonder Nvidia and AMD don't want to directly compute all of those pixels - it's not necessary and not an efficient use of silicon.
Not good enough. I want a QUANTUM COMPUTING-based GPU. 😉