News: Nvidia Announces GeForce RTX 3050, RTX 3090 Ti, and Laptop 3070 Ti and 3080 Ti

avenge

Distinguished
Jun 2, 2014
65
0
18,640
So again... a low-end GPU with 8 GB of RAM? Who needs that on an RTX 3050? Maybe it will run games at 2K resolution or higher on max settings? And what happened to the 16 GB 3070 Ti?
 
So again... a low-end GPU with 8 GB of RAM? Who needs that on an RTX 3050? Maybe it will run games at 2K resolution or higher on max settings? And what happened to the 16 GB 3070 Ti?
The last rumor/leak was that it was "delayed" a few months, probably due to a lack of sufficient 16Gb GDDR6X modules, which are likely going into the 3090 Ti first.

As for the 8GB, it's better than 4GB. Granted, a GPU of this level won't strictly need 8GB most of the time, but I'd much rather have 8GB than 4GB at this stage. Even if I were usually playing at settings that didn't exceed 4GB, just having the headroom to push a bit higher is nice. Then again, most of the time I have an RTX 3090 in my system. :D
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
It's either that or 4GB, and given that the RTX 3050 should be better than the RTX 2060 judging by the specs, I'd rather have the 8GB.
8GB is what makes it a viable target for Ethereum miners; the Ethereum DAG outgrew 4GB of VRAM long ago, so a 4GB card can't mine it. 4GB would mean you're only competing with other gamers for the cards. If you're stuck paying $350 for a "low-end" GPU, would you rather it be the current 1650 or a 4GB RTX 3050? The price is still jacked up, but not having to compete with miners' bottomless demand means you may actually be able to buy a 4GB card without paying a 100%+ markup over MSRP.
 
8GB is what makes it a viable target for Ethereum miners; the Ethereum DAG outgrew 4GB of VRAM long ago, so a 4GB card can't mine it. 4GB would mean you're only competing with other gamers for the cards. If you're stuck paying $350 for a "low-end" GPU, would you rather it be the current 1650 or a 4GB RTX 3050? The price is still jacked up, but not having to compete with miners' bottomless demand means you may actually be able to buy a 4GB card without paying a 100%+ markup over MSRP.
Shipping it with 4GB is going to limit its useful longevity, and your concern only applies to the DIY market. System builders won't have this issue.

EDIT: Also, the card's going to get scalped and price-jacked regardless. Even the 1650 is going for nearly double its original MSRP.
 
Last edited:

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Shipping it with 4GB is going to limit its useful longevity, and your concern only applies to the DIY market. System builders won't have this issue.
Why choose one or the other? Why aren't two different memory configurations an option? I wouldn't recommend anyone buy for useful longevity in this market. Either buy what you want, or buy the cheapest option you can use as a stopgap. If PC gaming wants a prosperous future, GPU prices will have to normalize within the next year or two. $300 dGPUs that are barely better than an IGP, a $1,000 midrange, and a $2,000+ high end with off-hours basement mining to offset the cost is not sustainable long term.
 

InvalidError

Titan
Moderator
So again... a low-end GPU with 8 GB of RAM? Who needs that on an RTX 3050?
Firefox and Chrome alone can gobble up around 2GB of VRAM each if you leave GPU acceleration turned on. I had to force browsers to run on the IGP to stop them from hogging my GTX1050's memory and causing artifacts while gaming. For people who like using browsers with GPU acceleration enabled while gaming, 4GB would be an awfully close shave. I'd say 6GB is the bare minimum for comfort.
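If you want to see the effect for yourself, below is a quick way to watch total VRAM usage while you open and close browser tabs. This is just a rough sketch on my part: it assumes an NVIDIA card with nvidia-smi on the PATH, it reports whole-GPU usage rather than per-process, and the 5-second poll interval is arbitrary.

```python
import subprocess
import time

def vram_usage_mib():
    """Return (used, total) VRAM in MiB for the first GPU, via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().splitlines()[0].split(","))
    return used, total

# Poll every 5 seconds; open/close browser tabs and watch the delta.
while True:
    used, total = vram_usage_mib()
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
    time.sleep(5)
```

On a 4GB card you'll see how little headroom is left once a browser grabs its share.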
 
  • Like
Reactions: JarredWaltonGPU
Why choose one or the other? Why aren't two different memory configurations an option?
Because this creates a problem with production and essentially doubles the complexity of the supply chain. AMD and NVIDIA seem to be simplifying their VRAM chip orders to 2GB chips only. Adding 1GB chips into the mix will cut into the 2GB chip supply in some form or fashion. Not to mention if the lower configuration doesn't pan out for some reason, you're now left with stock that you can't use. Sure you could nerf the 2GB chips to 1GB, but... why not just use 2GB? And while I can't say this is a thing for memory, I'm sure the yield rate for 2GB chips is high enough now that there's no point in trying to make 1GB chips out of the defects.
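To put rough numbers on why it's 4GB or 8GB and nothing in between: GDDR6 chips use a 32-bit interface, so chip count is just bus width divided by 32, and capacity is chip count times density. A quick sketch, assuming the 128-bit bus from the 3050 leaks (not confirmed):

```python
# VRAM capacity from bus width and per-chip density.
# GDDR6 chips have a 32-bit interface, so chip count = bus width / 32.
def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gbit // 8  # 8 Gbit = 1 GB

# Rumored RTX 3050: 128-bit bus -> 4 chips.
for density in (8, 16):  # 8Gb (1GB) and 16Gb (2GB) chips
    print(f"{density}Gb chips: {vram_capacity_gb(128, density)} GB total")
# 8Gb chips: 4 GB total
# 16Gb chips: 8 GB total
```

Anything in between, like 6GB, would need a different bus width or mixed densities, which is its own can of worms.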

I wouldn't recommend anyone buy for useful longevity in this market.
And some people only have so much money to spend and want something that will last at least 4-5 years.

But again, your conjecture that selling a 4GB card will give the 3050 a better chance in the DIY market because it won't be price-jacked by miners doesn't look promising when 1650s are going for way more than their MSRP.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Because this creates a problem with production and essentially doubles the complexity of the supply chain. AMD and NVIDIA seem to be simplifying their VRAM chip orders to 2GB chips only. Adding 1GB chips into the mix will cut into the 2GB chip supply in some form or fashion. Not to mention if the lower configuration doesn't pan out for some reason, you're now left with stock that you can't use. Sure you could nerf the 2GB chips to 1GB, but... why not just use 2GB? And while I can't say this is a thing for memory, I'm sure the yield rate for 2GB chips is high enough now that there's no point in trying to make 1GB chips out of the defects.

It is common practice to have OEM-exclusive models. Make the 8GB model OEM-only and the 4GB model AIB. I don't really want to hear about supply chain complexity. Look how many 3000-series models Nvidia has released since launch, then look at the stupid number of variants the major AIB partners sell for each model. Supply chain complexity is not an obstacle to two memory configurations for the same GPU.

And some people only have so much money to spend and want something that will last at least 4-5 years.

But again, your conjecture that selling a 4GB card will give the 3050 a better chance in the DIY market because it won't be price-jacked by miners doesn't look promising when 1650s are going for way more than their MSRP.
I know what buying for longevity means. What I am saying is that it is stupid to do that in the current market. Paying an extra $300-500 over MSRP for a couple extra years of use is dumb. Buy the cheapest card you can use to minimize the money you are throwing away, and assume that in 18-24 months the money you saved by not going higher end will buy a much better card if prices creep back towards more sane levels. Nothing is going to sell for MSRP now, but the lower in the chain you go, the less over MSRP you will pay.
 
It is common practice to have OEM-exclusive models. Make the 8GB model OEM-only and the 4GB model AIB. I don't really want to hear about supply chain complexity. Look how many 3000-series models Nvidia has released since launch, then look at the stupid number of variants the major AIB partners sell for each model. Supply chain complexity is not an obstacle to two memory configurations for the same GPU.
Most of those come after the commercially available video cards have had their run, and not just out of NVIDIA's butt. They come from enough companies committing to shipping a product if NVIDIA delivers.

I know what buying for longevity means. What I am saying is that it is stupid to do that in the current market. Paying an extra $300-500 over MSRP for a couple extra years of use is dumb. Buy the cheapest card you can use to minimize the money you are throwing away, and assume that in 18-24 months the money you saved by not going higher end will buy a much better card if prices creep back towards more sane levels. Nothing is going to sell for MSRP now, but the lower in the chain you go, the less over MSRP you will pay.
Or you can just not spend anything and wait 18-24 months. The advice you're giving only seems useful if your video card blew up and buying a new system outright isn't viable.

But again, only the DIY market has this problem. System builders don't.
 

InvalidError

Titan
Moderator
I know what buying for longevity means. What I am saying is that it is stupid to do that in the current market. Paying an extra $300-500 over MSRP for a couple extra years of use is dumb.
I doubt the price difference between a 4GB and 8GB RTX3050 would be $300+. If you are already going to be paying a $250 street-price premium over what the GPU should be worth for a 4GB model, stepping up to $300 over baseline for an 8GB model is fair enough: only $50 extra to most likely never worry about running out of VRAM for the card's entire useful life.

The 4GB RX5500 has already demonstrated that 4GB is getting uncomfortably tight.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
I doubt the price difference between a 4GB and 8GB RTX3050 would be $300+.
I would think not as well. I meant going from a 3060 Ti to a 3070, or a 3070 to a 3080. In a normal market, bumping from $450 for a 3060 Ti to $550 for a 3070 for additional longevity makes sense. In the current market, where a 3060 Ti sells for $850 and 3070s are selling for $1300, you're looking at a $450 gap rather than a $100 gap. I would not recommend spending an additional $350 over MSRP in an attempt to increase the longevity of your GPU purchase.
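Spelled out with the numbers above (MSRPs as I quoted them, street prices obviously a moving target):

```python
# Upgrade premium at MSRP vs. at current street prices (3060 Ti -> 3070),
# using the figures quoted above.
msrp = {"3060 Ti": 450, "3070": 550}
street = {"3060 Ti": 850, "3070": 1300}

msrp_gap = msrp["3070"] - msrp["3060 Ti"]        # $100
street_gap = street["3070"] - street["3060 Ti"]  # $450
extra = street_gap - msrp_gap                    # $350

print(f"MSRP gap: ${msrp_gap}, street gap: ${street_gap}, "
      f"extra over the normal premium: ${extra}")
```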
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Or you can just not spend anything and wait 18-24 months. The advice you're giving only seems useful if your video card blew up and buying a new system outright isn't viable.
For most people right now, your first option is what's going to happen: just wait out the market. For most gamers, $700 for a PS5 from a scalper makes the most financial sense right now.
 
The 3050 is most likely going to perform like a 1660 Ti/GTX 1070 instead of an RTX 2060.
I dunno, the 3050 has 66% more shaders and basically the same clocks as the 1660 Ti. If the 3050 performs the same as the 1660 Ti, that's a major WTF NVIDIA moment. The only area where it might be weaker is fill-rate performance, but I don't think that's much of a concern.
 
I dunno, the 3050 has 66% more shaders and basically the same clocks as the 1660 Ti. If the 3050 performs the same as the 1660 Ti, that's a major WTF NVIDIA moment. The only area where it might be weaker is fill-rate performance, but I don't think that's much of a concern.

That's just my rough speculation from comparing the 3050's specs against the 3060's. The 3050 will have 70% of the 3060's CUDA core count. TPU has a relative performance chart based on percentages, and at the 70% mark that lands around the GTX 1070/GTX 1660 Ti. Nvidia is probably saving a 3050 Ti with ~3000 CUDA cores for later; that one, I believe, will match or beat the 2060.
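For reference, here's the core-count math both of us are working from. The 1660 Ti/3050/3060 counts are from the official specs; the 3050 Ti figure is rumor only, and Ampere counts FP32 units differently than Turing, so cross-generation shader comparisons are fuzzy to begin with:

```python
# CUDA core counts: official for the 1660 Ti/3050/3060,
# rumored (~3072) for the unannounced 3050 Ti.
cores = {
    "GTX 1660 Ti": 1536,
    "RTX 3050": 2560,
    "RTX 3060": 3584,
    "RTX 3050 Ti (rumored)": 3072,
}

print(f"3050 vs 1660 Ti: {cores['RTX 3050'] / cores['GTX 1660 Ti']:.0%}")  # ~167%
print(f"3050 vs 3060:    {cores['RTX 3050'] / cores['RTX 3060']:.0%}")     # ~71%
```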