News Doubling Down: RTX 3070 Modded with 16GB of VRAM

bigdragon

Distinguished
Oct 19, 2011
1,145
620
20,160
Very impressive work! This is even better than the last attempt to increase VRAM on a GPU. Nice to see this mod stable. It gives me hope that a 3070 16 GB will be released one day.

Personally, I'd like to see Nvidia end the 3080 and 3070 in favor of replacing them with Ti versions. I wonder if this is already going on with the 3080 given the total lack of stock drops for it for the past month at least. Having a new GPU release with a ton of day-one stock may give more of us the ability to purchase GPUs. The current stock trickles just get eaten up by bots immediately.
 
I feel like it's now a given NVIDIA is going to do half-generation refreshes, but guessing what those specs will be is the interesting part. For example, the 3070 is a fully realized GA104. So unless NVIDIA went down to 5nm or a highly refined 7nm to bump up clock speeds by a meaningful amount, throwing 16GB onto a 3070 and calling it a 3070 Ti would be a letdown.

The only other thing I can think of, though I can't see why they would do it, is a GA102/GA104 hybrid with 12 SMs per GPC like the GA102 rather than 8 SMs per GPC like the GA104, while keeping the same 6-GPC, 8-channel memory configuration.
 
I feel like it's now a given NVIDIA is going to do half-generation refreshes, but guessing what those specs will be is the interesting part. For example, the 3070 is a fully realized GA104. So unless NVIDIA went down to 5nm or a highly refined 7nm to bump up clock speeds by a meaningful amount, throwing 16GB onto a 3070 and calling it a 3070 Ti would be a letdown.

The only other thing I can think of, though I can't see why they would do it, is a GA102/GA104 hybrid with 12 SMs per GPC like the GA102 rather than 8 SMs per GPC like the GA104, while keeping the same 6-GPC, 8-channel memory configuration.
Not quite a full GA104 -- it's 46 SMs out of 48 potential SMs. Not a huge difference, but if Nvidia can get enough full GA104 chips, plus a clock speed bump, plus double the VRAM... Or like you said, maybe just do a GA102 with 56 SMs or something. I don't think Nvidia will do new Ampere GPUs, though -- they'll stick with the GA102/GA104/GA106 (and potentially GA107/GA108) until Hopper or whatever comes next. That will be 5nm most likely, or some Samsung equivalent.
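If anyone wants to sanity-check those SM counts against core counts, here's a minimal sketch, assuming the usual 128 FP32 CUDA cores per Ampere SM:

```python
# Quick check of SM-to-CUDA-core math, assuming 128 FP32 cores per Ampere SM.
CORES_PER_SM = 128

configs = {
    "RTX 3070 (46 of 48 GA104 SMs)": 46,
    "Full GA104 (48 SMs)": 48,
    "Full GA102 (84 SMs)": 84,
}

for name, sms in configs.items():
    print(f"{name}: {sms * CORES_PER_SM} CUDA cores")
```

That works out to 5888 CUDA cores for the 3070 versus 6144 for a fully enabled GA104, so the gap really is small.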
 

Giroro

Splendid
Very impressive work! This is even better than the last attempt to increase VRAM on a GPU. Nice to see this mod stable. It gives me hope that a 3070 16 GB will be released one day.

Personally, I'd like to see Nvidia end the 3080 and 3070 in favor of replacing them with Ti versions. I wonder if this is already going on with the 3080 given the total lack of stock drops for it for the past month at least. Having a new GPU release with a ton of day-one stock may give more of us the ability to purchase GPUs. The current stock trickles just get eaten up by bots immediately.

There may be a massive price delta between the RTX 3080 and 3090 (MSRP for 2x 3080s is less than a single 3090, which explains why they killed SLI), but there is almost no performance difference in gaming. So what would a 3080 Ti even be? Would it just exactly split the difference and be a 16GB card with 4-6% better performance for $1100? I don't think a 16 GB arrangement is even possible on GA102 without dropping the bus width. Micron only lists an 8Gb (1GB) GDDR6X chip in 19Gbps and 21Gbps speeds. Nvidia isn't currently using the 21 Gbps chips, but they could in the future. Some of the arrangements that we can assume possible are:

Total Memory      | Number of chips | Bus Width                       | Bandwidth
10 GB (RTX 3080)  | 10 (x16 mode?)  | 320-bit @ 19 Gbps (1188 MHz)    | 760.3 GB/s
12 GB             | 12              | 384-bit @ 19-21 Gbps            | 912 - 1,008 GB/s
16 GB             | 16              | 256-bit @ 19-21 Gbps            | 608 - 672 GB/s
20 GB             | 20              | 320-bit @ 19-21 Gbps            | 760.3 - 840 GB/s
24 GB (RTX 3090)  | 24 (x8 mode?)   | 384-bit @ 19.5 Gbps (1219 MHz)  | 936.2 GB/s
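For reference, the bandwidth column is just bus width times effective data rate; here's a minimal sketch of that math (the 760.3 and 936.2 GB/s figures fall out of the exact 1188 MHz and 1219 MHz clocks), in case anyone wants to plug in other speeds:

```python
# Peak theoretical GDDR6X bandwidth: bus width (bits) / 8 * effective data rate (Gbps).
def gddr_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Returns peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr_bandwidth_gbps(320, 19.008))  # ~760.3 GB/s -> 10 GB RTX 3080 (1188 MHz * 16)
print(gddr_bandwidth_gbps(384, 19.504))  # ~936.2 GB/s -> 24 GB RTX 3090 (1219 MHz * 16)
print(gddr_bandwidth_gbps(256, 21.0))    #  672.0 GB/s -> hypothetical 16 GB card at 21 Gbps
```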

Maybe they would do a 20GB card, used mostly as a marketing gimmick to justify a several-hundred-dollar price premium over AMD's 16GB 6900 XT @ $999... Or they could ignore that, because they would sell every card produced, regardless.
I don't see a 16GB card happening, unless Micron has a weird 10.6Gb or 12.8 Gb chip in the works. Or maybe a 16Gbit chip that can be mixed and matched with the 8Gbit.
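As a rough sketch of where those odd chip densities come from, assuming one GDDR6X package per 32-bit channel (no clamshell/x8 mode):

```python
# Per-chip density needed to hit 16 GB at each plausible GA102 bus width,
# assuming one memory package per 32-bit channel (no clamshell/x8 mode).
TARGET_GB = 16

for bus_width in (384, 320, 256):
    chips = bus_width // 32
    gbit_per_chip = TARGET_GB * 8 / chips
    print(f"{bus_width}-bit bus: {chips} chips -> {gbit_per_chip:.2f} Gb per chip")
```

Which is why 16GB only falls out cleanly on a 256-bit bus, either with today's 8Gb parts doubled up or a future 16Gb part.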
Personally I think they have already released their RTX 3080 Ti, they just happened to call it RTX 3080.
Either way, I think the last thing the market needs right now is more high-end SKUs with big chips and big memory. What Nvidia needs is a 4GB card with ~18 SMs, targeting 1080p with DLSS (no RTX) on a die that is smaller than 1/4 the size of GA102. The market basically needs a modern card that is roughly a quarter of RTX 3080, so they can make 4x as many of them at (hopefully) 1/4 the price.

I would even be happy if AMD and Nvidia would at least restart production on some of their old low-midrange cards. People who need an office PC shouldn't be forced to buy a high-end gaming card out from under somebody who actually wants it. Where is GT 1010?
 
Either way, I think the last thing the market needs right now is more high-end SKUs with big chips and big memory. What Nvidia needs is a 4GB card with ~18 SMs, targeting 1080p with DLSS (no RTX) on a die that is smaller than 1/4 the size of GA102. The market basically needs a modern card that is roughly a quarter of RTX 3080, so they can make 4x as many of them at (hopefully) 1/4 the price.
Considering that, if a game is optimized/developed well, a 2060 Super can get 60 FPS at 1080p with all the bells and whistles including DLSS, I'm hopeful a possible 3050 would just be the equivalent of that on a 107 die.

I would even be happy if AMD and Nvidia would at least restart production on some of their old low-midrange cards. People who need an office PC shouldn't be forced to buy a high-end gaming card out from under somebody who actually wants it. Where is GT 1010?
I don't really think there's a need for lower-end cards below the $150 mark anymore. Every consumer CPU made by Intel has a GPU. AMD looks like they're trying to do the same with their laptop CPUs, and nothing's really stopping a system builder from using a laptop platform in an AIO or SFF desktop. In any case, iGPUs are sufficient for most tasks people typically do, and some of the higher-end ones effectively make the <$150 video card market redundant. Plus, for system builders, a video card is an extra line item they have to support. Getting rid of that where possible saves them money.
 
Reactions: JarredWaltonGPU

kaalus

Distinguished
Apr 23, 2008
90
65
18,610
What is a "8GB frame buffer limit" at 4k resolution? Frame buffer at 4k requires at most 64MB (assuming HDR color, 24 bit depth buffer and 8 bit stencil). Even with triple buffering that no one is using these days, this sums up to a paltry 192MB. Less than 3% of memory on an 8GB card.
 
What is a "8GB frame buffer limit" at 4k resolution? Frame buffer at 4k requires at most 64MB (assuming HDR color, 24 bit depth buffer and 8 bit stencil). Even with triple buffering that no one is using these days, this sums up to a paltry 192MB. Less than 3% of memory on an 8GB card.
Perhaps "frame buffer" isn't exactly the correct term, but to really take advantage of the resolution you do need high resolution textures which eat up a bulk of VRAM. Plus games still render to many "frame buffers" (called render targets) for certain effects before compositing the final output. There's a slide deck from one of Sony's game developers showing that they were using something like 800MB on render targets for a 1080p output.
 
Perhaps "frame buffer" isn't exactly the correct term, but to really take advantage of the resolution you do need high resolution textures which eat up a bulk of VRAM. Plus games still render to many "frame buffers" (called render targets) for certain effects before compositing the final output. There's a slide deck from one of Sony's game developers showing that they were using something like 800MB on render targets for a 1080p output.
Directed at the OP and backing up Hotaru:
It's not just frame buffer for sure, and you can see the amount of memory 'required' by games balloon as you go from 1080p to 1440p to 4K -- well, in some games anyway. Red Dead Redemption 2 for example:

1080p Max (no SSAA) = 4347 MB for RDR2, 995 MB "Other", 5342 MB total
1440p Max (no SSAA) = 4666 MB for RDR2, 995 MB "Other", 5661 MB total
4K Max (no SSAA) = 5537 MB for RDR2, 995 MB "Other", 6532 MB total

So, that's 319 MB extra to go from 1080p to 1440p, and 871 MB extra to go from 1440p to 4K. Considering an entire 32-bit frame only requires a bit more than 14MB at 1440p, that would suggest there are around 23 different buffers or something? A 4K frame would take just under 32MB, and the scaling is even higher there (~27.5X the frame size in increased memory footprint). Probably because at 4K the game also stores more mipmaps or something, I don't know for sure. But these numbers from RDR2 aren't outliers -- most games need about 1GB more VRAM at 4K than at 1440p, and it's often enough of a jump to push beyond the VRAM limits of 6GB cards. That doesn't mean the games suddenly can't run at all on a 6GB card, but proportionately there's a bigger hit to performance due to the memory swapping required at 4K.
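For what it's worth, here's the rough math behind the "~23 buffers" and "~27.5X" figures above, assuming plain 32-bit (4 bytes/pixel) render targets and the RDR2 deltas listed:

```python
# Implied render-target counts from the RDR2 VRAM deltas above,
# assuming plain 32-bit (4 bytes/pixel) buffers.
def frame_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 1024**2

delta_1080_to_1440 = 319  # MB extra reported going 1080p -> 1440p
delta_1440_to_4k = 871    # MB extra reported going 1440p -> 4K

print(f"1440p frame: {frame_mib(2560, 1440):.1f} MiB")  # ~14.1 MiB
print(f"4K frame:    {frame_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
print(f"Implied buffers, 1080p->1440p: {delta_1080_to_1440 / frame_mib(2560, 1440):.1f}")  # ~22.7
print(f"Implied buffers, 1440p->4K:    {delta_1440_to_4k / frame_mib(3840, 2160):.1f}")    # ~27.5
```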
 

Giroro

Splendid
Considering that, if a game is optimized/developed well, a 2060 Super can get 60 FPS at 1080p with all the bells and whistles including DLSS, I'm hopeful a possible 3050 would just be the equivalent of that on a 107 die.


I don't really think there's a need for lower-end cards below the $150 mark anymore. Every consumer CPU made by Intel has a GPU. AMD looks like they're trying to do the same with their laptop CPUs, and nothing's really stopping a system builder from using a laptop platform in an AIO or SFF desktop. In any case, iGPUs are sufficient for most tasks people typically do, and some of the higher-end ones effectively make the <$150 video card market redundant. Plus, for system builders, a video card is an extra line item they have to support. Getting rid of that where possible saves them money.

I'm going to guess what you meant by $150 cards, because even the worst old gaming cards on the market cost more than $150 right now. If you consider the $199 "class" of GPU, the GTX 1060 has consistently dominated the top of the steam charts for several years.

Low-end graphics cards (at least when launched) still outperform iGPUs by a wide margin. Plus, they add different/newer/more display outputs. A GT 1010 is very necessary in a world where the only GPU for sale in town is a $90 GeForce 210, which I'm reasonably sure cannot even output 4K60.
How many motherboards out there have multiple DP1.4 ports on them? Personally, I would be very happy if I could get a GT 1010 for the ~$50 that the GDDR variants of the GT 710 usually cost, back before they all sold out everywhere forever.

If you mean that there is no demand for low-end gaming/productivity cards like a GT 1030 / 1050 Ti, well, every possible metric disagrees. Sure, those cards aren't that great at modern gaming, but they're two generations old. What do you expect? They are still far, far superior to integrated graphics. Just because a modern/last-gen equivalent doesn't exist doesn't mean that people don't want them. The GTX 1650 does exist to replace the 1030 in the lowest performance class, but with a ~50% higher launch price than the older cards - just like everything else last gen. That's part of the problem: Nvidia keeps linearly increasing pricing with performance, so you can't really categorize cards by price. I'm sure most customers would absolutely love it if a future "RTX 3050 Ti" launched at the same $139 as the 1050 Ti. They would absolutely lose their minds trying to buy it - more so than the already overwhelming demand for an RTX 3060 at $330. But that's how progress used to go: more performance and features at the same price points.

Not every consumer CPU from Intel has a GPU; see the "F" models. For AMD, none of their Zen3 desktop processors have integrated graphics, and none of their mid- to high-end processors in general have one. There are their low-end Zen2 APUs, but those are just as imaginary right now as everything else with a half-decent GPU. AMD hypothetically is making some promising laptop APUs (I assume supply is also tight there), but you can't put a laptop APU in a desktop, and you can't put a dedicated GPU in a laptop. It's a different market. Even still, there are still plenty of laptops using something like an Nvidia MX150 just to get better desktop performance than an Intel iGPU.
And yes, a system integrator could probably put a laptop CPU into a desktop PC if they wanted, but they are the only people who can still buy GPUs, so I doubt they would put more development into that category than they already are.

It still bothers me how quickly we got used to the RTX 20 series pricing. As I recall, those cards were universally panned at launch for being obscenely overpriced, and Tom's was universally panned for recommending that people 'just buy it'. Then, when Nvidia largely keeps that pricing scheme for the 30 series, we all flip out like it's the best pro-customer deal ever? I don't get it, but I guess that's the power of a monopoly.

Also, to the other point, I get that an RTX 2060 is a good card for 1080p. But that GPU was too big and too expensive, which is why the 1060 is still king of the mainstream.
What I'm asking for is a minimum-viable mainstream card on the smallest die possible, in order to drastically increase production and eventually decrease cost (as in a "$150" TU116/117 class of GPU in terms of design). The superior architecture and manufacturing process should put performance somewhere in the GTX 1660S or GTX 1070 ballpark. RTX would not be useful in this performance class, and the RT cores are separate from the SMs, so they could be removed more easily than the tensor cores. Nvidia needs faster production far more than they need a niche graphical feature which makes 30-ish games prettier, with the tradeoff of being unplayable.

A $250+ 6/8GB RTX 3050 / Ti targeting GTX 1080 performance on a ~200 or 246 mm² die isn't enough to push through their production bottlenecks. If Samsung can't make enough big GPUs, then they need the minimum viable die. If they can't buy enough memory, then they need the minimum viable memory. Why not start with something that does both, then worry about upselling customers once they actually have a product available on the market? The sad thing is, they could probably still sell the "minimum viable gaming" card at $250 and move every single one that can be produced for the foreseeable future - but eventually they could either stop production or drop the design down to mobile or a 2GB GT 1030 replacement.
 
I'm going to guess what you meant by $150 cards, because even the worst old gaming cards on the market cost more than $150 right now. If you consider the $199 "class" of GPU, the GTX 1060 has consistently dominated the top of the steam charts for several years.
I'm referring to MSRP, not market pricing. Any time I talk about pricing, I prefer using MSRP because that's a more or less fixed value. Market value fluctuates over time and today's arguments based on such become useless in tomorrow's conversation.

Low-end graphics cards (at least when launched) still outperform iGPUs by a wide margin. Plus, they add different/newer/more display outputs. A GT 1010 is very necessary in a world where the only GPU for sale in town is a $90 GeForce 210, which I'm reasonably sure cannot even output 4K60.
And it didn't take very long for iGPUs to catch up in performance. For instance, the Vega 11 in the Ryzen 2400G can keep up with the GT 1030. And while I can see a use case for a computer in a home theater setup, I'm still under the impression this is a highly niche market and most people don't need 4K for regular computer use, let alone have a monitor of that resolution.

How many motherboards out there have multiple DP1.4 ports on them? Personally, I would be very happy if I could get a GT 1010 for the ~$50 that the GDDR variants of the GT 710 usually cost, back before they all sold out everywhere forever.
And how many people actually need DP1.4 capabilities? The only thing I can see that it has over older versions is up-to-date support for HDR video standards and 8K. At which point, if I wanted those I'd rather have an HDMI based system because that's plug-and-play with so many other devices. Not to mention if this is for multimedia stuff, even a $50 device can handle 4K HDR.

If you mean that there is no demand for low-end gaming/productivity cards like a GT 1030 / 1050 Ti, well, every possible metric disagrees. Sure, those cards aren't that great at modern gaming, but they're two generations old. What do you expect? They are still far, far superior to integrated graphics. Just because a modern/last-gen equivalent doesn't exist doesn't mean that people don't want them.
The 1050 Ti sure, because it's within the threshold of when a discrete GPU starts looking like a good option. But the 1030? I'd argue no. Again the Vega 11 can keep up with it and even the Vega 8 still puts up a decent fight. Ultimately though, the GT 1030 still sucks at anything other than 720p low quality on games that weren't designed with a potato in mind.

The GTX 1650 does exist to replace the 1030 in the lowest performance class, but with a ~50% higher launch price than the older cards - just like everything else last gen. That's part of the problem: Nvidia keeps linearly increasing pricing with performance, so you can't really categorize cards by price. I'm sure most customers would absolutely love it if a future "RTX 3050 Ti" launched at the same $139 as the 1050 Ti. They would absolutely lose their minds trying to buy it - more so than the already overwhelming demand for an RTX 3060 at $330. But that's how progress used to go: more performance and features at the same price points.
No it doesn't. It's to replace the GTX 1050. I don't categorize by model number because model numbers are arbitrary. I categorize by price because you can pair with another metric: performance. Both AMD and NVIDIA price to performance and whoever's "ahead" gets to dictate that price unfortunately. AMD's guilty of this too, and as recently as their Zen 3 processors. In any case, the GTX 1650 is only 7% more at launch MSRP than the GTX 1050, and it provides a ~20% performance bump all around.

Also, everyone likes to look at Pascal and its price/performance ratio as some sort of mystical baseline for how that ratio should be, without understanding the circumstances around it versus, say, Turing. Pascal was a two-generation node bump, so not only did it get the architectural improvements, it got two generations' worth of process improvements to go with it. Turing was built on a half-generation process improvement. This is also on top of semiconductor manufacturing getting more expensive past 22nm.


Not every consumer CPU from Intel has a GPU; see the "F" models.
A SKU that slipped my mind unfortunately. However, looking into it more, the F series looked more like a move of desperation on Intel's part to ship more CPUs out. But anyway, most computers from system builders use an Intel CPU with an IGPU.

AMD hypothetically is making some promising laptop APUs (I assume supply is also tight there), but you can't put a laptop APU in a desktop
Hypothetically? I have a laptop with one of those mystical APUs. Also my comment was about system builders using them in desktop systems, not putting a laptop part in a desktop motherboard.

and you can't put a dedicated GPU in a laptop.
Maybe not in a laptop, but you can certainly hook up a desktop GPU to a laptop with the right connection.

It's a different market. Even still, there are still plenty of laptops using something like an Nvidia MX150 just to get better desktop performance than an Intel iGPU.
Except in practically every hybrid-graphics laptop, the iGPU does all the desktop rendering work by default. And in the last four laptops I've used that had hybrid graphics, I've never once felt a need to tell the dedicated GPU to run one of my desktop apps. I'd argue those laptops exist simply to sucker someone into getting something with "dedicated graphics."

It still bothers me how quickly we got used to the RTX 20 series pricing. As I recall, those cards were universally panned at launch for being obscenely overpriced, and Tom's was universally panned for recommending that people 'just buy it'. Then, when Nvidia largely keeps that pricing scheme for the 30 series, we all flip out like it's the best pro-customer deal ever? I don't get it, but I guess that's the power of a monopoly.
I'm pretty sure nobody flipped out like it's the best pro-customer deal ever. If anything, before the launch, we were all skeptical of NVIDIA's 2x across the board performance and "let's see what AMD comes up with" and post launch it's "WTF NVIDIA why can't I get a card?"

Again, I would also like to point out that it's getting more expensive to go smaller on top of GPU performance scaling with the number of execution units. Die size trends remain largely flat as a result, at least in NVIDIA's designs.

Ultimately here's my take on why I don't really care for <$150 cards: If you want to game, you're probably already saving like $100-$120 for a card. However, saving up to $150 isn't likely out of your reach either. Even better, depending on how patient you are, you could save up for a $200-$220 card and get something that'll provide plenty of gaming chops for years. So I don't see a point in <$150 cards for something other than to just put something in the computer to have a video output or add more outputs. There might be a case for using cheaper cards in multimedia PCs, but a dedicated streaming device is much more convenient, actually designed for couch use, supports all the media bells and whistles, and tends to be much cheaper than a graphics card.
 

Giroro

Splendid
No it doesn't. It's to replace the GTX 1050. I don't categorize by model number because model numbers are arbitrary.
I agree that model number (and pricing) is arbitrary, which is why I believe the GTX 1650 replaces the 1030 as the 'poor-value, bottom-of-the-barrel junk that no serious gamer should buy'. The GTX 1650S makes more sense as a replacement in terms of user sentiment.
Another way to look at it is die size, in which case the 1650 replaces the 1060 at 200 mm². Although the core configuration would put it somewhere between a 1050 Ti and a 1060.
Which brings me to

Again, I would also like to point out that it's getting more expensive to go smaller on top of GPU performance scaling with the number of execution units. Die size trends remain largely flat as a result, at least in NVIDIA's designs
Within the same process, it is always cheaper to have a smaller die. You don't buy chips, you buy wafers. A smaller die means more chips per wafer, and less to lose when there is a flaw.
As far as die size remaining flat? I'm not really seeing that within Nvidia's product line. An RTX 2080 Ti was about 60% larger than a 1080 Ti on the same process. The RTX 2060 is 220% the size of a GTX 1060. The 3070 is 25% larger than a GTX 1080, despite the process shrink. The GTX 1050 is 132 mm² and the 1030 is 74 mm². They haven't made anything comparable to those in a long time.
If silicon is so much more expensive to produce now, then why do they keep making their GPUs bigger? I guess because they can charge a lot more for the final product compared to the extra cost of production, increasing margins.
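To put rough numbers on the chips-per-wafer point, here's the standard gross-dies-per-wafer estimate (it ignores defects and scribe lines; the die areas are approximate public figures, and the 100 mm² entry is just a hypothetical small die for comparison):

```python
# Standard gross die-per-wafer estimate for a 300 mm wafer (no defect/yield model).
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("GA102, ~628 mm^2", 628),
                   ("GA104, ~392 mm^2", 392),
                   ("hypothetical ~100 mm^2 die", 100)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per wafer")
```

By that estimate a quarter-ish-size die does slightly better than quadrupling output, since less area is wasted at the wafer edge.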

Hypothetically? I have a laptop with one of those mystical APUs. Also my comment was about system builders using them in desktop systems, not putting a laptop part in a desktop motherboard.
A Zen3 chiplet is a Zen3 chiplet. Soldering it to a motherboard doesn't change the part where AMD is bottlenecked at TSMC. If AMD can't produce enough processors and GPUs, then I see no reason why they wouldn't also be supply-constrained on laptop APUs. Did you buy that laptop within the last month?
For what it's worth, I would very much love it if AMD brought out desktop versions of their mobile APUs as an option. That would be good for everybody, especially today when people building a Zen3 computer don't have a lot of options for getting their PC running. It would be nice to be able to build a working Ryzen PC today, and then wait to buy a GPU when they start becoming available.

And how many people actually need DP1.4 capabilities? The only thing I can see that it has over older versions is up-to-date support for HDR video standards and 8K. At which point, if I wanted those I'd rather have an HDMI based system because that's plug-and-play with so many other devices. Not to mention if this is for multimedia stuff, even a $50 device can handle 4K HDR.

A $50 device is (was) a GT 710, and no, it can't output 4K HDR. The choices on that card are DVI, VGA, and HDMI 1.4 (SDR 4K @ 30Hz). A modern equivalent to a GT 710 would absolutely be able to do that. That is at the core of the issue that I'm digging at. The GT 710 replacement hasn't made it to stores, and it badly needs to.
The exact DP version aside, most consumer motherboards don't come with 2 HDMI ports either.

I think you are still looking at the GPU crisis from a perspective that is too gaming-focused. I am coming at it from the other direction. Whether your mom on a 5-year-old i5 wants to run dual monitors, or spreadsheets at 4K60, or watch UHD YouTube, or play The Sims 4, or make Zoom run smoothly - the single-slot GT 1030 was a readily available "good enough" $80 option, because she doesn't want to buy a new power supply or an entire new AMD computer. Now her only option is to fight scalpers on eBay for a $600 RTX 3060, which might not even fit in a Dell or whatever. People who don't know a lot about video cards are more likely to overspend, which in turn makes cards more expensive for everybody. I wouldn't game on a low-end card either, but I believe that they need to exist and be easy to get.
Although this is all largely academic, because again, the GT 1030 and AMD's APUs are both sold out/scalped, and there isn't any evidence right now that either is in production in any meaningful volume. I assumed Nvidia killed the GT 1030 three months ago when they announced the GT 1010, which as far as I know still hasn't reached retail.

As for $200-$220 class cards (GTX 1060, GTX ): if we had actual progress this generation, then a $150-tier card would be at or above that performance level, which would be a good thing. I believe that Nvidia will not release a sub-$250 RTX card this generation, possibly ever.
I believe that there are a lot of different types of mainstream customers with different needs, and that there should be a full range of mainstream products to fit those needs. It only helps the gaming market by lowering the barrier to entry. If the low end becomes newer/better, then the tide rises for everybody.