A YouTuber has doubled an RTX 3070's memory capacity from 8GB of VRAM to 16GB.
Doubling Down: RTX 3070 Modded with 16GB of VRAM : Read more
Not quite a full GA104 -- it's 46 SMs out of 48 potential SMs. Not a huge difference, but if Nvidia can get enough full GA104 chips, plus a clock speed bump, plus double the VRAM... Or like you said, maybe just do a GA102 with 56 SMs or something. I don't think Nvidia will do new Ampere GPUs, though -- they'll stick with GA102/GA104/GA106 (and potentially GA107/GA108) until Hopper or whatever comes next. That will most likely be 5nm, or some Samsung equivalent.

I feel like it's now a given that NVIDIA is going to do half-generation refreshes, but guessing what those specs will be is the interesting part. For example, the 3070 is a fully realized GA104. So unless NVIDIA went down to 5nm or a highly refined 7nm to bump up the clock speeds by a meaningful amount, throwing 16GB onto a 3070 and calling it a 3070 Ti would be a letdown.
The only other thing I can think of, though I can't see why they would do it, is a GA102/GA104 hybrid with 12 SMs per GPC like GA102 rather than 8 SMs per GPC like GA104, while keeping the same 6-GPC, 8-channel memory configuration.
Very impressive work! This is even better than the last attempt to increase a GPU's VRAM. Nice to see this mod running stable. It gives me hope that a 16GB 3070 will be released one day.
Personally, I'd like to see Nvidia discontinue the 3080 and 3070 in favor of Ti versions. I wonder if this is already happening with the 3080, given the total lack of stock drops for it over the past month at least. Having a new GPU release with a ton of day-one stock might give more of us the ability to purchase GPUs. The current stock trickles just get eaten up by bots immediately.
Total Memory | Number of Chips | Bus Width @ Memory Speed | Bandwidth |
---|---|---|---|
10 GB (RTX 3080) | 10 (x16 mode?) | 320-bit @ 19 Gbps (1188 MHz) | 760.3 GB/s |
12 GB | 12 | 384-bit @ 19-21 Gbps | 912 - 1,008 GB/s |
16 GB | 16 | 256-bit @ 19-21 Gbps | 608 - 672 GB/s |
20 GB | 20 | 320-bit @ 19-21 Gbps | 760.3 - 840 GB/s |
24 GB (RTX 3090) | 24 (x8 mode?) | 384-bit @ 19.5 Gbps (1219 MHz) | 936.2 GB/s |
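As a quick sanity check on the bandwidth column: it's just the bus width times the effective per-pin data rate. Here's a minimal Python sketch of that arithmetic (the helper name is mine; the slight offsets in the 760.3 and 936.2 GB/s figures come from the exact memory clocks, 1188 MHz and 1219 MHz x 16):

```python
# Peak theoretical GDDR6/GDDR6X bandwidth: bus width (bits) / 8 * per-pin rate (Gbps)
def gddr_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

configs = [
    ("RTX 3080: 320-bit @ 19 Gbps",   320, 19.0),
    ("12 chips: 384-bit @ 21 Gbps",   384, 21.0),
    ("16 chips: 256-bit @ 19 Gbps",   256, 19.0),
    ("RTX 3090: 384-bit @ 19.5 Gbps", 384, 19.5),
]

for label, width, rate in configs:
    print(f"{label}: {gddr_bandwidth_gbs(width, rate):.0f} GB/s")
# -> 760, 1008, 608, 936 GB/s, matching the table within rounding
```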
Considering that a well-optimized game can get 60 FPS at 1080p out of a 2060 Super with all the bells and whistles, including DLSS, I'm hopeful a possible 3050 would just be the equivalent of that on a GA107 die.

Either way, I think the last thing the market needs right now is more high-end SKUs with big chips and big memory. What Nvidia needs is a 4GB card with ~18 SMs, targeting 1080p with DLSS (no RTX) on a die that is smaller than 1/4 the size of GA102. The market basically needs a modern card that is roughly a quarter of an RTX 3080, so they can make 4x as many of them at (hopefully) 1/4 the price.
I don't really think there's a need for lower-end cards below the $150 mark anymore. Every CPU Intel makes for consumers has a GPU. AMD looks like they're trying to do that with their laptop CPUs, but nothing's really stopping a system builder from using a laptop platform in an AIO or SFF desktop. In any case, iGPUs are sufficient for most tasks people typically do, and some of the higher-end ones effectively make the <$150 video card market redundant. Plus, for system builders, a video card is an extra line item they have to support; getting rid of that where possible saves them money.

I would even be happy if AMD and Nvidia would at least restart production on some of their old low-to-midrange cards. People who need an office PC shouldn't be forced to buy a high-end gaming card out from under somebody who actually wants it. Where is the GT 1010?
Perhaps "frame buffer" isn't exactly the correct term, but to really take advantage of the resolution you do need high resolution textures which eat up a bulk of VRAM. Plus games still render to many "frame buffers" (called render targets) for certain effects before compositing the final output. There's a slide deck from one of Sony's game developers showing that they were using something like 800MB on render targets for a 1080p output.What is a "8GB frame buffer limit" at 4k resolution? Frame buffer at 4k requires at most 64MB (assuming HDR color, 24 bit depth buffer and 8 bit stencil). Even with triple buffering that no one is using these days, this sums up to a paltry 192MB. Less than 3% of memory on an 8GB card.
Directed at the OP and backing up Hotaru:

Perhaps "frame buffer" isn't exactly the correct term, but to really take advantage of the resolution you do need high-resolution textures, which eat up the bulk of VRAM. Plus, games still render to many "frame buffers" (called render targets) for certain effects before compositing the final output. There's a slide deck from one of Sony's game developers showing they were using something like 800MB of render targets for a 1080p output.
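For what it's worth, the 64MB and 192MB figures quoted above are easy to reproduce. A minimal sketch, assuming a 3840x2160 target, a 4-byte-per-pixel HDR color format (e.g. RGB10A2), and a packed 24-bit depth + 8-bit stencil buffer (the render-target numbers from the Sony deck are game-specific and not modeled here):

```python
# Back-of-envelope size of a 4K swapchain, per the figures quoted above.
WIDTH, HEIGHT = 3840, 2160
COLOR_BYTES = 4          # assumed HDR color format, e.g. RGB10A2
DEPTH_STENCIL_BYTES = 4  # 24-bit depth + 8-bit stencil

pixels = WIDTH * HEIGHT
single = pixels * (COLOR_BYTES + DEPTH_STENCIL_BYTES)
triple = 3 * single      # triple buffering, as in the quote

print(f"single buffered: {single / 2**20:.0f} MiB")  # ~63 MiB, i.e. the ~64MB figure
print(f"triple buffered: {triple / 2**20:.0f} MiB")  # ~190 MiB, i.e. the ~192MB figure
```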
I'm referring to MSRP, not market pricing. Any time I talk about pricing, I prefer using MSRP because that's a more or less fixed value. Market value fluctuates over time, and today's arguments based on it become useless in tomorrow's conversation.

I'm going to guess what you meant by $150 cards, because even the worst old gaming cards on the market cost more than $150 right now. If you consider the $199 "class" of GPU, the GTX 1060 has consistently dominated the top of the Steam charts for several years.
And it didn't take very long for iGPUs to catch up in performance. For instance, the Vega 11 in the Ryzen 2400G can keep up with the GT 1030. And while I can see a use case for a computer in a home theater setup, I'm still under the impression this is a highly niche market; most people don't need 4K for regular computer use, let alone are using a monitor of that resolution.

Low-end graphics cards (at least when launched) still outperform iGPUs by a wide margin. Plus they add different/newer/more display outputs. A GT 1010 is very necessary in a world where the only GPU for sale in town is a $90 GeForce 210, which I'm reasonably sure cannot even output 4K60.
And how many people actually need DP1.4 capabilities? The only thing I can see that it has over older versions is up-to-date support for HDR video standards and 8K. At that point, if I wanted those I'd rather have an HDMI-based system, because that's plug-and-play with so many other devices. Not to mention that if this is for multimedia, even a $50 device can handle 4K HDR.

How many motherboards out there have multiple DP1.4 ports on them? Personally, I would be very happy if I could get a GT 1010 for the ~$50 that the GDDR variants of the GT 710 usually cost back before they all sold out everywhere forever.
The 1050 Ti, sure, because it's within the threshold where a discrete GPU starts looking like a good option. But the 1030? I'd argue no. Again, the Vega 11 can keep up with it, and even the Vega 8 still puts up a decent fight. Ultimately, the GT 1030 still sucks at anything other than 720p low quality on games that weren't designed with a potato in mind.

If you mean that there is no demand for low-end gaming/productivity cards like a GT 1030 / GTX 1050 Ti, well, every possible metric disagrees. Sure, those cards aren't that great at modern gaming, but they're two generations old. What do you expect? They are still far, far superior to integrated graphics. Just because a modern/last-gen equivalent doesn't exist doesn't mean that people don't want them.
No it doesn't. It's to replace the GTX 1050. I don't categorize by model number because model numbers are arbitrary. I categorize by price because you can pair it with another metric: performance. Both AMD and NVIDIA price to performance, and whoever's "ahead" gets to dictate that price, unfortunately. AMD's guilty of this too, as recently as their Zen 3 processors. In any case, the GTX 1650 was only 7% more at launch MSRP than the GTX 1050, and it provides a ~20% performance bump all around.

The GTX 1650 does exist to replace the 1030 in the lowest performance class, but with ~50% higher launch prices than the older cards, just like everything else last gen. That's part of the problem: Nvidia keeps linearly increasing pricing with performance, so you can't really categorize cards by price. I'm sure most customers would absolutely love it if a future "RTX 3050 Ti" launched at the same $139 as the 1050 Ti. They would absolutely lose their minds trying to buy it, more so than the already overwhelming demand for an RTX 3060 at $330. But that's how progress used to go: more performance and features at the same price points.
A SKU that slipped my mind, unfortunately. However, looking into it more, the F series looked more like a move of desperation on Intel's part to ship more CPUs. But anyway, most computers from system builders use an Intel CPU with an iGPU.

Not every consumer CPU from Intel has a GPU; see the "F" models.
Hypothetically? I have a laptop with one of those mystical APUs. Also, my comment was about system builders using them in desktop systems, not putting a laptop part in a desktop motherboard.

AMD hypothetically is making some promising laptop APUs (I assume supply is also tight there), but you can't put a laptop APU in a desktop.
Maybe not in a laptop, but you can certainly hook up a desktop GPU to a laptop with the right connection.

And you can't put a dedicated GPU in a laptop.
Except in practically every hybrid graphics laptop, the iGPU does all the desktop rendering work by default. And in the last four laptops I've used that had hybrid graphics, I've never once felt a need to tell the dedicated GPU to run one of my desktop apps. I'd argue those laptops exist simply to sucker someone into getting something with "dedicated graphics."

It's a different market. Even still, there are still plenty of laptops using something like an Nvidia MX150 just to get better desktop performance over the Intel iGPU.
I'm pretty sure nobody flipped out like it's the best pro-customer deal ever. If anything, before the launch we were all skeptical of NVIDIA's "2x across the board" performance claims and saying "let's see what AMD comes up with," and post-launch it's "WTF NVIDIA, why can't I get a card?"

It still bothers me how quickly we got used to the RTX 20 series pricing. As I recall, those cards were universally panned at launch for being obscenely overpriced, and Tom's was universally panned for recommending that people "just buy it". Then, when Nvidia largely keeps that pricing scheme for the 30 series, we all flip out like it's the best pro-customer deal ever? I don't get it, but I guess that's the power of a monopoly.
I agree that model number (and pricing) is arbitrary, which is why I believe the GTX 1650 replaces the 1030 as the "poor value, bottom-of-the-barrel junk that no serious gamer should buy." The GTX 1650S makes more sense as a replacement in terms of user sentiment.

No it doesn't. It's to replace the GTX 1050. I don't categorize by model number because model numbers are arbitrary.
Within the same process, it is always cheaper to have a smaller die. You don't buy chips, you buy wafers. A smaller die means more chips per wafer, and less to lose when there is a flaw.

Again, I would also like to point out that it's getting more expensive to go smaller, on top of GPU performance scaling with the number of execution units. Die size trends remain largely flat as a result, at least in NVIDIA's designs.
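To put some rough numbers behind the "more chips per wafer, less to lose per flaw" point, here's a sketch using the common dies-per-wafer approximation and a simple Poisson yield model. The die areas (GA102 is roughly 628 mm^2; the small die is a hypothetical quarter-size part) and the 0.1 defects/cm^2 density are illustrative assumptions, not published figures:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation: usable wafer area minus an edge-loss term."""
    d = wafer_diameter_mm
    return int(math.pi * d**2 / (4 * die_area_mm2) - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float = 0.1) -> float:
    """Fraction of dies expected to be defect-free (assumed defect density)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for label, area in [("~GA102-sized die (628 mm^2)", 628.0),
                    ("hypothetical quarter-size die (157 mm^2)", 157.0)]:
    gross = gross_dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{label}: {gross} gross dies, ~{good:.0f} expected good dies per 300 mm wafer")
# The quarter-size die gives well over 4x as many good chips per wafer,
# since yield improves on top of the raw die-count advantage.
```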
A Zen 3 chiplet is a Zen 3 chiplet. Soldering it to a motherboard doesn't change the part where AMD is bottlenecked at TSMC. If AMD can't produce enough processors and GPUs, then I see no reason why they wouldn't also be supply constrained on laptop APUs. Did you buy that laptop within the last month?

Hypothetically? I have a laptop with one of those mystical APUs. Also, my comment was about system builders using them in desktop systems, not putting a laptop part in a desktop motherboard.