News NVIDIA GeForce RTX 30 Series GPUs Receive Juicy Price Cuts

Nvidia put that off as long as they possibly could, in typical Jensen style, but it's a welcome development all the same. Still, without DLSS 3 those price cuts should be steeper, assuming Nvidia's marketing is to be believed about its true value. While frame generation does introduce latency, it works surprisingly well at smoothing out your frames. I can live with things looking like 120-144 FPS with 60-72 FPS latency in single-player/co-op games. I'm curious to see AMD's implementation in FSR 3.
 
Since the 4060 won't come out until July, I doubt we'll see any official price cut until then. This particular markdown is just MSI managing its inventory.

>Seems easy to wait out another round.
>"Intel is full steam ahead on future discrete GPU architectures planned for 2024 (Battlemage) and 2026 (Celestial)."

Waiting another year (or 3 years?) sounds trivial when you are young and don't think about things like your mortality, but when you're older, every year is a significant chunk of your remaining lifespan. So, no, not that easy or trivial.

>*gasp* - Nvidia cards at LESS than MSRP?

I think any markdown would be limited to depleting previous-gen stock. Since Nvidia & AMD are undershipping new stock already (read: no overstock), I don't think there'll be any appreciable markdown on current gen, aside from maybe one-time sales, e.g. Black Friday.

I still think that 4060 will be a big seller come the holiday, as $300 is the sweet spot for mainstream buyers, and the product itself is "good enough." Enthusiasts may opt for prior-gen products for better bang/buck, but mainstream users would be more reluctant to buy older products.
 
no DLSS 3.0
You're not always gonna want 3.0, as it has downsides.
Also, when was the last time you saw a modern GPU on sale on Prime Day?

But unless they're significantly cheaper than the 40 series, it's still whatever, as you can just get a "better" one for a similar price.

I still think that 4060 will be a big seller come the holiday, as $300 is the sweet spot for mainstream buyers, and the product itself is "good enough."
The 4060 Ti is basically a 50-tier card.
The non-Ti will be lucky if it's a 50-tier and not a 30-tier card...
 
I don't think anyone is still willing to buy an 8GB GPU at any price; they're clearly obsolete at this point.
8GB cards are a necessity for people like me who refuse to pay more than $250 for a new GPU, and I bet there is no shortage of people somewhat like me waiting for something worth buying in the $200-250 range that doesn't look like reheated performance from 5+ years ago. At a glance, about 50% of entries on the Steam survey have 6GB or less VRAM.

I'm still using a 2GB GTX1050. 8GB with twice the bandwidth would be a massive improvement.
 
At a glance, about 50% of entries on the Steam survey have 6GB or less VRAM.

Yeah I saw that too. Something about 77% of gamers are at 1080p (or less). (720p???)

Hard to believe considering 1080p HD stuff came out like in 2007... and you can get entry level 4K TVs now for a few hundred bucks.

Of course... I'm not a fps junkie... I'm totally happy with 4K Ultra 60 fps. I prefer eye candy over 240 fps 1080p Call of Duty.
 
Hard to believe considering 1080p HD stuff came out like in 2007... and you can get entry level 4K TVs now for a few hundred bucks.
I don't find it hard to believe at all: you cannot comfortably game at resolutions and details higher than what your GPU/IGP can handle and 1080p is pretty much as high as sub-$250 GPUs will go today. Unsurprisingly, sub-$250 GPUs and IGPs are also what about 50% of the Steam survey is on.

For mass adoption of 4k gaming on the desktop to happen, comfortably 4k-capable GPUs need to get much cheaper. Considering how long it took for the $200-ish price point to get a legitimate successor to the RX580, decent 4k under $250 may not happen this decade.
 
I think the used GPU market is affecting sales of the same models as new old stock. New-in-box GPUs are still way too expensive compared to used ones. As GPU crypto mining moves further away from profitability, more and more high-powered cards are showing up on the secondary market. If a used RTX 3080 is offered for the same price as a new RTX 3060 Ti, the used card is going to lure away many customers.
 
720p is where it's at now, but laptops with 1080p-capable IGPs just came out, made possible thanks to DDR5.

Personally, I am waiting for an Ultrabook-sized thin-and-light using the 30W Ryzen 7840U. Its 15W IGP is comparable in 1080p performance to a 250W Kepler GTX Titan or a 120W 1060, without the limitation of 6GB of local memory. Intel is well behind, as their A380 is both slower and uses 75W, so it isn't coming to an IGP anytime soon.

The existence of such IGPs would probably destroy the market for low-end cards, and drive down used prices for older cards as soon as AMD gets around to putting them into desktop APUs


These emergency price reductions are of course a response to AMD's RX 7600 at $269 yesterday, but the only interesting card here is the 12GB 3060 for $280. If you can wait, prices are sure to drop again when the 4060 arrives in July.
 
The existence of such IGPs would probably destroy the market for low-end cards, and drive down used prices for older cards as soon as AMD gets around to putting them into desktop APUs
I don't think so: the IGP has to share the DDR5 bus with the CPU, while even an entry-level discrete card has the whole 128-bit/16+Gbps GDDR6 bus all to itself. Having ~3X as much dedicated memory bandwidth has quite a few benefits.
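For anyone curious where that ~3X comes from, here's the rough peak-bandwidth math. The DDR5-5600 dual-channel figure on the APU side is my own assumption, not something quoted above:

```python
# Rough peak memory bandwidth comparison: entry-level dGPU vs. shared system DDR5.
# These are nominal peak numbers; real-world throughput is lower for both.

def peak_gb_per_s(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second in GT/s)."""
    return (bus_width_bits / 8) * transfer_rate_gtps

gddr6 = peak_gb_per_s(128, 16.0)  # 128-bit GDDR6 at 16 Gbps -> 256 GB/s, all for the GPU
ddr5  = peak_gb_per_s(128, 5.6)   # dual-channel DDR5-5600 (assumed) -> ~89.6 GB/s, shared with the CPU

print(f"GDDR6: {gddr6:.0f} GB/s, DDR5: {ddr5:.1f} GB/s, ratio: {gddr6 / ddr5:.1f}x")  # ~2.9x
```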
 
So the prices on the RTX 3050/3060 are getting down to what they probably should have been from the start.

On the note of IGPs, one of two things would need to happen before they could eliminate sub-$200-250 GPUs: a 256-bit memory bus, or MCR DIMM/MRDIMM becoming standard.
 
The 6700 XT for a few bucks more than the 3060 Ti sale price is still a much better deal (though of course overpriced as well). It needs to come down another $40-50 or so, to get under the 7600, to even be worth considering. Maybe even more.

I don't really see any Nvidia card right now that's worth the price vs. what the same money will get you from AMD.
 
Is 128-bit DDR5 (two 64-bit channels) really a problem for just 1080p, and when APUs only have 8 CPU cores? The IGP is low-end, so only being as fast as the mentioned $90 1060 6GB is pretty OK considering the price. The cards on sale in the article, by contrast, are definitely midrange, even the 3050 (or upcoming 4050), as each has 2560 shaders compared to that IGP's 768 (you know, like a 1050 Ti or RX 6400, only higher clocked). I only brought it up because Travis seemed genuinely surprised that 77% of PC gamers use 1080p or less.

And sure even a low-end dedicated card could be faster, but how much could that sell for vs. something that is essentially free? Low-end cards are already so cost-constrained that they have resorted to a PCIe x4 interface or a 96-bit memory bus, so you have to wonder what sort of corner-cutting will arise with even more pricing pressure.

Look at Blu-Ray vs. DVD. For most people, DVD was good-enough vs. the hassle of endlessly updating DRM and players that fell out of firmware support, so DVD outsold Blu-Ray by a large margin despite 480p. Similarly, the pricing of GPUs is sort of making high-end PC gaming into a niche pastime only for enthusiasts, while the masses have resorted to consoles. IGPs like these can keep PC gaming alive, and they are perfect for the casual PC gamer who won't care if a low-end card is slightly faster than their good-enough, free IGP.

Yeah, these midrange cards are definitely safe from IGPs until at least DDR6 arrives around 2026, as I don't expect 256-bit systems anytime soon, and by then the 1440p these excel at may be considered pretty low-end. (Crystal ball predicts that in 2027 Travis will be surprised that most gamers still use 1440p when "everybody" is playing at 8K on their 7090 Ti.)
 
And sure even a low-end dedicated card could be faster, but how much could that sell for vs. something that is essentially free?
How much of a premium for laptops with big-IGP vs the nearest equivalent minimal-IGP SoC? If you want to step up from the 760M to the 780M, you have to go from the Ryzen 7640 to the 7840 and on the Framework laptop, that is $300 extra.

Going from small IGP to bigger IGP is nowhere near free.
Look at Blu-Ray vs. DVD. For most people, DVD was good-enough vs. the hassle of endlessly updating DRM and players that fell out of firmware support, so DVD outsold Blu-Ray by a large margin despite 480p.
DVD looked fine 20+ years ago when most people thought 30" was a large TV and going any bigger was too back-breaking to bother with. Try going back to DVD today, where 40" is considered small and UHD is ubiquitous, and you will probably think it is borderline unwatchable.

I know I can't enjoy my PS2 games and DVDs on my 50" TV; they look far too grainy.
 
Yeah I saw that too. Something about 77% of gamers are at 1080p (or less). (720p???)

Hard to believe considering 1080p HD stuff came out like in 2007... and you can get entry level 4K TVs now for a few hundred bucks.

Of course... I'm not a fps junkie... I'm totally happy with 4K Ultra 60 fps. I prefer eye candy over 240 fps 1080p Call of Duty.
Physical dimensions have limits, so 1080p is enough for monitors up to about 23 inches. Also, 4K screens are cheap enough, but 4K-capable GPUs are a lot more expensive. Heck, my Radeon R9 380 (a.k.a. HD 7850) can run Cyberpunk at 1080p high at over 40 fps. 1080p is cheap and visually enough for most people.

And 4K Ultra nowadays requires the very best GPUs, but 1080p Ultra is achievable with $250 cards. :)
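Just for scale (simple arithmetic, not anything from the article): 4K pushes four times the pixels of 1080p, so all else being equal you need roughly 4x the shading throughput to hold the same frame rate.

```python
# Pixel counts per frame: 4K renders ~4x the pixels of 1080p.
pixels_1080p = 1920 * 1080   # ~2.07 million
pixels_4k    = 3840 * 2160   # ~8.29 million
print(f"4K / 1080p pixel ratio: {pixels_4k / pixels_1080p:.1f}x")  # 4.0x
```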
 
I don't find it hard to believe at all: you cannot comfortably game at resolutions and details higher than what your GPU/IGP can handle and 1080p is pretty much as high as sub-$250 GPUs will go today. Unsurprisingly, sub-$250 GPUs and IGPs are also what about 50% of the Steam survey is on.

For mass adoption of 4k gaming on the desktop to happen, comfortably 4k-capable GPUs need to get much cheaper. Considering how long it took for the $200-ish price point to get a legitimate successor to the RX580, decent 4k under $250 may not happen this decade.
You just explained it in a way I never could. 👍
 
The VRAM issue has less to do with resolution and more to do with texture detail.

A recent review showed that increasing resolution only marginally increases VRAM usage. Instead, it is texture detail that drastically impacts VRAM usage.

This review was also able to separate memory ALLOCATION from memory USAGE. The MSI Afterburner tool is misleading because it has a difficult time differentiating between allocation and actual usage.

Only about 30% of the increase in VRAM when you change settings is down to resolution, but a whopping 70% is related to texture settings.

(of course, some games automatically increase texture detail when increasing resolution)

4k with medium texture settings actually uses less VRAM in many games than 1080p with textures on high or ultra.

The argument that "X VRAM is fine for Y resolution" is really silly; resolution only marginally affects VRAM usage compared to texture detail settings.
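To illustrate why, here is a back-of-the-envelope sketch. The buffer layout and the texture-pool budgets below are purely my own illustrative assumptions, not figures from that review or any specific game:

```python
# Render targets scale with resolution; the texture streaming pool scales with
# the texture quality setting and is much larger. All numbers are assumptions.

def render_target_mb(width: int, height: int, buffers: int = 6, bytes_per_pixel: int = 8) -> float:
    """Rough G-buffer/HDR/depth footprint for an assumed deferred-renderer layout."""
    return width * height * buffers * bytes_per_pixel / 2**20

texture_pool_mb = {"medium": 2500, "ultra": 6000}  # assumed streaming-pool budgets

for res_name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for quality, tex_mb in texture_pool_mb.items():
        total = render_target_mb(w, h) + tex_mb
        print(f"{res_name} + {quality} textures: ~{total / 1024:.1f} GB")
```

Under these assumptions, 4K with medium textures lands around 2.8 GB while 1080p with ultra textures is around 6.0 GB, which is the point: the texture setting dominates, not the output resolution.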
 
How much of a premium for laptops with big-IGP vs the nearest equivalent minimal-IGP SoC? If you want to step up from the 760M to the 780M, you have to go from the Ryzen 7640 to the 7840 and on the Framework laptop, that is $300 extra.

Going from small IGP to bigger IGP is nowhere near free.
By the time they get to APUs, as I said, they should indeed be essentially free, just as they are now.

For example, the largest IGP in an AMD APU right now is the 512 shaders in the 5700G, which sells for $200. The closest comparable Ryzen 7 is the 5700X, which has no IGP at all and also sells for $200. While the 5700G has half the L3 and PCIe 3.0 instead of 4.0, you cannot argue that is equivalent to a $300 difference; that would mean the 5700X should sell for $500. Even a big IGP is low-end.
DVD looked fine 20+ years ago when most people thought 30" was a large TV and going any bigger was too back-breaking to bother with. Try going back to DVD today, where 40" is considered small and UHD is ubiquitous, and you will probably think it is borderline unwatchable.

I know I can't enjoy my PS2 games and DVDs on my 50" TV; they look far too grainy.
This is true, and despite having a 4k OLED, I watch SD content or DVD on a 24" monitor for exactly this reason (my room isn't large enough to sit far enough away from the big TV to make it look OK). I actually only game on the OLED at 1080p as that scales up to 4k/2160p perfectly without any blurriness or scaling artifacts even without DLSS or FSR and looks good enough to me for casual gaming. The point is, times change and while 480p was good enough for most people then, now the vast majority considers 1080p to be good enough. And I maintain it's amazing that you can get 1080p performance out of a laptop IGP already even if it is currently expensive.
 
Only about 30% of the increase in VRAM when you change settings is down to resolution, but a whopping 70% is related to texture settings.
It varies from game to game, and even between scenes within a game, depending on how heavily shaders that require intermediate buffers are used and where the relevant detail settings are set in the options. It also varies depending on which textures get hit hardest by the memory-saving squeeze when lowering texture quality. If you play at 1080p, which you likely would be in many of the newest titles on an 8GB GPU today, chances are that most textures that get shrunk when going down from max textures to the next-highest setting were already higher resolution than typical 1080p rendering needs anyway.
 
>Seems easy to wait out another round.
>"Intel is full steam ahead on future discrete GPU architectures planned for 2024 (Battlemage) and 2026 (Celestial)."

Waiting another year (or 3 years?) sounds trivial when you are young and don't think about things like your mortality, but when you're older, every year is a significant chunk of your remaining lifespan. So, no, not that easy or trivial.

In a year or two I may have matured to the point that I won't be that interested in playing games. In fact, the closer I am to the end (and I'm close to halfway by the usual measures), the less I value any time spent in front of a screen.
 