• Happy holidays, folks! Thanks to each and every one of you for being part of the Tom's Hardware community!

News: Best Buy Heavily Discounts Nvidia RTX 30 Graphics Card Stocks

I hope this becomes a trend. GPU prices are way too high.

Hoping to see big discounts on 30-series laptops soon too. I've been watching a 2-in-1 device at Best Buy that's 25% more than I'm willing to spend. With the paltry specs of the 40-series in the 2023 model, waiting and paying a higher price no longer makes sense.
 
The 40-series launched at horrific price points, so it makes sense that offering great deals on 30-series stock would make it vanish from inventory.

Might be a sign of Nvidia testing the market to see whether pricing really is the reason for slow sales.
It's worth noting that nowhere else I checked had prices anywhere remotely close to Best Buy's. So, I'd say you shouldn't over-generalize this event. I don't know what possessed them to do such heavy discounts, but it sure seems like they didn't need to drop the prices anywhere near as much to clear that inventory.

I checked after seeing this posted on another site, before this article even showed up on Toms, and the inventory was already cleared out.

Best time to get a 30 series card.
You'd think so, but these deals might've been just a promotional stunt. Not to say 30-series prices aren't down, but nothing like these deals.
 
I wouldn't buy a 30-series card, with their low VRAM buffers. Hogwarts Legacy is just the beginning of games not having enough VRAM on those cards.
Not that I personally care (not a gamer), but it might help others if you'd offer your advice for how much memory you think they should aim for.

One of the models sold at a big discount was the RTX 3080, which has 10 GB. At $420, it was a steal. The RTX 4070 Ti fares only a little better, at 12 GB. Is 10 GB bad, but 12 GB good?
 
Suspecting that Nvidia may be testing the market by telling Best Buy (its only official retail partner for FE cards) to do this sale.
Could be, but I think that's a much more extreme price cut than what Nvidia would actually do, in practice. Especially if you look around at current 3000-series inventory levels, it's not obvious to me there's a problem necessitating such a drastic remedy.

If Nvidia did basically eat their margins on 3000-series, it could buy them a decent amount of good will among the gaming community. The growing AI boom could give them the confidence and wherewithal to do such a move, but it does seem a little out-of-character.

BTW, their quarterly financials are due, any day now.
 
While we're spitballing... what if it was an anti-competitive move by Best Buy, to temporarily stifle RTX 3000 purchases from their competitors, as consumers all hold their breath to see if anyone else will make a similar move?

Maybe Best Buy already had such low inventory on those cards that the total hit to their bottom line was just a rounding error?
 
I don't know what possessed them to do such heavy discounts, but it sure seems like they didn't need to drop the prices anywhere near as much to clear that inventory.

From what I remember back when I worked Geek Squad, Best Buy has end-of-life agreements with some of its vendors and manufacturers. Once a SKU reaches its EOL date, a store pays for every day the old stock still exists. I imagine Best Buy's warehouses work similarly. This can become costly very quickly if enough old stock remains, so a heavy discount to clear it out may be a loss, but less of a loss than holding onto the cards and racking up penalties from Nvidia. Manufacturers like Nvidia do not want old stock sitting next to new, and this ensures it doesn't happen. Board partners such as PNY, EVGA, MSI, etc. didn't seem to have an EOL cutoff, or at least they seemed to have a much longer timeline, as their cards would sit around much longer in the past.
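The trade-off described above can be made concrete with some made-up numbers. To be clear, the card count, unit cost, and per-day penalty below are pure assumptions for illustration, not real Best Buy or Nvidia figures:

```python
# Hypothetical numbers, purely to illustrate the EOL trade-off:
# a deep discount is a one-time loss, while holding penalties accrue daily.

def discount_loss(units: int, unit_cost: float, sale_price: float) -> float:
    """One-time loss from clearing stock below cost."""
    return units * (unit_cost - sale_price)

def holding_loss(units: int, daily_penalty: float, days: int) -> float:
    """Cumulative penalty for keeping EOL stock on the shelf."""
    return units * daily_penalty * days

# Say 200 cards bought at $600, cleared at $420,
# versus a $2-per-card-per-day penalty for 90 days:
clear_now = discount_loss(200, 600.0, 420.0)  # $36,000 one-time hit
hold_90d = holding_loss(200, 2.0, 90)         # $36,000 in penalties alone
```

Under those assumed numbers, a fire sale breaks even with just three months of penalties, and the penalties keep growing while the discount loss doesn't.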
 
Man I wish I had known, because I would have happily taken one off their hands. I'll definitely keep an eye out for one in the future.
Everyone is probably now on high alert. You'd probably need a real-time notification of some sort if you have any hope of catching another one of these deals. They'll sell out faster than Taylor Swift tickets.
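The alert logic for a watcher like that is trivial; the hard part is the polling and delivery. A minimal sketch of just the alert condition (the prices and $450 threshold are made up; a real version would poll a retailer page or API on a timer and then send the notification):

```python
def should_alert(current_price: float, target_price: float) -> bool:
    """Fire an alert once the listed price drops to or below the target."""
    return current_price <= target_price

def watch(observed_prices, target_price):
    """Filter a stream of price observations down to the ones worth an alert."""
    return [p for p in observed_prices if should_alert(p, target_price)]

# e.g. watching for a sub-$450 RTX 3080 across three price checks:
hits = watch([699.99, 649.99, 419.99], 450.00)  # -> [419.99]
```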
 
Not that I personally care (not a gamer), but it might help others if you'd offer your advice for how much memory you think they should aim for.

One of the models sold at a big discount was the RTX 3080, which has 10 GB. At $420, it was a steal. The RTX 4070 Ti fares only a little better, at 12 GB. Is 10 GB bad, but 12 GB good?

At least according to this review, 10GB is barely enough for 1080p in Hogwarts Legacy.
 
Everyone is probably now on high alert. You'd probably need a real-time notification of some sort if you have any hope of catching another one of these deals. They'll sell out faster than Taylor Swift tickets.

Yeah, I know, right? I just bought a 3060 Ti for one system, and I'll probably be buying something a bit higher-end for my SFF system (mainly because I want to get the most from my new TV). A 3080 at that price would be an absolute steal.
 

At least according to this review, 10GB is barely enough for 1080p in Hogwarts Legacy.
Ahh, that's the problem: turn off ray tracing and apply some fixes. I have no issues beyond the ones everyone experiences at 1440p.
 
Ahh, that's the problem: turn off ray tracing and apply some fixes. I have no issues beyond the ones everyone experiences at 1440p.
I kinda miss gaming in the 1990s, when they couldn't push out updates over Steam, so games were just optimized & tested BEFORE release. Now about half of games don't even work at launch.

EDIT: That's probably what was so great about Windows XP. It worked when it was shipped out initially. No updates every couple weeks that keep causing problems.
 
I kinda miss gaming in the 1990s, when they couldn't push out updates over Steam, so games were just optimized & tested BEFORE release. Now about half of games don't even work at launch.

EDIT: That's probably what was so great about Windows XP. It worked when it was shipped out initially. No updates every couple weeks that keep causing problems.
Remember when "DLC" started becoming the norm? That was the beginning of the end, in my opinion. Devs can't even finish the game before producers release it; they gotta sell you the other half of the game in segments over 2 years after the initial release...
 
I kinda miss gaming in the 1990s, when they couldn't push out updates over Steam, so games were just optimized & tested BEFORE release. Now about half of games don't even work at launch.
But when games had bugs, you had to wait months and then pray for an update disk to be released.

I remember going through that with at least two games (and probably others): Ultima 7 and Darklands. And Ultima 7 already launched like a year late.

EDIT: That's probably what was so great about Windows XP. It worked when it was shipped out initially. No updates every couple weeks that keep causing problems.
LOL, seriously? I remember downloading & installing Windows patches way back in the late 90's. It just wasn't as streamlined as it is now.

I think the Windows Update service didn't come along until later in the life of Windows XP, but you could point Internet Explorer at Microsoft's Windows Update website and it'd run some ActiveX plugin that scanned your PC and told you which updates you needed.
 
But when games had bugs, you had to wait months and then pray for an update disk to be released.

I remember going through that with at least two games (and probably others): Ultima 7 and Darklands. And Ultima 7 already launched like a year late.


LOL, seriously? I remember downloading & installing Windows patches way back in the late 90's. It just wasn't as streamlined as it is now.

I think the Windows Update service didn't come along until later in the life of Windows XP, but you could point Internet Explorer at Microsoft's Windows Update website and it'd run some ActiveX plugin that scanned your PC and told you which updates you needed.
You're right that some games had bugs. But intentionally releasing things with bugs and not beta-testing is worse.

Yes, I updated Windows 98 online in the late '90s, before XP. But when they released an update, it didn't CAUSE blue screens. Now Microsoft introduces problems into things that were already working. Plus, nobody can tell you how to fix things on Windows 10/11, because they move around and replace all the menus/settings every couple of months, so the how-to guides are always a couple of months out of date.

I just don't want to be sold beta software, which is the norm today. And I don't want pushed updates that break what's already working.
 