News Don't Buy a Graphics Card for More than $500 Right Now

I have no recollection of this happening, at least in the US. When I was covering the RTX 20-series launches, the biggest issue was that Nvidia launched Founders Edition cards a month before any other model using the same GPU, and charged a $100 price premium. Coupled with that was a relatively large jump in generational pricing. The RTX 2080 Ti at $1,199 replaced the Titan Xp in price and improved gaming performance, but without the Titan features (and let's be clear, hardly any gamers buy Titan cards). You couldn't buy a 2080 Ti for under $1,200 for most of its life cycle, other than the occasional $999 EVGA RTX 2080 Ti Black that would immediately sell out. It wasn't so much that demand was high; it was that supply was kept low on certain models.

Scalpers, though? I don't remember anything happening there with the 20-series cards, or AMD's later RX 5000-series. Scalpers really only hit the 30-series hard at launch because it was a big jump in generational performance and a drop in pricing. The RTX 3080 was ~25% faster than the 2080 Ti for $699 instead of $1,199, and people were super excited for that. Even the 3090 was attractive to some at $1,499, though its supply wasn't great at launch. Prices were maybe 30-50% marked up by scalpers in September/October of 2020, which felt awful... until the crypto rush hit and prices basically doubled within a week. Scalping also blossomed thanks to so many people being out of work or on forced leave due to Covid.

The only other time I really remember scalpers being an issue prior to 2020 was in late 2017 and early 2018, when we also had a crypto rush. That didn't last more than a few months, however, and by the time the RTX 20-series came out with crypto-influenced pricing, things had very much settled down. Prices actually collapsed not long after the 20-series launched, once demand from crypto completely dried up.

Scalpers definitely transitioned from just shoes to GPUs for the RTX 2000 series. I remember trying to get a 2070 to replace my GTX 1060 at launch and giving up after three months of seeing them only on StockX and eBay. In the end, I picked up a GTX 1080 from the Micro Center a mile down the road from where I live. Ampere and RDNA 2, and the events surrounding them, solidified scalping as a normal thing for newly launched GPUs, but with a caveat: this time around, bots became prevalent, which exacerbated the situation (in conjunction with the mining craze, shortages, etc.), leading to $1,500+ RTX 3080s in November 2020. I very clearly remember thinking to myself that if I'm going to pay that much, I may as well grab a 3090, which I did at Micro Center at MSRP on November 11th.

I agree with you that this generation shouldn't be as bad as last generation; however, I do expect folks to have difficulty acquiring next-gen cards for their first 3-4 months on the market. Too many people made too much money scalping last generation not to try it again with this upcoming generation.
 
The idea that a new series of cards will bring GPU prices down is comical.

If there is one thing the GPU shortage proved, it was that there was a horde of morons ready to drop $2K on a GPU that represented a 10% gain over the next best option at half the price.

The GPU manufacturers saw that and are still adjusting their pricing upward in response. Expect the "consumer-friendly" midrange GPUs (x700 for AMD and x60 for Nvidia) to start at $550 this fall, while entry-level gamers are stuck down in the $250 range and the top-end cards all tip the scales at over $1,500.

Then all the gaming press will run out, breathlessly benchmark these cards against the comparable prior-generation model (ignoring the massive price hike), and then write copious articles about why you must buy these now!!!

It's all so tiresome. The whole 3xxx series was essentially a paper launch; for the most part, no one saw a new one in stock for the whole run, excepting that breathless minority who will be first in line to waste $1,500 on the next great GPU.
 
If there is one thing the GPU shortage proved, it was that there was a horde of morons ready to drop $2K on a GPU that represented a 10% gain over the next best option at half the price.
Most people paying exorbitant premiums were crypto miners expecting to still make a profit no matter how much they had to pay to score GPUs.

Crypto has crashed, the ETH merge is scheduled for four days from now, and that infinite demand has dried up until further notice. Now AMD and Nvidia are stuck with excess GPU stock, and prices are still dropping on most models because people are either waiting out next-gen or current prices fail to deliver enough performance per dollar to generate sales from the remaining potential buyers in the current market.
 
The idea that a new series of cards will bring GPU prices down is comical.

If there is one thing the GPU shortage proved, it was that there was a horde of morons ready to drop $2K on a GPU that represented a 10% gain over the next best option at half the price.

Not $2K over here, but I did see someone buy a GTX 1660 Super for $600 during the craziness of the GPU crisis in my country. I was window shopping at stores... most of the GPUs on display for sale were GT 1030s and GT 730s. There was one (just one left) 1660 Super on display at a store I'm on good terms with... some weeks later, that lone 1660 Super was bought by someone. I guess that person really wanted a more powerful GPU. lol
 
that infinite demand has dried up until further notice. Now AMD and Nvidia are stuck with excess GPU stock, and prices are still dropping on most models
I'll believe this when I can see a current-gen GPU priced under MSRP. Right now all the retailers are claiming their GPUs are on sale when in actuality they're being sold at or slightly above MSRP. That's not "crashing" prices... maybe down from their peak, but obviously there is enough demand for the leftover stock that no one feels it necessary to price them under MSRP yet.
 
Right now all the retailers are claiming their GPUs are on sale when in actuality they're being sold at or slightly above MSRP. That's not "crashing" prices... maybe down from their peak, but obviously there is enough demand for the leftover stock that no one feels it necessary to price them under MSRP yet.
The RX 6400 through RX 6600 are routinely available under MSRP, at least in North America.

For higher-end stuff, AMD and Nvidia raised their back-end prices along the way, AIBs bought GPU kits at inflated prices and passed those off to retail at greatly inflated prices, and now retail prices are hung up on those inflated costs. Nobody wants to be the first to trigger the race to the bottom without some sort of rebate program to soften the blow to their margins.
 
The title should say "Don't Buy a Graphics Card for More than $500 ANYTIME." By the time these NEW series are available at a reasonable price, the next series will be announced as coming SOON!! That means you never stop kicking yourself. These companies are greedy, hungry, and have bottomless pockets; it's all shady tactics to get you to open your wallet wide. This merry-go-round will never end. They know people are stupid and will pay those high prices, just like with smartphones. Why don't we just wait for the Holodeck? LOL.
 
I have an older GPU, but currently it's one of the last things I plan to upgrade. Right now I'm waiting for the release of the newest AMD and Intel chipsets and CPUs. Then, with a setup that has DDR5 and PCIe 5.0 for the M.2 SSD, I should be able to make full use of whichever GPU I plop in over the next few years. In the meantime, I still have a bit of a gaming backlog...
 
It would be nice to see a series of articles that benchmark Intel's card after each driver update to see how they are progressing. We know the drivers were garbage when the card was released, but have they improved any since then?

Edit

I just checked Intel's website and they've released six drivers, with the latest dated 9/2/22. There's definitely some article material here.
 
Yeah, I think Jay is probably right, and a lot of people are going to be disappointed when Nvidia announces high-end GPUs that don't replace the existing cards but are priced around them and slotted in between. The existing cards are going to be around for a while; Jensen said as much.
 
Was going to reply when this was posted but got busy. This is bad advice: if you need a new GPU, then buy a new GPU, do not wait. Nvidia and their AIB partners have already strategized around the current stock situation and will not be bringing prices down on current inventory. The plan is to have both 30- and 40-series available simultaneously, and they will not be flooding the market with "cheap" 40-series GPUs. Instead, we can expect that the only 40-series "available" will be the ultra high end while they let the current 30s drain from the supply channel. Worst case, Nvidia loses some market share to AMD this round, which they seem to have already decided is an acceptable outcome.

Jay's review of Nvidia's earnings call is spot on; this happens every time there is a "new" generation. People always say "wait, don't buy now, product X is right around the corner and is a million times better," and it never actually works out that way. Nvidia and AMD don't build the discrete cards that we buy; they just design the chip and reference boards, then pay TSMC to manufacture the chips that they sell to AIB makers. For years there was no supply of cards, resulting in a massive backlog of demand. With the crypto crash that just happened, there is finally supply available to meet that demand, and they will let that demand drain the current channel before unloading another product into it. Prices aren't racing to the bottom; they are just returning to pre-crypto-insanity levels. It only seems like they are crashing because everyone got used to seeing insanely inflated prices the past few years.
 
All I know is my old 775 and GTX 295 are way overdue for replacement. There is no way I would have paid $2,000-$2,500 before TTL for a new card a few months ago... and I still had to swallow hard to pay $1,200 TTL a week and a half ago. Truth is, though, I had to swallow hard to pay over $500 for the GTX 295 14 years ago. Y'all do what you want. 😉
 
Well, a lot of people out there like to get bigger and higher-resolution monitors (and higher graphics quality settings in game). If you like being imprisoned in the $500 GPU world (as if inflation never happens; it's why we can't buy a Big Mac for $2.99 USD anymore, among other things like raw material cost increases), then by all means enjoy the gaming world they provide for you. But others think differently.

By the way: in gaming at 4K resolution, that 30% performance increase you reference may mean the difference between nailing a solid 60 FPS and not, running full quality settings or not, or successfully upgrading to 4K from 1440p or not, and for a lot of people out there that is important. Or, in the case of an RTX 3080, it will be 100% faster (double the FPS) than your RTX 2060.
A lot of people out there are imprisoned by just keeping up with making ends meet on a budget, and can't afford those high a$$ prices just to play a game. A lot of people don't have the privilege of getting a college degree or a great job. Your comment sounds just like what a rich, well-off person would say and think; they only care about their own needs and wants. The sky's the limit for the well off and privileged. You should come down here to humble yourself, come have some rice and beans, and be happy to at least play at 1080p.
 
  1. I said if you need something right now, then stick with an interim purchase closer to $250.
  2. Toss out the past two years, because it's a glaring exception to the rule of how new GPU launches go. Historically, RTX 2070 basically matched GTX 1080 Ti performance for less money, less power, and more features. RTX 3070 basically matched RTX 2080 Ti performance while using less power, and the MSRP was half as high. GTX 1070 beat the GTX 980 Ti for a significantly lower price. And GTX 970 was more or less tied with GTX 780 Ti performance. You're looking at "history" of the past 24 months and I'm looking at every other GPU launch going back more than a decade.
  3. Buying with the intent to return is not something I condone. It's dishonest at best. Find someone with an upgrade policy if that's what you really want, but you'll pay for that as well.
  4. The only "50% discount" right now is on the RTX 3090 Ti, which was stupidly expensive at launch and even at half off is still way more than most people should spend on a GPU. All the reports of excess inventory mean such discounts are happening because Nvidia and its partners already know what's coming down the pipeline. RTX 3090 Ti is most likely being sold for as little as $1,079 at retail because Nvidia intends to launch a faster card at a lower price in the near future. I would be shocked if RTX 4080 debuts at more than a $999 MSRP, and possibly it will be less than that, and I still wager performance will exceed the RTX 3090 Ti.
  5. Bots will be around for scalpers to try, but probably 95% of the people who paid scalper prices in 2021 did so because of cryptocurrency mining. Maybe they gamed on the side as well, but they were certainly hoping to offset the initial cost. When Ethereum undergoes The Merge (slated for next week), GPU mining profitability will drop by 50%. Who's going to buy a $1,000 GPU to make $1 per day? Because if that sounds enticing, why aren't all the miners already buying up all the RTX 3090 Ti cards that currently make $2 per day? (See the quick payback sketch below.)
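A back-of-the-envelope sketch of that payback math. Every number here is an assumption chosen purely for illustration (card price, post-Merge mining revenue, power draw, electricity rate), not real mining data:

Python:
# Hypothetical payback estimate for the "$1,000 GPU earning $1/day" scenario above.
# Every input is an assumption for illustration, not a measured mining figure.
gpu_cost = 1000.00       # USD, assumed purchase price
gross_per_day = 1.00     # USD/day of mining revenue after The Merge (assumed)
power_draw_kw = 0.30     # kW, assumed average board power while mining
electricity_rate = 0.12  # USD per kWh, assumed residential rate

power_cost_per_day = power_draw_kw * 24 * electricity_rate  # ~$0.86/day
net_per_day = gross_per_day - power_cost_per_day

if net_per_day <= 0:
    print("The card never pays for itself at these rates.")
else:
    days = gpu_cost / net_per_day
    print(f"Payback takes roughly {days:,.0f} days ({days / 365:.1f} years).")

At a typical residential power rate the net is pennies per day, so the payback stretches into decades, which is the point being made.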
Latest rumors are now stating two different 4080 GPUs: one with 12GB of VRAM at 280 watts, and one with 16GB of VRAM at 320 watts.
Leaks for the last few months have indicated a 320W 16GB 4080 about 30% faster than the 3090 Ti.
Now, how much do you want to bet that the 12GB vanilla 4080 will be roughly equal to, or at most 5-10% faster than, the 3090 Ti?

Leaks have also speculated that the 4080 would launch at $849 USD, which will surely be the slower 12GB model.
I bet the faster 16GB model will launch in at least the $1,200 range.

Now on to the 3090 Ti, available now for $1,099 with 24GB of VRAM.
My bet is that these cards are coming in and out of stock because the manufacturers are now only producing them in batches they feel they can sell through fairly quickly.
Could the price on them possibly drop more? Maybe another $100, but not any lower, and production of these cards will cease entirely by the launch of the 4080 IF the manufacturers do not exhaust their stock of 30-series chips that bin high enough to build these cards beforehand.

Why do I say that? Because below a certain price point it makes no business sense to build a card with 24GB of expensive GDDR6X VRAM. Money-wise, if the price is dropping, it would make more sense to build 12GB 3080 Ti models, which carry a much lower production cost; there is less of a loss in dropping the chip to a lower-binned product than in basically giving away 12GB of VRAM.

As for the 3080 cards, if the vanilla 4080 is $849-$899 and about as fast as a 3090, then there will be no reason to drop the 3080 cards below about $650 or so.

My prediction: unless you are okay with the 4080 card that has only 12GB of VRAM at around 3090 Ti speeds, or you do not mind spending about $1,200 for the faster 4080, then if a middle-of-the-road option at the 3090 Ti performance level with plenty of VRAM (so capacity will never be an issue) sounds right to you, I would not wait too long if I wanted a NEW 3090 Ti card.

I really think the price is going to hold fairly steady, and this card will either be discontinued before the 4080 launch or they will run out of chips capable of building this tier of card, with the last units seeing a price increase before they are gone for good.

This is just my opinion, based on sensible business tactics for a changing market and a new product release.
 
All I know is my old 775 and GTX 295 are way overdue for replacement. There is no way I would have paid $2,000-$2,500 before TTL for a new card a few months ago... and I still had to swallow hard to pay $1,200 TTL a week and a half ago. Truth is, though, I had to swallow hard to pay over $500 for the GTX 295 14 years ago. Y'all do what you want. 😉

The GTX 295 was a strong card during the era you purchased it, and it probably gamed just fine for at least 4 or 5 years afterwards. I think you already got your money's worth from that GPU.

A GPU will slowly become outdated as time goes on. That's just the way it is. This new RTX 3050 I purchased for $320, I expect it to handle 1080p gaming for at least 3 years; I would already be satisfied if it serves that long.
 
The GTX 295 was a strong card during the era you purchased it, and it probably gamed just fine for at least 4 or 5 years afterwards. I think you already got your money's worth from that GPU.

A GPU will slowly become outdated as time goes on. That's just the way it is. This new RTX 3050 I purchased for $320, I expect it to handle 1080p gaming for at least 3 years; I would already be satisfied if it serves that long.


But time does not stand still in the computer world. This is how my old box rates now: https://www.userbenchmark.com/UserRun/54467519

Excited to put my new rig together this weekend or early next week when all the parts are in. Already called and have my IT buddy on standby to put the icing on this cake: https://pcpartpicker.com/list/zCn2Bj
 
@cAllen

Yeah, time does not stand still in the computer world. If it stood still, we wouldn't be getting better games with better graphics. lol
As I said in my previous post, a GPU will slowly become outdated as time goes on. You already played a lot of games with that GTX 295 during its prime years, so your 500 dollars was worth it. Having low ratings is normal... your old rig is an LGA 775, after all; these things are at retirement age already. My LGA 775 is still in service, though not for gaming; instead it's another backup storage box for data. It still has a use, even if it's old.

Good luck putting together your new PC. I hope it lasts you another 10 years. :)
 
Latest rumors are now stating two different 4080 GPUs: one with 12GB of VRAM at 280 watts, and one with 16GB of VRAM at 320 watts.
Leaks for the last few months have indicated a 320W 16GB 4080 about 30% faster than the 3090 Ti.
Now, how much do you want to bet that the 12GB vanilla 4080 will be roughly equal to, or at most 5-10% faster than, the 3090 Ti?
I find it simply ludicrous that Nvidia would introduce a 4080 with 16GB and a 256-bit interface and more CUDA cores, and then have a different PCB, and probably a different chip as well, with 12GB and a 192-bit interface. The latest rumors are now saying exactly what I said when someone first floated this "leak": the 12GB card will be the RTX 4070, the 16GB is the RTX 4080. Of course, the 12GB card and GPU will probably also end up as the top mobile solution and get called the RTX 4080 for laptops (but with desktop RTX 4060 performance due to power constraints).
 
I find it simply ludicrous that Nvidia would introduce a 4080 with 16GB and a 256-bit interface and more CUDA cores, and then have a different PCB, and probably a different chip as well, with 12GB and a 192-bit interface. The latest rumors are now saying exactly what I said when someone first floated this "leak": the 12GB card will be the RTX 4070, the 16GB is the RTX 4080. Of course, the 12GB card and GPU will probably also end up as the top mobile solution and get called the RTX 4080 for laptops (but with desktop RTX 4060 performance due to power constraints).
But there were also differences between the 3080 10GB and 12GB models in CUDA cores, tensor cores, RT cores, memory bus, and bandwidth.
We don't know how well yields are doing on chips that bin high enough for the highest-tier cards; if yields are problematic, that could very well be forcing a more down-to-earth 4080 model.
Also, with production costs rumored to be high on these chips, along with the cost of 16GB of GDDR6X VRAM, costs have to be cut somewhere to produce a card with a satisfactory profit margin that still slots in at a $950-or-less price point. This may be the configuration that meets at least 3090 Ti specs, gets classed as a 4080, and still maintains the profit margins at the lower price point.
It will be interesting to see what actually gets produced.
I really do think, though, that Nvidia wants a 4080 model priced under $1,000, and I just question whether the full-tilt 16GB card can hit that price point while still making the targeted profit margins.
 
But there were also differences between the 3080 10GB and 12GB models in CUDA cores, tensor cores, RT cores, memory bus, and bandwidth.
We don't know how well yields are doing on chips that bin high enough for the highest-tier cards; if yields are problematic, that could very well be forcing a more down-to-earth 4080 model.
Also, with production costs rumored to be high on these chips, along with the cost of 16GB of GDDR6X VRAM, costs have to be cut somewhere to produce a card with a satisfactory profit margin that still slots in at a $950-or-less price point. This may be the configuration that meets at least 3090 Ti specs, gets classed as a 4080, and still maintains the profit margins at the lower price point.
It will be interesting to see what actually gets produced.
I really do think, though, that Nvidia wants a 4080 model priced under $1,000, and I just question whether the full-tilt 16GB card can hit that price point while still making the targeted profit margins.
The thing is, other model numbers make far more sense than simply having "RTX 4080 16GB" and "RTX 4080 12GB" — because a 25% reduction in memory interface width and a different PCB along with changes in core clocks and memory speeds definitely does not constitute a minor change. RTX 3080 10GB for reference has just a 17% reduction compared to RTX 3080 12GB, plus it was a late arrival mostly targeted at cryptocurrency miners AFAICT. (Yeah, I think the Nvidia mining lock was broken for big miners a long time ago, possibly via some form of driver hack.)

Nvidia has done GTX 1060 6GB and GTX 1060 3GB, but those had the same memory interface, just a change in capacity, and the 3GB card had 10% fewer GPU cores. By comparison, we're looking at rumors of 4080 16GB at 23Gbps with 9728 GPU cores, versus RTX 4080 12GB at 21Gbps with 7680 GPU cores. That's 25% less memory, 32% less memory bandwidth, and 21% fewer GPU cores — all just to save 55W, according to the latest Kopite7Kimi post. Either Kopite is off his rocker, or Nvidia is going to get ripped apart by reviewers for confusing model names.

Why not just call everything by the GPU, cores, clocks, RAM, and bandwidth instead? I mean, I could get behind that:
GeForce RTX AD102 9728 2.5GHz 16GB 736GBps vs. GeForce RTX AD103 7680 2.6GHz 12GB 504GBps 🤣
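For what it's worth, those bandwidth figures and percentage gaps fall straight out of the rumored numbers. A quick sketch, using only the rumored (and unconfirmed) specs quoted above:

Python:
# Peak memory bandwidth and relative cuts, computed from the rumored RTX 4080 specs above.
# All inputs are rumored/unconfirmed figures, used purely for illustration.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # GB/s = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

big = {"cores": 9728, "rate": 23, "bus": 256, "vram": 16}    # rumored "RTX 4080 16GB"
small = {"cores": 7680, "rate": 21, "bus": 192, "vram": 12}  # rumored "RTX 4080 12GB"

bw_big = bandwidth_gb_s(big["rate"], big["bus"])        # 736 GB/s
bw_small = bandwidth_gb_s(small["rate"], small["bus"])  # 504 GB/s

print(f"16GB card: {bw_big:.0f} GB/s, 12GB card: {bw_small:.0f} GB/s")
print(f"Memory: {100 * (1 - small['vram'] / big['vram']):.0f}% less")       # 25%
print(f"Bandwidth: {100 * (1 - bw_small / bw_big):.0f}% less")              # 32%
print(f"GPU cores: {100 * (1 - small['cores'] / big['cores']):.0f}% less")  # 21%

Which lines up with the 25% / 32% / 21% gaps cited above, and it's why sharing the "RTX 4080" name looks so questionable.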
 
The thing is, other model numbers make far more sense than simply having "RTX 4080 16GB" and "RTX 4080 12GB" — because a 25% reduction in memory interface width and a different PCB along with changes in core clocks and memory speeds definitely does not constitute a minor change. RTX 3080 10GB for reference has just a 17% reduction compared to RTX 3080 12GB, plus it was a late arrival mostly targeted at cryptocurrency miners AFAICT. (Yeah, I think the Nvidia mining lock was broken for big miners a long time ago, possibly via some form of driver hack.)

Nvidia has done GTX 1060 6GB and GTX 1060 3GB, but those had the same memory interface, just a change in capacity, and the 3GB card had 10% fewer GPU cores. By comparison, we're looking at rumors of 4080 16GB at 23Gbps with 9728 GPU cores, versus RTX 4080 12GB at 21Gbps with 7680 GPU cores. That's 25% less memory, 32% less memory bandwidth, and 21% fewer GPU cores — all just to save 55W, according to the latest Kopite7Kimi post. Either Kopite is off his rocker, or Nvidia is going to get ripped apart by reviewers for confusing model names.

Why not just call everything by the GPU, cores, clocks, RAM, and bandwidth instead? I mean, I could get behind that:
GeForce RTX AD102 9728 2.5GHz 16GB 736GBps vs. GeForce RTX AD103 7680 2.6GHz 12GB 504GBps 🤣
I have little doubt that this was originally slated to be a 4070-class card, but the problem is it performed too well to be sold at a 4070 price point without affecting the perceived price/performance value of the 4080 and hurting sales numbers for the 80-series card.

So, easy solution: call the card a 12GB 4080 and charge a higher price for it, which also allows the opportunity to raise the 16GB 4080 to a higher price point than originally anticipated.
Then come out with a card that is at roughly the performance level of the 3080 Ti, call that card the 4070, and slot it in at the 70-tier price point.

Across the board, this raises the profit margins on lower-binned chips beyond what was originally planned, which in Nvidia's eyes is a win given the expected lower sales volume.

EDIT*** Biggest thing: this could mark the beginning of a card's name or class being determined strictly by its performance level within the product stack, having nothing to do with the components or specs used to build said card.
 