This article is very reminiscent of 2018 "Just Buy It" LMAO!!! Just Buy It: Why Nvidia RTX GPUs Are Worth the Money | Tom's Hardware (tomshardware.com)
I have no recollection of this happening, at least in the US. When I was covering the RTX 20-series launches, the biggest issue was that Nvidia launched Founders Edition cards a month before any other model using the same GPU, and charged a $100 price premium. Coupled with that was a relatively large jump in generational pricing. The RTX 2080 Ti at $1,199 replaced the Titan Xp in price and improved performance in games, but without the Titan features — and let's be clear, hardly any gamers buy Titan cards. You couldn't buy a 2080 Ti for under $1,200 for most of its life cycle, other than the occasional $999 EVGA RTX 2080 Ti Black that would immediately sell out. It wasn't so much that demand was high, it was that supply was kept low on certain models.
Scalpers, though? I don't remember anything happening there with the 20-series cards, or AMD's later RX 5000-series. Scalpers really only hit the 30-series hard at launch because it was a big jump in generational performance and a drop in pricing. The RTX 3080 was ~25% faster than the 2080 Ti for $699 instead of $1,199. People were super excited for that. Even the 3090 was attractive to some at $1,499, though supply of the 3090 wasn't great at launch. Prices were maybe 30-50% marked up by scalpers in September/October of 2020, which felt awful... until the crypto rush hit and prices basically doubled within a week. Also, scalpers blossomed thanks to so many people being out of work or on a forced leave of absence due to Covid.
The only other time I really remember scalpers being an issue prior to 2020 was in late 2017 and early 2018, when we also had a crypto rush. That didn't last more than a few months, however, and by the time the RTX 20-series came out with crypto-influenced pricing, things had very much settled down. Crypto prices actually collapsed not long after the 20-series launched, which caused demand from miners to completely dry up.
Most people paying exorbitant premiums were crypto miners expecting to still make a profit no matter how much they had to pay to score GPUs.
The idea that a new series of cards will bring GPU prices down is comical. If there is one thing the GPU shortage proved, it was that there was a horde of morons ready to drop $2K on a GPU that represented a 10% gain over the next-best option at half the price.
"Now that infinite demand has dried up until further notice, AMD and Nvidia are stuck with excess GPU stock and prices are still dropping on most models."
I'll believe this when I can see a current-gen GPU priced under MSRP. Right now all the retailers are claiming their GPUs are on sale, when in actuality they're being sold at or slightly above MSRP. That's not "crashing" prices... maybe down from their peak, but obviously there is enough demand for the leftover stock that no one feels it necessary to price them under MSRP yet.
The RX 6400-6600 are routinely available under MSRP, at least in North America.
A lot of people out there are imprisoned in just keeping up with making ends meet and on budgets, and can't afford those high a$$ prices just to play a game. A lot of people don't have the privilege of getting a college degree or a great job. Your comment sounds just like what a rich, well-off person would say and think; they only care about their own needs and wants. The sky's the limit for the well-off and privileged. You should come down here to humble yourself, come have some rice and beans, and be happy to at least play at 1080p.
"Well, a lot of people out there like to get bigger and higher-resolution monitors (and higher graphics quality settings in game). If you like being imprisoned in the $500 GPU world (as if inflation never happens, like why we can't buy a Big Mac for $2.99 USD anymore, among other things like raw materials cost increases), then by all means enjoy the gaming world they provide for you. But others think differently."
By the way: in gaming at 4K resolution, that 30% performance increase you reference may mean the difference between nailing a solid 60 FPS and not, or running full quality settings or not, or successfully upgrading to 4K from 1440p or not, and for a lot of people out there that is important. Or, in the case of an RTX 3080, it will be 100% faster (double the FPS) than your RTX 2060.
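To put rough numbers on that, here's a minimal sketch of the uplift math; the baseline frame rates are just figures I'm assuming for illustration, not benchmark results:

```python
# Quick illustration of what percentage uplifts mean in frames per second.
def scaled_fps(base_fps: float, uplift_percent: float) -> float:
    """Frame rate after applying a given percentage performance uplift."""
    return base_fps * (1 + uplift_percent / 100)

# A card stuck around ~46 fps at 4K gets pushed to a locked 60 fps by a ~30% uplift.
print(round(scaled_fps(46, 30)))    # 60
# A 100% uplift (e.g., the RTX 3080 vs. RTX 2060 claim above) simply doubles the frame rate.
print(round(scaled_fps(40, 100)))   # 80
```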
The latest rumors are now stating two different 4080 GPUs: one with 12GB of VRAM at 280 watts and one with 16GB of VRAM at 320 watts.
- I said if you need something right now, then stick with an interim purchase closer to $250.
- Toss out the past two years, because it's a glaring exception to the rule of how new GPU launches go. Historically, RTX 2070 basically matched GTX 1080 Ti performance for less money, less power, and more features. RTX 3070 basically matched RTX 2080 Ti performance while using less power, and the MSRP was half as high. GTX 1070 beat the GTX 980 Ti for a significantly lower price. And GTX 970 was more or less tied with GTX 780 Ti performance. You're looking at "history" of the past 24 months and I'm looking at every other GPU launch going back more than a decade.
- Buying with the intent to return is not something I condone. It's dishonest at best. Find someone with an upgrade policy if that's what you really want, but you'll pay for that as well.
- The only "50% discount" right now is on the RTX 3090 Ti, which was stupidly expensive at launch and even at half off is still way more than most people should spend on a GPU. All the reports of excess inventory mean such discounts are happening because Nvidia and its partners already know what's coming down the pipeline. RTX 3090 Ti is most likely being sold for as little as $1,079 at retail because Nvidia intends to launch a faster card at a lower price in the near future. I would be shocked if RTX 4080 debuts at more than a $999 MSRP, and possibly it will be less than that, and I still wager performance will exceed the RTX 3090 Ti.
- Bots will be around for scalpers to try, but probably 95% of people that paid scalper prices in 2021 did so because of cryptocurrency mining. Maybe they gamed on the side as well, but they were certainly hoping to offset the initial cost. When Ethereum undergoes The Merge (slated for next week), GPU mining profitability will drop by 50%. Who's going to buy a $1,000 GPU to make $1 per day? Because if that sounds enticing, why aren't all the miners already buying up all the RTX 3090 Ti cards that currently make $2 per day?
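For what it's worth, the payback math behind that question is trivial to sketch; the prices and per-day profits below are just the ballpark figures from this post, and the sketch ignores electricity costs and any further profitability drops:

```python
# Rough mining-payback sketch using the ballpark numbers from the post above.
# Ignores electricity, pool fees, and the likelihood that profitability keeps falling.
def payback_days(gpu_price: float, profit_per_day: float) -> float:
    """Days of mining needed just to recoup the GPU purchase price."""
    return gpu_price / profit_per_day

# A $1,000 GPU earning $1/day post-Merge: ~1,000 days (about 2.7 years) to break even.
print(payback_days(1000, 1.0))   # 1000.0
# An RTX 3090 Ti at ~$1,079 earning $2/day today: still ~540 days (about 1.5 years).
print(payback_days(1079, 2.0))   # 539.5
```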
All I know is my old 775 and GTX 295 are way overdue for replacement. There is no way I would have paid $2,000-$2,500 before TTL for a new card a few months ago... and I still had to swallow hard to pay $1,200 TTL a week and a half ago. Truth is, though, I had to swallow hard to pay over $500 for the GTX 295 14 years ago. Y'all do what you want. 😉
The GTX 295 was a strong card during the era when you purchased it, and it probably gamed just fine for at least 4 or 5 years afterwards. I think you already got your money's worth out of that GPU.
A GPU will slowly become outdated as time goes on; that's just the way it is. This new RTX 3050 I purchased for $320, I expect it to handle 1080p gaming for at least 3 years, and I'd already be satisfied if it serves that long.
I find it simply ludicrous that Nvidia would introduce a 4080 with 16GB and a 256-bit interface with more CUDA cores, and then have a different PCB, probably a different chip as well, with 12GB and a 192-bit interface. The latest rumors are now saying exactly what I said when someone first floated this "leak": the 12GB card will be the RTX 4070, the 16GB is the RTX 4080. Of course, the 12GB card and GPU will probably also end up as the top mobile solution and get called the RTX 4080 for laptops (but with desktop RTX 4060 performance due to power constraints).
Leaks for the last few months have indicated a 320W, 16GB 4080 at about 30% faster than the 3090 Ti.
Now, how much do you want to bet that the 12GB vanilla 4080 will be anywhere from roughly equal to, at most, 5-10% faster than 3090 Ti speeds?
But there were also differences between the 3080 10GB and 12GB models in the CUDA cores, tensor cores, RT cores, memory bus, and bandwidth.
The thing is, other model numbers make far more sense than simply having "RTX 4080 16GB" and "RTX 4080 12GB" — because a 25% reduction in memory interface width and a different PCB, along with changes in core clocks and memory speeds, definitely does not constitute a minor change. RTX 3080 10GB for reference has just a 17% reduction compared to RTX 3080 12GB, plus the 12GB was a late arrival mostly targeted at cryptocurrency miners AFAICT. (Yeah, I think the Nvidia mining lock was broken for big miners a long time ago, possibly via some form of driver hack.)
We don't know how well the yields are doing, i.e., what percentage of chips is capable of being used for the highest-tier cards; if yields are problematic, that could very well be forcing a more down-to-earth 4080 model.
And with production costs rumored to be high on these chips, along with the cost of 16GB of GDDR6X VRAM, it may still be possible to produce a card that has a satisfactory profit margin and slots in at a $950-or-less price point. Production costs have to be cut somewhere, and this may be the configuration that meets at least 3090 Ti specs, gets classed as a 4080, and still maintains profit margins at the lower price point.
Will be interesting to see what actually is produced.
I really do think, though, that Nvidia wants a 4080 model priced under $1,000, and I just question whether the full-hilt 16GB card can hit that price point while still making the targeted profit margins.
I have little doubt that this was originally slated to be a 4070-class card, but the problem is it performed too well to be sold at a 4070 price point without affecting the perceived price/performance value of the 4080 and hurting the sales numbers for the 80-series card.
Nvidia has done GTX 1060 6GB and GTX 1060 3GB, but those had the same memory interface, just a change in capacity, and the 3GB card had 10% fewer GPU cores. By comparison, we're looking at rumors of 4080 16GB at 23Gbps with 9728 GPU cores, versus RTX 4080 12GB at 21Gbps with 7680 GPU cores. That's 25% less memory, 32% less memory bandwidth, and 21% fewer GPU cores — all just to save 55W, according to the latest Kopite7Kimi post. Either Kopite is off his rocker, or Nvidia is going to get ripped apart by reviewers for confusing model names.
Why not just call everything by the GPU, cores, clocks, RAM, and bandwidth instead? I mean, I could get behind that:
GeForce RTX AD102 9728 2.5GHz 16GB 736GBps vs. GeForce RTX AD103 7680 2.6GHz 12GB 504GBps 🤣
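For anyone wondering where those bandwidth figures and percentage gaps come from, here's a quick sketch using the rumored specs quoted above (the 4080 numbers are straight from the leak discussion in this thread; the 3080 bus widths are the known 320-bit and 384-bit configurations). Memory bandwidth is just per-pin speed in Gbps times bus width in bits, divided by 8:

```python
# Derive the memory bandwidth and percentage deltas quoted above from the rumored specs.
def bandwidth_gb_s(speed_gbps: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s = per-pin speed (Gbps) * bus width (bits) / 8."""
    return speed_gbps * bus_width_bits / 8

# Rumored RTX 4080 16GB: 9728 cores, 23Gbps memory on a 256-bit bus.
bw_16gb = bandwidth_gb_s(23, 256)   # 736 GB/s
# Rumored RTX 4080 12GB: 7680 cores, 21Gbps memory on a 192-bit bus.
bw_12gb = bandwidth_gb_s(21, 192)   # 504 GB/s

print(f"{bw_16gb:.0f} vs {bw_12gb:.0f} GB/s")          # 736 vs 504
print(f"memory:    {1 - 12 / 16:.0%} less")             # 25% less
print(f"bandwidth: {1 - bw_12gb / bw_16gb:.0%} less")   # 32% less
print(f"cores:     {1 - 7680 / 9728:.0%} fewer")        # 21% fewer
# For comparison, RTX 3080 10GB (320-bit) vs. 12GB (384-bit) is only a ~17% narrower bus.
print(f"3080 bus:  {1 - 320 / 384:.0%} narrower")       # 17%
```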