Right Now Is a Terrible Time to Buy a Graphics Card

Stephen_144

Reputable
Feb 22, 2017
I'm in the market and I'm doing my best to hold off.

How long do you think it will be until we can get a desktop with:

Intel 12th-gen K-series 10nm CPU (Alder Lake)
Motherboard with full PCIe 4.0 support
CPU and motherboard with DDR5 support
RTX 3080 Super (PCIe 4.0)
PCIe 4.0 SSD


I'm guessing this time next year? What do you guys think?
 

JarredWaltonGPU

Senior GPU Editor
Editor
I'm in the market and I'm doing my best to hold off.

How long do you think it will be until we can get a desktop with:

Intel 12th-gen K-series 10nm CPU (Alder Lake)
Motherboard with full PCIe 4.0 support
CPU and motherboard with DDR5 support
RTX 3080 Super (PCIe 4.0)
PCIe 4.0 SSD

I'm guessing this time next year? What do you guys think?
Alder Lake: Probably next year -- might be mobile-first as well, which could mean a longer wait for desktop.
MB w/ PCIe 4.0: You can get X570/B550 right now, or wait for Rocket Lake from Intel in late 2020 / early 2021.
DDR5 will probably be 2021 for Intel (Alder Lake) and 2022 for AMD (Zen 4).
RTX 3080 in September; not sure when 'Super' cards will arrive.
PCIe Gen4 SSDs are plentiful but expensive; prices should drop some by the time the other parts arrive. I'd just get a good 1TB or 2TB TLC M.2 drive for half the price.

The more pertinent question would be: What's your current PC and do you actually need to upgrade everything?
 

JarredWaltonGPU

Senior GPU Editor
Editor
Meanwhile, I'll be waiting for the RTX 3050.
I'm not convinced we'll see RTX 3050 -- maybe, but also maybe not. It will really depend on how much more performance the new and improved RT cores deliver. If they're four times as fast as Gen1 RT cores, sure (assuming they're not too large). Even if they're only twice as fast, maybe. But I suspect an RTX 3050 with Gen2 RT cores that are only twice as fast as the Gen1 RT cores will end up being a lot like an RTX 2060.

I figure a good estimate would be 20 SMs (1280 CUDA cores), with 20 RT cores. That's still potentially as much RT power as RTX 2070 (give or take), but with much lower general gaming performance. And it will probably cost at least $200, if not $229 -- that will be the RTX 3050 Ti. Then an RTX 3050 with 16 SMs could sell for $199.
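
To make that back-of-envelope math explicit, here's a minimal Python sketch. The 64-FP32-cores-per-SM figure matches Turing, and the 2x Gen2 RT speedup is the speculative middle case from the posts above -- none of these are confirmed Ampere specs.

```python
# Back-of-envelope math behind the hypothetical RTX 3050 Ti estimate above.
# Assumed (not confirmed) specs: 64 FP32 CUDA cores per SM, as on Turing,
# and Gen2 RT cores running ~2x as fast as Gen1 RT cores.
CORES_PER_SM = 64
GEN2_RT_SPEEDUP = 2.0  # rumored multiplier, purely speculative

def estimate(sms: int) -> dict:
    """CUDA-core count and Gen1-equivalent RT throughput for `sms` SMs."""
    return {
        "cuda_cores": sms * CORES_PER_SM,
        "gen1_equivalent_rt": sms * GEN2_RT_SPEEDUP,  # one RT core per SM
    }

print(estimate(20))  # {'cuda_cores': 1280, 'gen1_equivalent_rt': 40.0}
# The RTX 2070 has 36 Gen1 RT cores, so ~40 Gen1-equivalents is roughly
# "RTX 2070 RT power, give or take" despite far fewer CUDA cores.
```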
 

Wolfshadw

Titan
Moderator
If my past graphics card upgrade history holds to pattern:

ATI X1800XL All-In-Wonder Released in 11/96
NVidia 8800GTS-640 Released in 11/06
NVidia GTX 1060 6GB Released in 7/16

I think I have six more years before I need to consider my next upgrade.

-Wolf sends
 

King_V

Illustrious
Ambassador
I'm still chugging along on my XPS 8700 with 16GB and a GTX 1080. That card CAN struggle a little when I play modern games at my "75% of 4K" resolution, but since I've had less time for gaming and mostly play older stuff, it hasn't been a problem (yet). The i5-4460 is, I think, the weakest link at this point.

I do have the upgrade itch - but my current system is meeting my needs. I kinda wish I had a board that could handle an NVMe SSD properly, as my system runs it at PCIe x1. Still, that's nitpicky. The SATA SSD is doing just fine.

I am looking forward to the successor to AM4, though. And I do want to go back to AMD for a GPU, because the GTX 1080 and Nvidia's drivers do NOT handle FreeSync correctly with my particular monitor (meanwhile, no AMD card has had any trouble with it).
 

InvalidError

Titan
Moderator
I'm not convinced we'll see RTX 3050
Nvidia is preaching RTX for the masses, and that isn't going to happen if it leaves the entry level -- still the largest segment of PC gaming -- grossly under-served. It'll happen in one form or another; it's only a question of when.

As for what sort of RT a 3050 would have: if Nvidia is buffing RT as much as rumors say it is, then I'd expect a 3050 to have at least 50% beefier RT than an RTX 2060, since most people agree that 2000-series RT was so grossly under-powered as to be unusable. With 7/8nm's much greater transistor density, Nvidia can afford to do so without breaking the silicon budget.
 

salgado18

Distinguished
Feb 12, 2007
If my past graphics card upgrade history holds to pattern:

ATI X1800XL All-In-Wonder Released in 11/96
NVidia 8800GTS-640 Released in 11/06
NVidia GTX 1060 6GB Released in 7/16

I think I have six more years before I need to consider my next upgrade.

-Wolf sends
Well, you did survive the first two times with top-of-the-line GPUs; can a midrange 1060 handle another six years?

A 10-year-old GTX 460 is equivalent to a GT 1030.
A 10-year-old GTX 480 is equivalent to a GTX 1050.
(Measuring SP GFLOPS from Wikipedia; I might be wrong. See the sketch below.)

Maybe you should invest in a top GPU again; otherwise, it seems like a very good plan.
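
To show how that comparison works, here's a minimal Python sketch using the usual peak-throughput formula (2 FLOPs per core per clock, for fused multiply-add). The core counts and clocks are approximate launch specs, so treat the outputs as ballpark figures.

```python
# Rough single-precision throughput comparison, in the spirit of the
# post above. Peak SP GFLOPS = 2 (FMA) * shader count * clock (GHz).
# Core counts and clocks are approximate launch specs -- double-check
# against Wikipedia before relying on them.
def sp_gflops(cores: int, clock_ghz: float) -> float:
    return 2 * cores * clock_ghz

cards = {
    "GTX 460":  sp_gflops(336, 1.35),  # ~907 (shader clock)
    "GT 1030":  sp_gflops(384, 1.47),  # ~1129 (boost clock)
    "GTX 480":  sp_gflops(480, 1.40),  # ~1344
    "GTX 1050": sp_gflops(640, 1.46),  # ~1869
}
for name, gflops in cards.items():
    print(f"{name}: {gflops:.0f} GFLOPS")
```

By this crude metric the GT 1030 slightly outpaces a GTX 460 and the GTX 1050 comfortably beats a GTX 480, so "equivalent" is the conservative reading.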
 

Ratfish

Distinguished
Nov 1, 2013
Dang, I just got an RTX 2060 KO last week. I guess I could sell it and pick up the 30-series equivalent in a few months. I had no idea Ampere was coming out so soon.
 
The problem with the above performance discussion is that prices are a major unknown. Nvidia's prices were relatively static between Maxwell and Pascal -- about $30 more per tier if you don't count the Founders Edition 'tax' -- but from Pascal to Turing the high-end GPUs jumped 30-40% in price, while the top model (RTX 2080 Ti) cost over 70% more than its predecessor!
Or, to look at it another way, the performance gains for Turing were far lower than normal, so Nvidia shuffled around product names to hide that fact, and maybe to trick some people into moving up to a higher tier than they would have otherwise. Going by pricing, the RTX 2060 was the successor to the GTX 1070, the RTX 2070 was more a successor to the GTX 1080, and the RTX 2080 a successor to the GTX 1080 Ti. The RTX 2080 Ti's price bracket would normally have made it a "Titan" card. So, the 2060 and 2070 didn't offer much more than 20% more performance than the prior generation, and the 2080 even less. It could be argued that some of the new hardware, like the RT cores, offers additional performance, but games utilizing RT are still few and far between, and it's questionable how well that first-generation RT hardware will hold up over time.
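
To put numbers on those percentages, here's a quick sketch using approximate launch prices; the MSRPs are from-memory figures (the Ti comparison uses the $1,199 Founders Edition price), so verify before quoting them.

```python
# Quick check of the generational price jumps cited above, using
# approximate launch MSRPs (USD) from memory -- verify before quoting.
launch_prices = {
    ("GTX 1060 6GB", "RTX 2060"): (249, 349),
    ("GTX 1070", "RTX 2070"): (379, 499),
    ("GTX 1080 Ti", "RTX 2080 Ti FE"): (699, 1199),
}
for (old, new), (p_old, p_new) in launch_prices.items():
    print(f"{old} ${p_old} -> {new} ${p_new}: +{(p_new - p_old) / p_old:.0%}")
# GTX 1060 6GB $249 -> RTX 2060 $349: +40%
# GTX 1070 $379 -> RTX 2070 $499: +32%
# GTX 1080 Ti $699 -> RTX 2080 Ti FE $1199: +72%
```

The +40% and +32% figures are the "30-40%" jump mentioned above, and +72% is the "over 70%" for the 2080 Ti.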

Dang, I just got an RTX 2060 KO last week. I guess I could sell it and pick up the 30-series equivalent in a few months. I had no idea Ampere was coming out so soon.
It could potentially be a fair amount longer before they launch new mid-range cards around the $300 price point. In the case of the 20-series, they launched their high-end $600+ cards between late September and early October of 2018, but the $350 2060 didn't come out until late January, close to four months later, and the $280 1660 Ti didn't come out until a month later still. So, it's very possible that we might not see new mid-range cards until early next year. Likewise, AMD just launched their similar-performing 5600 XT for around $280 less than 6 months ago, so they may also be more focused on higher-end models this year. So, you might potentially be waiting half a year or more, unless you spend $100-$200 more for something in a higher-end bracket. I guess it's possible that we could see cards like the 2060 SUPER and 5700 XT go on sale for around $300 during the holiday shopping season if faster cards appear in the $400+ price range though.


I'm in the market and I'm doing my best to hold off.

How long do you think it will be until we can get a desktop with:

Intel 12th-gen K-series 10nm CPU (Alder Lake)
Motherboard with full PCIe 4.0 support
CPU and motherboard with DDR5 support
RTX 3080 Super (PCIe 4.0)
PCIe 4.0 SSD

I'm guessing this time next year? What do you guys think?
Probably longer than that. Intel's 10th-gen CPUs just came out a little over a month ago, launching alongside a brand-new series of motherboards, so a year from now you will likely be looking at their 11th-gen processors. I wouldn't expect their 12th-gen desktop CPUs until near the tail end of next year at the earliest, and they might not even be available until early 2022. And at least from a performance standpoint, DDR5 might not offer much over DDR4 until it's been on the market a while and higher speeds become available.

The 11th-gen processors will apparently offer PCIe 4.0, and the graphics cards coming later this year will undoubtedly support it, though I wouldn't expect much out of the interface performance-wise. Those graphics cards aren't likely to be limited by PCIe 3.0 to any measurable degree, and the performance benefits of PCIe 4.0 SSDs have so far generally been limited to copying large files and synthetic benchmarks. In most real-world tasks, you will likely struggle to notice any difference.
 

SmithJohn1

Prominent
Jul 10, 2020
If you're in the market for a GTX 1660 Super / GTX 1660 Ti, is it a bad time to buy now? Don't lower end SKUs take a long time after generational launches to come out?
 

JarredWaltonGPU

Senior GPU Editor
Editor
If you're in the market for a GTX 1660 Super / GTX 1660 Ti, is it a bad time to buy now? Don't lower end SKUs take a long time after generational launches to come out?
Yeah, that's sort of what I was getting at in the last section. I mean, there's no harm in waiting, but the budget stuff is the least likely to be immediately impacted by Ampere and Navi 2x. Even the mainstream ($300) segment probably won't change much until perhaps December / January. But if you buy a high-end GPU right now (meaning $400+), you'll likely discover that something faster for not too much more money is coming in September/October.
 

spongiemaster

Admirable
Dec 12, 2019
If my past graphics card upgrade history holds to pattern:

ATI X1800XL All-In-Wonder Released in 11/96

I had an X1800XL AIW. Was a big fan of the AIW series. However, the first ATI card I ever owned was a Radeon 8500, and I know that was released after 2000, so the X1800XL must have been released a few years after the 8500. There's no way that card was available in 1996.

Edit:
Here's a review from Hexus; late 2005 looks like the release date:

https://hexus.net/tech/reviews/graphics/3958-ati-all-in-wonder-x1800-xl-first-hands-on-look/
 

Wolfshadw

Titan
Moderator
I had an X1800XL AIW. Was a big fan of the AIW series. However, the first ATI card I ever owned was a Radeon 8500, and I know that was released after 2000, so the X1800XL must have been released a few years after the 8500. There's no way that card was available in 1996.

Edit:
Here's a review from Hexus; late 2005 looks like the release date:

https://hexus.net/tech/reviews/graphics/3958-ati-all-in-wonder-x1800-xl-first-hands-on-look/

Yeah... not sure where that date came from. I must have misread something there.

-Wolf sends
 

mrv_co

Distinguished
Jan 18, 2016
I'm really just wondering whether AMD can bring some Ryzen magic to their Radeon GPUs and put some actual pressure on Nvidia this go around.
 

JarredWaltonGPU

Senior GPU Editor
Editor
I'm really just wondering whether AMD can bring some Ryzen magic to their Radeon GPUs and put some actual pressure on Nvidia this go around.
I'd love to see it happen, and AMD should close the gap a bit ... but I suspect AMD's first generation of ray-tracing-enabled GPUs is going to be more like Nvidia's RTX 20-series in ray tracing performance. Rasterization performance could be more of the same, or it could be another decent improvement. We don't know what AMD is doing to the architecture, but if it really does deliver 50% more performance per watt, that should at least put it reasonably close to Nvidia.
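
As a rough illustration of what "50% more performance per watt" could mean, here's a back-of-envelope sketch; the RX 5700 XT's ~225W board power is real, but the 300W figure for a hypothetical Big Navi is purely an assumption.

```python
# Rough scaling exercise for the rumored +50% performance-per-watt claim.
# The RX 5700 XT's ~225W board power is real; the 300W figure for a
# hypothetical Big Navi is purely an assumption for illustration.
perf_per_watt_gain = 1.5         # rumored +50%
power_old, power_new = 225, 300  # watts

relative_performance = perf_per_watt_gain * (power_new / power_old)
print(f"~{relative_performance:.1f}x the RX 5700 XT")  # ~2.0x
```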

This might be why we're hearing rumors of RTX 3090 -- basically, it could be another tier of performance (and price) just to ensure Nvidia stays in the lead. I can't see how Nvidia would do a GA100 consumer GPU for less than $2000, though -- not that it couldn't, but it tends to be very greedy on the extreme end of the performance ladder.
 
I'm really just wondering whether AMD can bring some Ryzen magic to their Radeon GPUs and put some actual pressure on Nvidia this go around.
Dollar for dollar, they already are. The "ultra-high end" is such a small market that AMD (at this point) doesn't really care, and correctly sees it as a pointless pissing contest. An RX 5700 XT pushes about as many pixels as an RTX 2070 Super -- both firmly in the "high-end" GPU category. Cards above that price bracket have an absolutely minuscule market share that isn't economical, and thus isn't worth pursuing outside of bragging rights on review websites like this one.
https://store.steampowered.com/hwsurvey/videocard/
 
New GPUs are coming soon, so right now is a terrible time to buy a graphics card. If you absolutely have to have something right now, look for a budget option or a used card that will tide you over until AMD's Big Navi and Nvidia's Ampere arrive.

Right Now Is a Terrible Time to Buy a Graphics Card
Just trying to validate my purchase here.

I have never owned a PC before, and I was getting a good offer (about $30 off) on an RTX 2070S Windforce OC 3X 8G about a month ago, so I went for it. GPUs are extremely costly where I live (a 2070S FE is $625), and parts usually launch overpriced here, so I figured why not buy it, since I'm not upgrading from anything. Is this a bad decision?
 

bc0203

Distinguished
Sep 19, 2009
This article was perfectly timed! I'm getting ready to upgrade my motherboard/CPU/RAM and already have a GTX 970... I'll hold off until the 3070 arrives before upgrading my graphics card.
 

kokotas

Commendable
Feb 3, 2018
I usually spend around 2.5k for a full build. This year it seems I will have to go a bit beyond that, maybe around 3k. Imagine that a 10900K + Asus Apex + RTX 3080 will be around 1,800 Euros (retail prices) alone... That is, if I can find them in stock once I decide to bite the bullet.
 

Cableaddict

Distinguished
Feb 20, 2014
Great article, thx.

Another factor you guys didn't mention is heat vs. power. Electricity and air conditioning aren't free!
Additionally, those of us who use mobile PCs (like me) have to worry about heat build-up in the small racks/cases we use.
And anyone using a PC outdoors in the summer, well...
Each new generation improves significantly in this regard.