News: First Custom RTX 3060 Ti Graphics Card Spotted

$450 for a xx60 card. My 1060 was only $330, and it was the fastest one MSI made 🙁 When crypto shot card prices up, Nvidia just left them there. On the CPU side we get more and more (cores/speed/cache, whatever) and yet the prices stay roughly in the same range. GPU prices are insane.
 
$450 (rumored)? Sheesh, I was hoping for something closer to the $300 mark; $450 still seems like x70 tier rather than x60. Kinda salty/nostalgic about paying $310 (launch price) for my 660 Ti almost a decade ago, and I haven't upgraded since because of high prices.
 
$450 for a xx60 card. My 1060 was only $330, and it was the fastest one MSI made 🙁 When crypto shot card prices up, Nvidia just left them there. On the CPU side we get more and more (cores/speed/cache, whatever) and yet the prices stay roughly in the same range. GPU prices are insane.
I agree that the $450 price point seems a bit high. If the rumored specs are correct, then it will only be 83% as fast as the 3070 at 90% of the price. From a pure price/performance perspective that's not a good deal; the price should be closer to $400.
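Rough math on that, as a sanity check (a minimal Python sketch; the $450/$500 prices and the ~83% relative performance are rumored figures, not confirmed specs):

```python
# Hedged sketch: compare perf-per-dollar of the rumored RTX 3060 Ti
# against the RTX 3070. All numbers are rumors/placeholders.

cards = {
    "RTX 3070":    {"price": 500, "perf": 1.00},  # baseline
    "RTX 3060 Ti": {"price": 450, "perf": 0.83},  # rumored
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 1000:.2f} perf units per $1000")

# Price at which the 3060 Ti would merely match the 3070's perf/$:
parity = cards["RTX 3060 Ti"]["perf"] * cards["RTX 3070"]["price"]
print(f"Perf/$ parity price: ${parity:.0f}")  # ~$415
```

At $450 it would offer roughly 8% less performance per dollar than the 3070, and it would need to be under about $415 just to break even on that metric.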
 
It seems a bit high, even for an xx60 Ti.

Then again, the 2060 non-Super is around $300 minimum, isn't it? And the 2060 Super (at least today) seems to start at $400. This isn't all that surprising, except that the 20 series started out a bit overpriced and the 30 series was supposed to remedy that.

If, say, the 3060 non-Ti is going to be around $400, I'm kind of alarmed at what the price of a 3050 or 3050 Ti would be. Or, like Turing, will Ampere only offer non-ray-traced cards at the 50-series level, and maybe even at the 60-series level?
 
If, say, the 3060 non-Ti is going to be around $400, I'm kind of alarmed at what the price of a 3050 or 3050 Ti would be.
I think I've seen $300 thrown around somewhere for the RTX 3050 (Ti?), and yeah, that would be awfully steep. In the past, a move to a smaller process usually came with substantially more performance per dollar at all price points. Here, though, it looks like the "low end" is making a substantial leap up in prices at the same time high-end pricing gets rolled back.

Between AMD jacking up prices on CPUs (disproportionately so on the 5600X) and Nvidia jacking up prices on lower-end GPUs, 2021 may turn into a really crappy year for budget gamers.
 
Let's see... $300 to $350, based on those other models... but of course it all depends on how big the GPU chip is and how much memory they have.
A 3050 with 4GB of memory would be a hard sell at $300, but a 3060 with 6GB of VRAM could be a very plausible $300 candidate these days...
 
If $450 holds true for the 3060 Ti, yikes. I overpaid for my GTX 1070 several years ago during the crypto boom, and it seems like prices never dropped after that. I might have to drop a tier or two and accept small gains for access to new technology; kind of hurts.
 
I have a suspicion that these 7nm nodes are in such demand with constrained capacity that they are now too expensive to support the mainstream price points (which I would call ~$200 for GPU, and $200 for CPU).
 
A 3050 with 4GB of memory would be a hard sell at $300
For the graphics details that 2070-level performance is intended for, I doubt anything less than 6GB would be viable.

I have a suspicion that these 7nm nodes are in such demand with constrained capacity that they are now too expensive to support the mainstream price points (which I would call ~$200 for GPU, and $200 for CPU).
The only 7nm GPUs Nvidia is making now are the GA100 compute monsters; everything else is on Samsung 8nm, at least for now. While AMD and Nvidia may be constrained by their wafer supply, their wafer costs are set when the supply contract is signed and do not change with volume, so "cost from demand" is not a thing. If AMD's agreement says $9k per wafer, TSMC has to provide wafers at $9k up to AMD's maximum agreed volume. The main reason AMD and Nvidia prices are going up at the lower end is that they have increased their profit margins to cash in on people being desperate for parts, and entry-level parts are the least profitable.
 
I paid $360 for my 1070 only 4 years ago. The idea of an xx60-series card being more expensive than that is laughable. I think Nvidia is just trying to cash in on the demand for their higher tier cards. They're likely expecting some gamers just want a new Nvidia card regardless of the price and performance level. Inflated mining prices never truly went away, sadly.

Me? I'll wait. A 3060 won't be a significant enough upgrade for me to justify the cost. AMD will also likely push prices lower once their mid and lower end 6000 cards release. No rush.
 
I think I've seen $300 thrown around somewhere for the RTX 3050 (Ti?), and yeah, that would be awfully steep. In the past, a move to a smaller process usually came with substantially more performance per dollar at all price points. Here, though, it looks like the "low end" is making a substantial leap up in prices at the same time high-end pricing gets rolled back.

Which seems like a bizarre turnaround from the Turing cards.

Turing: high-end cards VERY expensive, but their most budget-level, non-RT cards, the GTX 1650/1650 Super, FINALLY managed to outdo Polaris on price/performance. Admittedly, not at first, but more so after Polaris card prices rose a little after Christmas 2019 and never came back down.

Ampere: Much more reasonable price/performance at the 3070/3080 level, but the "budget" cards may wind up being anything but budget.

Madness.


As for me, well, I hold on to older tech, usually, and when buying something new, really try to go for the deals. I kind of take pride in the "I bought X a year ago on sale, and the price for X has never been that low or lower since."

My most shining moments:
  • Powercolor RX580 8GB - $159.99 after promos/rebates, plus free games, back in December 2018
  • MSI Evoke RX 5700 - $272.99 after promos/rebates, plus free games, in March 2020
  • Gigabyte EAGLE GTX 1650 GDDR6 - $109.99 after promos/rebates, in August 2020
Am I being a bit smug? Yeah, probably, LOL. I may be impatient, but I generally don't let my impatience make me do anything all that hastily.

Though, uh, a really insane deal on something MIGHT make me decide that I need something more than I actually do.
 
Meh, I remember the first time I laid down what I thought was serious cash for a top-tier GPU. It was the GeForce 256. I had been a 3dfx person prior to that, but I'd never put down $300 for a GPU until then.

It was the best card you could buy at the time, and it was under $300.

 
$450 for the 3060 Ti seems a bit unlikely, at least for the MSRP. We could see some partner cards with larger coolers for that much, though I expect the official starting price to be around $400, matching that of the 2060 SUPER. If it only cost 10% less than a 3070 for close to 20% less performance, then it wouldn't be a particularly viable product, so a 20% lower price than that card seems far more likely. And the 3060 (non-Ti) will probably start around $350, just as the 2060 did at launch.

$450 for a xx60 card. My 1060 was only $330, and it was the fastest one MSI made 🙁 When crypto shot card prices up, Nvidia just left them there. On the CPU side we get more and more (cores/speed/cache, whatever) and yet the prices stay roughly in the same range. GPU prices are insane.
I paid $360 for my 1070 only 4 years ago. The idea of an xx60-series card being more expensive than that is laughable. I think Nvidia is just trying to cash in on the demand for their higher tier cards. They're likely expecting some gamers just want a new Nvidia card regardless of the price and performance level. Inflated mining prices never truly went away, sadly.

The prices aren't insane compared to the 10-series, Nvidia just shifted around product names with the 20-series to make the mediocre performance gains of that generation appear larger, and kept that new naming scheme for this generation as well. In reality, the 2060 and 3060 are what would have previously been marketed as "70" cards. The graphics chip used for the 2070/2060 was nearly as large as the one used for the 1080 Ti, and well over double the size of the chip used for the 1060. The 2060 certainly wasn't a 1060 successor, that was the role of the 1660 cards. Even with a process node shrink this generation, the chip used for the 3070 is still 25% larger than the one used for the 1070/1080.
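For reference, those die-size claims check out against the approximate die areas from public spec listings (a quick sketch; all figures are approximate):

```python
# Approximate die areas (mm^2) from public spec listings.
die_mm2 = {
    "GP106 (GTX 1060)":      200,
    "GP104 (GTX 1070/1080)": 314,
    "GP102 (GTX 1080 Ti)":   471,
    "TU106 (RTX 2060/2070)": 445,
    "GA104 (RTX 3070)":      392,
}

print(f"TU106 vs GP102: {die_mm2['TU106 (RTX 2060/2070)'] / die_mm2['GP102 (GTX 1080 Ti)']:.0%}")  # ~94%, nearly as large
print(f"TU106 vs GP106: {die_mm2['TU106 (RTX 2060/2070)'] / die_mm2['GP106 (GTX 1060)']:.1f}x")    # ~2.2x, well over double
print(f"GA104 vs GP104: {die_mm2['GA104 (RTX 3070)'] / die_mm2['GP104 (GTX 1070/1080)']:.0%}")     # ~125%
```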

People seem to be getting caught up on the model numbers, rather than acknowledging they have been shifted to cover a different range of products. AMD did a similar thing last generation as well, with their RX 5700 cards targeting a price and performance range over double that of the RX 570, with that trend continuing on down the stack.
 
People seem to be getting caught up on the model numbers, rather than acknowledging they have been shifted to cover a different range of products.
You can shift the range of products however much you want, but you still have to keep price points covered with products providing compelling performance per dollar if you want PC gaming to remain viable, instead of turning into something only the bourgeois class of PC users can afford, especially when you account for the fact that consumer prices in the USA are rising 2-3X as fast as median income.
 
For the graphics details that 2070-level performance is intended for, I doubt anything less than 6GB would be viable.


The only 7nm GPUs Nvidia is making now are the GA100 compute monsters; everything else is on Samsung 8nm, at least for now. While AMD and Nvidia may be constrained by their wafer supply, their wafer costs are set when the supply contract is signed and do not change with volume, so "cost from demand" is not a thing. If AMD's agreement says $9k per wafer, TSMC has to provide wafers at $9k up to AMD's maximum agreed volume. The main reason AMD and Nvidia prices are going up at the lower end is that they have increased their profit margins to cash in on people being desperate for parts, and entry-level parts are the least profitable.

I said "demand with constrained capacity" which means TSMC / Samsung etc have the ability to raise prices and charge more per wafer.

There's a significant amount of evidence that 7nm costs more than previous generations. It's not all about the materials themselves (the wafers); there are huge capital investments in these new fabs, including but not limited to things like EUV.

Example:
7nm SoC production cost drives up Xbox Series X console price
 
I said "demand with constrained capacity" which means TSMC / Samsung etc have the ability to raise prices and charge more per wafer.
No, they cannot; prices are mostly locked in when the wafer agreement is signed and remain in effect until the agreement expires. The only way TSMC can raise prices is on new/supplemental agreements or by renegotiating existing ones.

The Series S/X SoC costs more mainly because it is a relatively large die on a newer, more expensive process.
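To make the die-size point concrete, here's a rough per-die cost sketch (Python), reusing the hypothetical $9k/wafer figure from earlier in the thread. The die area, defect density, and yield model are all illustrative assumptions, not actual contract or fab numbers:

```python
import math

# Back-of-the-envelope sketch: how wafer price and die size translate into
# per-die cost. All figures are illustrative assumptions, not actual
# TSMC/AMD contract numbers.

WAFER_PRICE = 9_000     # $ per wafer (the hypothetical figure from earlier)
WAFER_DIAMETER = 300    # mm
DIE_AREA = 360          # mm^2, roughly console-SoC-class (assumed)
DEFECT_DENSITY = 0.09   # defects per cm^2 (assumed)

def dies_per_wafer(diameter_mm, area_mm2):
    """Gross dies per wafer, standard edge-loss approximation."""
    radius = diameter_mm / 2
    return (math.pi * radius**2 / area_mm2
            - math.pi * diameter_mm / math.sqrt(2 * area_mm2))

def poisson_yield(area_mm2, defects_per_cm2):
    """Simple Poisson yield model: exp(-area * defect density)."""
    return math.exp(-(area_mm2 / 100) * defects_per_cm2)

gross = dies_per_wafer(WAFER_DIAMETER, DIE_AREA)
good = gross * poisson_yield(DIE_AREA, DEFECT_DENSITY)
print(f"Gross dies: {gross:.0f}, good dies: {good:.0f}")  # ~161 / ~117
print(f"Cost per good die: ${WAFER_PRICE / good:.0f}")    # ~$77
```

Because yield falls off with area, a die twice as large costs more than twice as much per good unit, which is the gist of why a big SoC on a new node is expensive.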
 
You can shift the range of products however much you want, but you still have to keep price points covered with products providing compelling performance per dollar if you want PC gaming to remain viable, instead of turning into something only the bourgeois class of PC users can afford, especially when you account for the fact that consumer prices in the USA are rising 2-3X as fast as median income.
I'd hardly say that's the case though. With few exceptions, you can currently run recent games near max graphics settings at 1080p and get over 60fps on a sub-$250 graphics card. Higher resolutions and refresh rates might be nice to have for those willing to spend more, but it's not really something that's required to run games well. Perhaps games will demand more powerful hardware as developers shift to targeting the new consoles, but that's always been the case. I wouldn't say the cost of "capable" gaming hardware has necessarily increased from what it was in the past.
 
$450 (rumored)? Sheesh, I was hoping for something closer to the $300 mark; $450 still seems like x70 tier rather than x60. Kinda salty/nostalgic about paying $310 (launch price) for my 660 Ti almost a decade ago, and I haven't upgraded since because of high prices.
Indeed. I'm guessing $450 will translate into £420 (based on the pricing conversion for the RTX 3080 over here), which would put it at a 50% increase over what I paid for my GTX 970 (when it was "new"). 🙁

(Yes, I know about inflation, but consumer electronics traditionally hasn't suffered from it.)
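The conversion guess is easy to sanity-check (Python; I'm using the RTX 3080 FE's $699/£649 pricing as the reference ratio, and the ~£280 GTX 970 price is my assumption for what was paid, consistent with the 50% figure above):

```python
# Sanity check on the USD -> GBP guess. The 3080 FE's $699/£649 pricing
# sets the effective conversion ratio; the £280 GTX 970 price is an
# assumption consistent with the "50% increase" above.

ratio = 649 / 699                 # ~0.93 effective $ -> £ factor
est_gbp = 450 * ratio
print(f"Estimated UK price: ~£{est_gbp:.0f}")                      # ~£418

paid_for_970 = 280                # assumed
print(f"Increase over the 970: {est_gbp / paid_for_970 - 1:.0%}")  # ~49%
```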
 
(Yes, I know about inflation, but consumer electronics traditionally hasn't suffered from it.)
Traditionally, there were more fabs than there was demand for processed wafers, so booking fab capacity was relatively cheap, since fabs wanted to keep running at high enough utilization to avoid stop-start cycles. Now that companies and governments around the world are gobbling up wafers like they're trying to build a Jupiter Brain, fabs are backlogged by a year if not longer, and their clients allocate what wafers they can book to their most profitable SKUs, inflating prices on "budget" parts to cover the opportunity cost of allocating wafers to lower-end SKUs and to keep shareholders and executives with stock options happy.
 
They're making so few of them that they've raised prices to compensate. In a year's time the price should return to normal, if the price curve plays out the way I picture it: time on x, price on y, madness in the first half of the graph (0 to 5 on x), normalcy in the second half (5 to 10 on x). It's geometric deduction; I may be wrong, YMMV. But in 2021, prices for the Nvidia 3000 series should start to get much lower.
 
But in 2021, prices for the Nvidia 3000 series should start to get much lower.
The more likely outcome is that Nvidia will discontinue the 3000 series ahead of the 4000 series' launch, so most of whatever is left of 3000-series stock will finish selling near or even above MSRP, while people are frustrated that they can't get a 4000-series card in the months following launch.
 
The more likely outcome is that Nvidia will discontinue the 3000 series ahead of the 4000 series' launch, so most of whatever is left of 3000-series stock will finish selling near or even above MSRP, while people are frustrated that they can't get a 4000-series card in the months following launch.
Except the 40-series probably won't be coming until late 2022 at the earliest, and even if they release refreshed cards halfway through this generation, it's unlikely those would make the pricing of existing models any worse.

Right now, there are likely a lot of people who skipped the last generation of cards who are now looking to upgrade. AMD still doesn't have their new cards out quite yet, but those should help alleviate demand somewhat. Plus, Intel will likely be joining the graphics card market within the next year or so, meaning there should be another company manufacturing desktop GPUs, bringing with them increased competition and additional manufacturing capacity. Assuming there isn't some shortage due to massively increased demand or supply disruption in the interim, I would expect availability to become a lot better next year.
 
Plus, Intel will likely be joining the graphics card market within the next year or so, meaning there should be another company manufacturing desktop GPUs, bringing with them increased competition and additional manufacturing capacity.
Considering Intel's failure to scale manufacturing capacity to keep up with demand for over three years already, compounded by delays and setbacks with process upgrades, it is unlikely that Intel branching out into (GP)GPUs and heterogeneous computing is going to do anything to help it catch up with demand, especially if it lands large-scale contracts for AI research or anything else that may order large Xe parts by the thousands.
 
Except the 40-series probably won't be coming until late 2022 at the earliest, and even if they release refreshed cards halfway through this generation, it's unlikely those would make the pricing of existing models any worse.

Right now, there are likely a lot of people who skipped the last generation of cards who are now looking to upgrade. AMD still doesn't have their new cards out quite yet, but those should help alleviate demand somewhat. Plus, Intel will likely be joining the graphics card market within the next year or so, meaning there should be another company manufacturing desktop GPUs, bringing with them increased competition and additional manufacturing capacity. Assuming there isn't some shortage due to massively increased demand or supply disruption in the interim, I would expect availability to become a lot better next year.
How competitive do we really expect Intel to be with their first release? Unless Intel has figured out how to use chiplets for a gaming card, I don't think anyone realistically expects them to compete in the upper echelon of the market, which is where the shortages are. Intel's early drivers scare me. They have a lot to prove before they're a viable alternative to AMD and Nvidia. If they release late next year, both Nvidia and AMD should have slightly faster refreshed cards out by then.