Early images of Gigabyte's RTX 3060 Ti Eagle have leaked.
First Custom RTX 3060 Ti Graphics Card Spotted : Read more
"$450 for a xx60 card. My 1060 wasn't but 330 and it was the fastest one MSI made 🙁 When crypto shot card prices up nVidia just left them there. On the CPU side we get more and more (cores/speed/cache, whatever) and yet the prices stay roughly in the same range. GPU prices are insane."

I agree that the $450 price point seems a bit high. If the rumored specs are correct, then it will only be 83% as fast as the 3070 at 90% of the price. From a pure price/performance perspective that is not a good deal; the price should be closer to $400.
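For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope sketch in Python. It assumes the $499 RTX 3070 MSRP and the rumored figures from the comment above (about 83% of 3070 performance at $450); it only illustrates the reasoning and is not confirmed spec data.

```python
# Back-of-the-envelope check of the price/performance claim above.
# Assumptions: RTX 3070 MSRP of $499; rumored 3060 Ti at ~83% of 3070
# performance for $450. None of this is confirmed specification data.
rtx3070_price, rtx3070_perf = 499.0, 1.00
rtx3060ti_price, rtx3060ti_perf = 450.0, 0.83  # rumored

value_3070 = rtx3070_perf / rtx3070_price      # performance per dollar
value_3060ti = rtx3060ti_perf / rtx3060ti_price

print(f"3060 Ti value relative to 3070: {value_3060ti / value_3070:.2f}")  # ~0.92
print(f"Price for equal perf/$: ${rtx3060ti_perf * rtx3070_price:.0f}")    # ~$414
```

Under those rumored numbers the card delivers roughly 8% less performance per dollar than the 3070 at $450, and would need to fall to roughly $415 just to break even with it, which is where the "closer to $400" figure comes from.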
"If, say, the 3060 non-Ti is going to be around $400, I'm kind of alarmed at what the price of a 3050 or 3050 Ti would be."

I think I've seen $300 thrown around somewhere for the RTX3050 (Ti?) and yeah, that would be awfully steep. In the past, a move to smaller process usually came with substantially more performance per dollar at all price points. Here though, it looks like the "low end" is making a substantial leap up in prices at the same time high-end pricing gets rolled back.
"3050 with 4GB of memory would be a hard sell at $300"

For the graphics details that 2070-level performance is intended for, I doubt anything less than 6GB would be viable.
"I have a suspicion that these 7nm nodes are in such demand with constrained capacity that they are now too expensive to support the mainstream price points (which I would call ~$200 for GPU and $200 for CPU)."

The only 7nm GPUs Nvidia is making now are the GA100 compute monsters; everything else is on Samsung 8nm, at least for now. While AMD and Nvidia may be constrained by their wafer supply, their wafer costs are set when the supply contract is signed and do not change with volume, so "cost from demand" is not a thing. If AMD's agreement says $9,000 per wafer, TSMC has to provide wafers for $9,000 up to AMD's maximum agreed volume. The main reason AMD and Nvidia prices are going up at the lower end is that they increased their profit margins to cash in on the fact that people are desperate for parts, and entry-level parts are the least profitable.
I paid $360 for my 1070 only 4 years ago. The idea of an xx60-series card being more expensive than that is laughable. I think Nvidia is just trying to cash in on the demand for their higher tier cards. They're likely expecting some gamers just want a new Nvidia card regardless of the price and performance level. Inflated mining prices never truly went away, sadly.
"People seem to be getting caught up on the model numbers, rather than acknowledging they have been shifted to cover a different range of products."

You can shift the range of products however much you want; you still have to keep price points covered with products providing compelling performance per dollar if you want PC gaming to remain viable instead of turning into something only the bourgeois class of PC users can afford, especially when you account for the fact that inflation in the USA rises 2-3X as fast as median income.
"I said 'demand with constrained capacity', which means TSMC / Samsung etc. have the ability to raise prices and charge more per wafer."

No, they cannot: prices are mostly locked in when the wafer agreement is signed and remain in effect until the agreement expires. The only way TSMC can raise prices is on new/supplemental agreements or by re-negotiating existing ones.
"You can shift the range of products however much you want, you still have to keep price points covered with products providing compelling performance per dollar if you want PC gaming to remain viable instead of turning into something only the bourgeois class of PC users can afford, especially when you account for the fact that inflation in the USA increases 2-3X as fast as median income."

I'd hardly say that's the case though. With few exceptions, you can currently run recent games near max graphics settings at 1080p and get over 60fps on a sub-$250 graphics card. Higher resolutions and refresh rates might be nice to have for those willing to spend more, but it's not really something that's required to run games well. Perhaps games will demand more powerful hardware as developers shift to targeting the new consoles, but that's always been the case. I wouldn't say the cost of "capable" gaming hardware has necessarily increased from what it was in the past.
"$450 (rumored)? sheesh, was hoping for something closer to the 300 mark, 450 still seems like x70 tier rather than x60. kinda salty/nostalgic for paying $310 (launch price) for my 660 Ti almost a decade ago, and haven't upgraded since because of high prices."

Indeed. I'm guessing $450 will translate into £420 (based on the pricing conversion for the RTX 3080 over here), which will put it at a 50% increase on what I paid for my GTX 970 (when it was "new"). 🙁
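As a rough sanity check on that £420 guess, here is the same kind of back-of-the-envelope sketch, assuming the RTX 3080's $699 US / £649 UK launch pricing as the conversion reference (the $450 figure itself is still only a rumor):

```python
# Rough UK price estimate using the RTX 3080's launch pricing as the exchange reference.
# Assumptions: RTX 3080 launched at $699 (US) / £649 (UK); rumored 3060 Ti at $450.
usd_3080, gbp_3080 = 699.0, 649.0
rumored_usd = 450.0

gbp_estimate = rumored_usd * (gbp_3080 / usd_3080)
print(f"Estimated UK price: £{gbp_estimate:.0f}")   # ~£418, close to the £420 guess

# Implied GTX 970 price if £420 really is a 50% increase over it:
print(f"Implied GTX 970 price: £{420 / 1.5:.0f}")   # £280
```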
"(yes, I know of inflation, but consumer electronics has traditionally not suffered from it)"

Traditionally, there were more fabs than there was demand for processed wafers, so booking capacity on fabs was relatively cheap, since fabs wanted to keep running at a sufficiently high utilization to avoid stop-start cycles. Now that companies and governments around the world gobble wafers like they are trying to build a Jupiter Brain, fabs are backlogged by a year if not longer, and their clients allocate whatever wafers they're able to book to their most profitable SKUs, inflating prices on "budget" parts to cover the opportunity cost of allocating wafers to lower-end SKUs and to keep shareholders and executives with stock options happy.
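To make that opportunity-cost argument concrete, here is a toy sketch with entirely made-up die counts and margins; it only illustrates why a wafer-constrained vendor would raise prices on small, low-margin dies rather than stop selling them:

```python
# Toy illustration of the opportunity-cost argument above. Every number here is
# invented purely for illustration; none of it comes from AMD, Nvidia, or TSMC.
# (good_dies_per_wafer, profit_per_die_in_dollars)
high_end_sku = (60, 150.0)   # big die: few per wafer, fat margin
low_end_sku = (200, 20.0)    # small die: many per wafer, thin margin

profit_per_wafer_high = high_end_sku[0] * high_end_sku[1]   # $9,000 per wafer
profit_per_wafer_low = low_end_sku[0] * low_end_sku[1]      # $4,000 per wafer

# With a fixed wafer allocation, each wafer spent on the low-end SKU forgoes
# the difference. Spread over the low-end dies, that is the price bump needed
# to make a low-end wafer as attractive as a high-end one:
bump = (profit_per_wafer_high - profit_per_wafer_low) / low_end_sku[0]
print(f"Required price increase per low-end card: ~${bump:.0f}")  # ~$25 in this example
```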
"but 2021 the prices for the 3000 nvidia series should start to get much lower."

The more likely outcome is that Nvidia will discontinue the 3000-series ahead of the 4000-series' launch, so most of whatever may be left of 3000-series stock will finish selling near or even above MSRP while people are frustrated that they are unable to get a 4000-series in the months following launch.
"The more likely outcome is that Nvidia will discontinue the 3000-series ahead of the 4000-series' launch, so most of whatever may be left of 3000-series stock will finish selling near or even above MSRP while people are frustrated that they are unable to get a 4000-series in the months following launch."

Except the 40-series probably won't be coming until late 2022 at the earliest, and even if they release refreshed cards halfway through this generation, it's unlikely those would make the pricing of existing models any worse.
"Plus, Intel will likely be joining the graphics card market within the next year or so, meaning there should be another company manufacturing desktop GPUs, bringing with them increased competition and additional manufacturing capacity."

Considering Intel's failure to scale manufacturing capacity to keep up with demand for over three years already, compounded by delays and setbacks with process upgrades, it is unlikely that Intel branching out into (GP)GPUs and heterogeneous computing is going to do anything to help it catch up with demand, especially if it lands large-scale contracts for AI research or anything else that may order large-Xe by the thousands.
"Except the 40-series probably won't be coming until late 2022 at the earliest, and even if they release refreshed cards halfway through this generation, it's unlikely those would make the pricing of existing models any worse."

How competitive do we really expect Intel to be with their first release? Unless Intel has figured out how to use chiplets for a gaming card, I don't think anyone realistically expects them to be competing in the upper echelon of the market, which is where the shortages are. The early drivers from Intel scare me. They have a lot to prove before they are a viable alternative to AMD and Nvidia. If they release late next year, both Nvidia and AMD should have slightly faster refreshed cards out by that time.
Right now, there are likely a lot of people who skipped the last generation of cards who are now looking to upgrade. AMD still doesn't have their new cards out quite yet, but those should help alleviate demand somewhat. Plus, Intel will likely be joining the graphics card market within the next year or so, meaning there should be another company manufacturing desktop GPUs, bringing with them increased competition and additional manufacturing capacity. Assuming there isn't some shortage due to massively increased demand or supply disruption in the interim, I would expect availability to become a lot better next year.