News GeForce RTX 3060, RTX 3060 Ti Listings Allegedly Point to $450 Price Tag and November Launch

In any event, the price range looks at least plausible, considering that the GeForce RTX 3070 is priced at $499.

Actually, I don't think that's plausible. If they had many, many SKUs, one for every $50 segment from, say, $200 to $500, then sure. But that won't happen. They aren't going to put a 3060 Ti just $50 below a 3070. I mean, the current 2060 Super is $400, and they haven't done any price hikes this gen besides the 3090.
 
Actually, I don't think that's plausible. If they had many, many SKUs, one for every $50 segment from, say, $200 to $500, then sure. But that won't happen. They aren't going to put a 3060 Ti just $50 below a 3070. I mean, the current 2060 Super is $400, and they haven't done any price hikes this gen besides the 3090.
If one assumes the 3060 comes in at $400, $450 for the Ti isn't wholly implausible. Personally, I'd guess they'll be spaced a bit further apart than that ($350 and $420?), but NVidia is being driven here more by yields, I think, than by pure marketing strategy.
 
There's little to no evidence that Nvidia will continue the "Ti" branding.
It wouldn't even make much sense now that the Titan has been renamed the 3090, and the "2080 Ti" equivalent is the 3080.
 
Personally, I'd put my bets on a $400 3060 Ti and a $350 3060. $450 feels a bit too close to the 3070; I don't think Nvidia wants to squish the lineup that much from the get-go. They usually leave bigger gaps and fill them in with Supers and Tis after the whole lineup is out, and that requires some room between prices, especially if they need to react to RDNA2. So I suspect the seller here is overshooting a bit, since people will be happier getting some money back than having to pay extra. Then again, a low price feels like an undershoot... perhaps it leaves room for haggling, and for getting people on board by having them invest, playing on the psychology that invested people are more likely to invest further. Then again, I'm purely guessing, since I have no idea how that market works.
 
@Endymio You sound like you are making excuses for Nvidia.

960 -> 1060: $250 -> $300 (20% increase)
1060 -> 2060: $300 -> $350 (17% increase)
2060 -> 3060: $350 -> $400 (14% increase)

Those are some big price hikes compared to inflation.
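For what it's worth, here's a quick back-of-the-envelope check of those jumps (note the $400 for the 3060 is still just the rumored figure, not a confirmed price):

```python
# Launch prices for Nvidia's x60 cards as quoted above.
# The 3060 at $400 is the rumored figure, not confirmed.
prices = [("960", 250), ("1060", 300), ("2060", 350), ("3060", 400)]

# Generation-over-generation percentage increase
for (old, p0), (new, p1) in zip(prices, prices[1:]):
    pct = (p1 - p0) / p0 * 100
    print(f"{old} -> {new}: ${p0} -> ${p1} (+{pct:.1f}%)")

# Cumulative jump across the three generations
total = (prices[-1][1] - prices[0][1]) / prices[0][1] * 100
print(f"960 -> 3060 overall: +{total:.0f}%")
```

That's a 60% cumulative increase from the 960 to the (rumored) 3060, well above general inflation over the same stretch.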

I can't wait till AMD makes them eat crow. Bring on competition.

I wouldn't count on this. Prices didn't drop dramatically with CPUs. Sure, they got better, but prices increased. AMD will put out a slightly better card at the same price. Unlike Intel, Nvidia hasn't been standing completely still.

Remember Vega? Or Radeon VII? Priced to Nvidia levels.
 
@Endymio You sound like you are making excuses for Nvidia.
I'm stating a fact. Inflation is a metric applicable only to identical products. Claiming a new product should be price-capped by an older one is an absurdly puerile stance. No "excuses" are needed for NVidia; they're a private company, they can sell their products for whatever price they wish, bound only by their own determination of self-interest. Your wants and desires are not laws of nature.
 
960 -> 1060: $250 -> $300 (20% increase)
1060 -> 2060: $300 -> $350 (17% increase)
2060 -> 3060: $350 -> $400 (14% increase)

Those are some big price hikes compared to inflation.

I can't wait till AMD makes them eat crow. Bring on competition.
AMD wants to make money too. They aren't going to price well below Nvidia and throw money away. If you can't produce enough to meet demand at $700, don't set the price at $600.
 
Personally, I'd put my bets on a $400 3060 Ti and a $350 3060. $450 feels a bit too close to the 3070; I don't think Nvidia wants to squish the lineup that much from the get-go. They usually leave bigger gaps and fill them in with Supers and Tis after the whole lineup is out, and that requires some room between prices, especially if they need to react to RDNA2. So I suspect the seller here is overshooting a bit, since people will be happier getting some money back than having to pay extra. Then again, a low price feels like an undershoot... perhaps it leaves room for haggling, and for getting people on board by having them invest, playing on the psychology that invested people are more likely to invest further. Then again, I'm purely guessing, since I have no idea how that market works.
That makes more sense. A 3060 Ti at $450 just doesn't make sense for Nvidia: why would you buy it when, for $50 more, you could have a card with roughly 1,000 more CUDA cores (granted, we don't know yet that the 3060 Ti will actually have that many fewer cores)? Tom's says the 3070 vs. 3060 Ti is essentially a 2080 Ti (basically confirmed) vs. a 2080 Super (unconfirmed). Roughly 15% faster for $50 more makes the 3070 a no-brainer, and the 3060 Ti a bad buy.

On a separate note, I have a bad feeling that the 3060 (which IMHO will be priced at $350, possibly around $320) will have just 6GB of VRAM, again. They CAN'T do a 3060 Ti at ~$400 with less than 8GB, because the 2060 Super is $400 with 8GB. And of course the 3060 Ti can't have more VRAM than a 3070, so it has to have 8GB. Considering Nvidia disappointed us on VRAM this generation, I don't see the 3060 getting 8GB either: that would match the 3060 Ti and the 3070, and be only 2GB less than a 3080. Oh, and this is another topic, but I just realized that Nvidia did shortchange us with the 3080: the $700 1080 Ti had 11GB, while the 3080 has 10GB :/
 
@Endymio You sound like you are making excuses for Nvidia.



I wouldn't count on this. Prices didn't drop dramatically with CPUs. Sure they got better, but prices increased. AMD will put out a slightly better card at same price. Unlike Intel, Nvidia hasn't been standing completely still.

Remember Vega? Or Radeon VII? Priced to Nvidia levels.

This was when crypto miners were buying out everything; it made me hold onto that GTX 670...
 
I'm stating a fact. Inflation is a metric applicable only to identical products. Claiming a new product should be price-capped by an older one is an absurdly puerile stance. No "excuses" are needed for NVidia; they're a private company, they can sell their products for whatever price they wish, bound only by their own determination of self-interest. Your wants and desires are not laws of nature.

Call it whatever you want. I don't think it's a fact, just an opinion. And my opinion is that you are biased.

We can still criticize them for bad pricing policy, especially after decades of better products at lower prices. "Laws of nature"? Really? So according to "Endymio's Law," an RTX 5080 that's 100% faster than an RTX 3080 should cost $1,400? And an RTX 7080 should be $2,800?

It's a law, after all, right? Did you know the semiconductor industry relied on "Moore's Law" for years, which stated that the transistor count on a chip doubles every two years? Meaning roughly the same cost with far better performance?

@spentshells The price increase happened with the GTX 1080, which was before the mining hype.

Again, AMD didn't bother reducing prices for its CPUs. Why would they do that for GPUs? All they need to do is be a little better in price/performance, as in 5-10%.

We're not in the year 2000 anymore, when AMD would price at a third of Intel despite the Athlon being better than the Pentium 4. It's sad, but that's what we've got now.
 
We can still criticize them for bad pricing policy, especially after decades of better products at lower prices.
The point you've missed is that those decades are an aberration that will eventually end -- Moore's Law has already ended, in fact, at least under its original formulation. There is no stone tablet from Mt Sinai declaring that DavidC must receive better products at lower prices each and every year of his lifetime.

"Laws of nature"? Really? So according to "Endymio's Law" RTX 5080 being 100% faster than RTX 3080 means it should cost $1400?
Actually, it should be priced at whatever point exactly balances supply with demand. And while I appreciate you attempting to give me credit, it's not my law; it's a law of economics.
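To illustrate the point with a toy model (all numbers here are invented for illustration, not estimates of the actual GPU market): the market-clearing price is simply where quantity supplied equals quantity demanded, and pricing below it produces exactly the "sold out everywhere" situation we're seeing.

```python
# Toy linear supply/demand curves. Every number is made up
# purely to illustrate equilibrium pricing.
def demand(p):           # units buyers would take at price p
    return max(0.0, 1000 - 1.0 * p)

def supply(p):           # units the vendor can ship at price p
    return max(0.0, 0.5 * p - 50)

# Equilibrium: 1000 - p = 0.5p - 50  =>  1.5p = 1050  =>  p = 700
p_eq = 1050 / 1.5
print(p_eq, demand(p_eq), supply(p_eq))  # 700.0 300.0 300.0

# Price it below equilibrium and demand outstrips supply:
print(demand(600) - supply(600))  # 150.0 units short at $600
```

In this made-up model, selling at $600 instead of $700 leaves 150 would-be buyers empty-handed, which is the "just a supply shortage" scenario.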
 
AMD wants to make money too. They aren't going to price well below Nvidia and throw money away. If you can't produce enough to meet demand at $700, don't set the price at $600.

While I agree, and 7nm is in short supply too, pricing that high would cost them goodwill. They no doubt have good products, but those products are NOT on par with the 30-series, given DLSS and RT.

AMD needs to win mindshare here. If they can deliver a 6900XT that runs as fast as a 6900 and offer it at 6800 Prices ($700), then we have a winner. 6800XT should be absolutely NO more than $650. $600 would be much better and my personal buying point.

I owned a 7970, a 580, and a 5700 XT. I have no problem going over to team green for usable ray tracing at $50-$100 more. And I loathe NVIDIA; I go for value. That's what kept me buying AMD in the past. But the value isn't there for "close enough" rasterization and poor RT.
 
Remember when 60 series were $250 and $300 (960 and 1060)?

$400 isn't exactly mainstream value. Work em over AMD!
Except the 1660, 1660 SUPER, and 1660 Ti were all well within that price range. It ultimately just comes down to whatever arbitrary model number they decide to market a given card with. The 2060 was arguably more of a successor to the 1070 than to the 1060; Nvidia just shifted model numbers to obscure the limited performance gains of the 20-series outside of things like raytracing.

As far as the size of the GPU chips goes, the one used in a 2060 or 2070 is over 40% larger than what was used in a 1070 or 1080, only a little over 5% smaller than what was used in a 1080 Ti or Titan Xp, and well over double the size of what was used in the 1060. Even the 1660 and 1660 Ti's chip is over 40% larger than what the 1060 used, and only around 10% smaller than what was used by the 1070 and 1080. Being stuck on what was essentially the same process node as the 10-series meant Nvidia couldn't increase performance significantly without making the chips substantially larger, which greatly limited performance gains; they tried to disguise that by numbering their higher-end cards differently from the previous generation.
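Those percentages check out against the commonly reported die areas (the mm² figures below are the widely published, rounded values, so treat them as approximate):

```python
# Approximate die areas (mm^2) as widely published; rounded values.
die = {
    "GP106": 200,  # GTX 1060
    "GP104": 314,  # GTX 1070 / 1080
    "GP102": 471,  # GTX 1080 Ti / Titan Xp
    "TU116": 284,  # GTX 1660 / 1660 Ti
    "TU106": 445,  # RTX 2060 / 2070
}

def diff(a, b):
    """Fractional size of die a relative to die b."""
    return die[a] / die[b] - 1

print(f"TU106 vs GP104: {diff('TU106', 'GP104'):+.0%}")   # over +40%
print(f"TU106 vs GP102: {diff('TU106', 'GP102'):+.0%}")   # a bit over -5%
print(f"TU106 vs GP106: {die['TU106'] / die['GP106']:.2f}x")  # well over 2x
print(f"TU116 vs GP106: {diff('TU116', 'GP106'):+.0%}")   # over +40%
print(f"TU116 vs GP104: {diff('TU116', 'GP104'):+.0%}")   # around -10%
```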

But they ARE NOT on par with 30 series due to DLSS and RT.
We don't actually know that though, as not much information is out there about AMD's raytracing implementation yet. For all we know, it could perform better than Nvidia's hardware, or perhaps be better at some effects and not as good at others. And AMD already has their own non-proprietary upscaling and sharpening tech that was in most ways better than the original implementation of DLSS, and is arguably not far behind DLSS 2.0 as far as the end-result is concerned.

If they can deliver a 6900XT that runs as fast as a 6900 and offer it at 6800 Prices ($700)
I think you may have used some wrong model numbers there. : 3

Personally, I think we're going to get relatively similar pricing for a given performance level compared to Nvidia's 30-series MSRPs. It seems to me that Nvidia likely obtained information about AMD's 6000-series plans many months ago, and found them to be too competitive at the high-end for their liking, causing them to launch their cards at lower price-points and with higher clocks and power draw than originally planned. That may also be why the VRAM gains are modest to nonexistent, since they likely shifted GPUs to different price points, and any VRAM increases had to go to make that happen.
 
Too bad the 3090 isn't really a Titan. It's more like a 3080 Ti.
https://www.youtube.com/watch?v=s23GvbQfyLA

The 3090 isn't a massively more expensive card with a single-digit percentage increase in performance over the gaming flagship, plus a giant pool of memory that isn't remotely useful for gaming? (Because Titan cards are not gaming cards.)
I'm not going to watch a 27-minute video to figure out why you said that, but the first two minutes of it definitely agree with me. And Nvidia agreed with me too when they called the 3080 their gaming flagship, a title previously held by the 2080 Ti.
Not to mention the obvious equivalencies in hardware, such as the 3080 and 2080 Ti having the same number of SMs, and the 102 die designation being split between the Titan and the 2080 Ti/3080 gaming flagship, with a smaller die used for the 2080/3070.
It could not be more obvious that Nvidia renamed the cards in an attempt to "embarrass" AMD by reducing their flagship's branding down to 3080, a lower-tier and less-expensive-sounding card in their (still very overpriced) product stack.
 
It could not be more obvious that Nvidia renamed the cards in an attempt to "embarrass" AMD by reducing their flagship's branding down to 3080, a lower-tier and less-expensive-sounding card in their (still very overpriced) product stack.
You were doing very well until that last bit. With cards flying off the shelves far faster than NVidia can mint the chips, their products are anything but "overpriced" at present.

(I'll now wait for someone to bite and reply that it's "just a supply shortage" issue....as if supply, demand, and price weren't all interrelated)
 
I'm not going to watch a 27 minute video

And why not? Don't want to admit that maybe you've been misled too? Nvidia themselves called this a "Titan-class" card, which, IF you had watched the video, you'd know it is NOT. If it were, it would have 48GB of RAM, not 24: each Titan generation, except for one, doubled the RAM and charged a lot more for it, in this case $1,000 more. Not to mention its performance versus the top non-Titan card is worse than previous Titans'. As I said, this is more of a 3080 Ti-class card than a Titan.