Nvidia GeForce RTX 3060 Ti Founders Edition Review: Ampere for Only $399


nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
My suggestion would be to forget about 4K, since at typical viewing distances, the relatively minor differences in perceived sharpness will likely be difficult to perceive while gaming, and are arguably not worth the significant reduction in frame rates.

Forget about 4K?? This is the new trend. All my friends but two upgraded to 4K... and with HDR only "perfect" on 4K panels, you don't know what you are talking about!

Also, with LG OLED TVs now supporting G-Sync, they are the way to go.
 
  • Like
Reactions: Gurg
Forget about 4K?? This is the new trend. All my friends but two upgraded to 4K... and with HDR only "perfect" on 4K panels, you don't know what you are talking about!

Also, with LG OLED TVs now supporting G-Sync, they are the way to go.
Yeah, I'm not sure why a PC gaming enthusiast would want their PC to be less capable than current-gen consoles now!!
 

Freestyle80

Reputable
Aug 11, 2020
37
11
4,535
Another paper launch. How many millions of dollars' worth did Nvidia sell to miners this time? Or is it "not powerful enough"... the tech industry is becoming a joke.


Dunno about your region, but unlike AMD's 6000 series, Nvidia's 3070 has been in plentiful supply where I am; many OEMs sold 3070 systems during Black Friday itself.

I expect the 3060 Ti to be the same here.

Forget about 4K?? This is the new trend. All my friends but two upgraded to 4K... and with HDR only "perfect" on 4K panels, you don't know what you are talking about!

Also, with LG OLED TVs now supporting G-Sync, they are the way to go.

So you plan to use a 60-80" LG screen for your PC?

Why?

Stop comparing consoles to PCs. Consoles are usually played on a big screen, where the resolution increase is far more noticeable; at 27" you'll be hard-pressed to see the difference between 2K and 4K.
 
Forget about 4K?? This is the new trend. All my friends but two upgraded to 4K... and with HDR only "perfect" on 4K panels, you don't know what you are talking about!

Also, with LG OLED TVs now supporting G-Sync, they are the way to go.
Using a 4K screen is fine (it's more or less all you can find for televisions now), but that doesn't mean the extra pixels will make a significant difference to one's viewing experience. As far as watching video goes, aside from perhaps 4K Blu-ray discs, most video sources are compressed heavily enough to make the extra resolution over 1440p largely redundant. Any 4K streaming service will appear a bit blurry on a 4K screen when viewed up close due to the low bit rates and their resulting artifacts; the amount of data being transferred is typically lower than even a 1080p Blu-ray's, so such content tends to look rather similar on a 1440p display.
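To put rough numbers on that (the bitrates here are assumed ballpark figures, not measurements), here's a quick bits-per-pixel comparison in Python:

```python
# Back-of-the-envelope bits-per-pixel comparison.
# Bitrates are rough, assumed figures for illustration only.
def bits_per_pixel(mbps: float, width: int, height: int, fps: int = 24) -> float:
    """Average encoded bits available per pixel per frame."""
    return mbps * 1_000_000 / (width * height * fps)

print(f"4K stream (~16 Mbps): {bits_per_pixel(16, 3840, 2160):.2f} bpp")      # ~0.08
print(f"1080p Blu-ray (~30 Mbps): {bits_per_pixel(30, 1920, 1080):.2f} bpp")  # ~0.60
```

By that rough measure, a 1080p Blu-ray has several times more data per pixel to work with than a typical 4K stream.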

As for games, sure, you can render them at native 4K, but the performance hit is very large for what amounts to a minor increase in sharpness at typical viewing distances. Many would argue that turning up the graphics options and getting higher frame rates at 1440p results in a better experience overall. Of course, we're seeing new upscaling options gaining traction that allow lower resolutions to be upscaled fairly well, and something like DLSS or AMD's alternative has the potential to make below-native resolutions look a lot better in supported games.
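For a sense of the raw pixel cost (simple arithmetic, independent of any particular game):

```python
# Native 4K pushes 2.25x the pixels of 1440p, which is roughly
# why the frame-rate hit at the same settings is so large.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400
print(pixels_4k / pixels_1440p)  # 2.25
```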

However, that brings us to the second issue: the current lack of reasonably priced high-refresh 4K gaming monitors. The only 4K screens that accept 120+ Hz input under $1000 are a handful of 27" models, and that size is arguably a bit small to gain much benefit from the extra pixels. And OLED panels are still a bit questionable for heavy PC use, as they remain subject to burn-in from displaying the same image for extended periods, which is part of why you still don't commonly see the technology used even in high-end PC monitors.

Eventually, high-refresh 4K panels with OLED (or similar-performing technology) will make sense for PC gaming, but I don't think we're quite there yet, at least not for anything outside of home theatre PC use, where 1440p hasn't really been marketed as an option.

Yeah, I'm not sure why a PC gaming enthusiast would want their PC to be less capable than current-gen consoles now!!
The new consoles will be making heavy use of upscaling in most games to output a "4K" signal at a reasonable performance level, just as the prior-gen "4K" consoles did. You might occasionally get native 4K in some ports of games designed for older hardware, but the actual rendering resolution of most titles will likely be around 1440p or lower, with framerates targeting 60 or even 30fps in some cases, especially as the hardware ages and more-demanding games are developed.

And it's arguably fine that they won't often be rendering at 4K, since again, most people won't really notice much difference between 1440p and 4K while actually playing a game at typical viewing distances. The limited performance of these consoles is better put toward maintaining higher frame rates and rendering more detailed game assets rather than needlessly putting a significant amount of processing power toward making the image slightly sharper.
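For the curious, the "hard to notice at typical viewing distances" claim can be sanity-checked with a pixels-per-degree estimate. A minimal sketch, assuming a 27" 16:9 panel, a 36" viewing distance, and the commonly cited ~60 PPD threshold for 20/20 acuity:

```python
import math

# Pixels per degree (PPD) of horizontal field of view.
# Screen size, viewing distance, and the ~60 PPD acuity
# threshold are all assumptions for illustration.
def ppd(h_pixels: int, diag_in: float, dist_in: float, aspect=(16, 9)) -> float:
    width_in = diag_in * aspect[0] / math.hypot(*aspect)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return h_pixels / fov_deg

print(f"1440p: {ppd(2560, 27, 36):.0f} PPD")  # ~71, already past ~60
print(f"4K:    {ppd(3840, 27, 36):.0f} PPD")  # ~106
```

If both resolutions already exceed the acuity threshold at your viewing distance, the extra pixels are largely going to waste.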
 
I still have absolutely no idea why Nvidia is using the "Ti" branding for this product.
That branding made some sense when they were cutting down a Titan X to make a 1080 Ti... What is it even supposed to mean at this point, when there's no "Titan" card and no base RTX 3060?
Just a name. Why does a 3060 Ti need a base card like the 3060 to exist first? So far, the naming only indicates the card's position in the stack; it was never meant to mark the card as a refresh or anything like that. Heck, Nvidia could even have called it the 3060 Super if they wanted to. A 3060 does not need to exist before a 3060 Ti or a 3060 Super.
 
Meh. I'd wait for the RTX 3060 (non-Ti). This thing is far too expensive, IMO, for anything you would consider "mainstream". If you look at historic prices for GPUs (Tom's should do some homework), you'll find I'm right, even taking inflation into account. GPUs have recently had a shift in their pricing hierarchy, where (we're supposed to believe) the current "mainstream" pricing is the old "high-end" pricing. I don't know what they would have called the RTX 3090 or even the RTX 3080 back in the old days. "Rip-off" springs to mind.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
So you plan to use a 60-80" LG screen for your PC?

Why?

48 inch only: G-Sync, 120 Hz, 1000-nit HDR, 1 ms response time, 10-bit color (billions of colors), FreeSync, ALLM (auto low-latency mode), 40 W speakers including a subwoofer...

https://www.lg.com/us/tvs/lg-oled48cxpub-oled-4k-tv

In the past I was using three 27-inch monitors for gaming; with this wonderful TV at 48 inches, there is no need to look at ugly bezels in between anymore!

Eventually, high-refresh 4K panels with OLED (or similar-performing technology) will make sense for PC gaming, but I don't think we're quite there yet,

For the enthusiast? We are there; you just need to spend $1500. And 48 inches is not that huge, either: a triple 27-inch screen setup is wider!
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
If you look at historic prices for GPUs (Tom's should do some homework), you'll find I'm right, even taking inflation into account. GPUs have recently had a shift in their pricing hierarchy, where (we're supposed to believe) the current "mainstream" pricing is the old "high-end" pricing.
This isn't true, especially when adjusting for inflation. Go back 10 years, GTX 480 was $500 at launch, $600 adjusted for inflation. Go back 20 years to the GeForce 2 Ultra: MSRP was $500. Adjusted for inflation, that's over $750. Go back even further to 1998: the Voodoo 2's launch price was $300. That's $480 adjusted, and you still needed a 2D video card to pair with it.

Also, it makes no difference what the cards are called, or where they fall in the stack. Are you getting more performance and features for your $400 today than at any point in history? Absolutely. Those cards I listed from years past are completely unusable today.
 

randomizer

Champion
Moderator
Go back 10 years, GTX 480 was $500 at launch, $600 adjusted for inflation.

The equivalent card from that "performance tier" in this generation, the RTX 3080, launched at $699. So the price of that tier is now higher than before after adjusting for inflation, which was the point nitrium was making. The same is largely true for other tiers. E.g., if we look at the tier below it, usually the "x70" models, we see the following (adjusted for inflation):

GTX 275 (2009): $300
GTX 470 (2010): $417
GTX 670 (2012): $452
GTX 770 (2013): $446
GTX 970 (2014): $359
GTX 1070 (2016): $485 (FE) / ~$409 (partners)
RTX 2070 (2018): $616 (FE) / ~$513 (partners)
RTX 3070 (2020): $499 (FE) / $499+ (partners)

There are some anomalies (both good and bad), but the general trend is a slow increase in real terms. It's worse if you include the Ti/Super models from Pascal onwards in this tier, but I chose the most conservative prices to compare.

If you compare the RTX 3060 Ti with the roughly equivalent cards from the 400 series, you'll see that the GTX 460 is ~$273 adjusted and the GTX 465 is ~$330 adjusted, which is quite a bit less than $400.
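If you want to reproduce those adjusted figures, it's just a CPI ratio. A minimal sketch, assuming approximate US CPI-U annual averages and the cards' US launch prices:

```python
# Inflation adjustment as a CPI ratio.
# CPI values are approximate US CPI-U annual averages, assumed for illustration.
CPI = {2010: 218.1, 2020: 258.8}

def adjusted_price(price: float, from_year: int, to_year: int = 2020) -> float:
    """Scale a historical price into to_year dollars."""
    return price * CPI[to_year] / CPI[from_year]

print(f"GTX 460 1GB ($229 in 2010): ${adjusted_price(229, 2010):.0f}")  # ~$272
print(f"GTX 465 ($279 in 2010): ${adjusted_price(279, 2010):.0f}")      # ~$331
```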

Are you getting more performance and features for your $400 today than at any point in history?

I can see where you're coming from, but you could make the same argument even if it was $700, as it beats the 2080 Super. I think that would be a hard sell for a card in the middle of the product lineup though. We should generally expect to get more for our money than in the past. Otherwise the price would be growing along with the performance and features, and even budget cards would require a personal loan now.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
Funny, the author says:
The RTX 3060 Ti is affordable
and
for Only $399
$400 is not affordable by any means; maybe $200-250 is, but not $400.
And no, it will not cost $400, except for a few cards sold directly by Nvidia to a few select countries in the world. The key words here are few and few...
Everyone else will either have to be a fool and pay $500-600 prices now, or wait at least 3 if not 6 months for prices to come down to normal.
Yes, it has good performance, but the fact that an xx60 card now costs $400+ is really <Mod Edit> from Nvidia (and I expect AMD to follow suit, so they won't have better prices for this performance tier).
 
Last edited by a moderator:

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
The equivalent card from that "performance tier" in this generation, the RTX 3080, launched at $699. So the price of that tier is now higher than before after adjusting for inflation, which was the point nitrium was making.

That's not what he said. He said the $400 MSRP of the 3060 Ti is equivalent to high-end pricing of the past, and with a few exceptions, that is factually incorrect. The high end has typically been well above $400, often before even adjusting for inflation, as I pointed out.

I can see where you're coming from, but you could make the same argument even if it was $700, as it beats the 2080 Super. I think that would be a hard sell for a card in the middle of the product lineup though. We should generally expect to get more for our money than in the past. Otherwise the price would be growing along with the performance and features, and even budget cards would require a personal loan now.
From a consumer perspective, the only logical way to compare value is to pick price points and compare what you get now for that cost vs what you got in the past. Whether you're shopping for a $100 card or a $5000 card, the same process applies. It makes no sense to try and compare based on product names arbitrarily picked by a company.
 
Can't we all agree that affordability is subjective? A 4K card for ~$700 makes sense when you have a 4K monitor that costs almost as much as the card.
This is notoriously a rather 'expensive' hobby, depending on what you want to do. The 3060 Ti priced at $400 and positioned as the "go-to" for 2K gaming at high frame rates?
I say that's reasonable. Double the price, you can double the resolution.
At least in the current climate. Prices for the same hardware fall as time goes on, and even when the newer stuff comes out, we're still in the same ballpark for a pretty good system: around $1500-2000.
 

bigdragon

Distinguished
Oct 19, 2011
1,111
553
20,160
I don't like this situation where the GPU is the most expensive component in a computer. The CPU used to be the most expensive part. It's also hard to continue supporting the PC gaming ecosystem when a decent GPU costs more than a console of equivalent performance/experience. So many games are anti-modding or anti-user content these days that PC gaming has lost its edge.

Can't we all agree that affordability is subjective? A 4k card for ~700 USD is what you're paying for when you have a 4k monitor that costs almost the same price as the card.
You might want to check monitor prices. That tsunami of 4K displays hit earlier this year; they're quite affordable now. $700 for a monitor is ridiculous. Try half that or less. Also keep in mind that VR eats up a lot of pixels: people looking for 4K-class performance might be running a pair of demanding 3K or 2K per-eye VR displays. Decent VR now starts at $300 too. GPU prices have been pushed up way too far relative to other system components.
 

Giroro

Splendid
Just a name. Why 3060Ti need base card like 3060 to exist first? So far all the naming only indicate the cards positioning in the stack. It did not meant to dictate the card as a refresh or anything like that. Heck nvidia can even called it 3060 super if they want to. 3060 does not need to exist first before 3060Ti or 3060 super.

Because without a 3060, the "it's a little better than a 3060" branding doesn't mean anything. There's no frame of reference to indicate to customers what the card's position in the stack should be when there is no stack. Especially since, as others have pointed out, the xx60 cards are no longer targeted at the mainstream/midrange. Branding matters. The GTX 1060 is still the most popular card on Steam. This is at least partly because, instead of offering its customers an upgrade path to a new ~120 W card for $200-$250, Nvidia is trying to force an "upsell" path into a higher pricing/power tier.

So we are just left comparing everything to the overpriced/unpopular RTX 20 series.
We get 2070 Super performance a year and a half later for a <$100 discount? Wowee. How generous of Nvidia...
But why are they trying to sell it to people who would have never bought a 2070 Super, even at $400? And why aren't more reviewers calling them out for it?

Meanwhile, let me know when somebody launches a "mainstream" gaming card that doesn't cost more than a console, or every other component in a (well balanced gaming) PC combined - and I say that as somebody who has never spent less than $350 on a graphics card.
 
  • Like
Reactions: King_V

Giroro

Splendid
From a consumer perspective, the only logical way to compare value is to pick price points and compare what you get now for that cost vs what you got in the past.

That is actually usually the least logical way to compare value when it comes to technology. If everybody took that linear perspective, then we would be paying $15,000 for computers that are exactly 10x more powerful than a Commodore 64.
Which is ok, because video games would have died when publishers started trying to charge microtransactions for features that used to be free.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
That is actually usually the least logical way to compare value when it comes to technology. If everybody took that linear perspective, then we would be paying $15,000 for computers that are exactly 10x more powerful than a Commodore 64.
Which is ok, because video games would have died when publishers started trying to charge microtransactions for features that used to be free.
Almost any argument breaks down when taken to the extreme. When was the last time you bought an electronic device and determined whether you were getting good value by comparing it to something from the same category that's almost 40 years old? No one with common sense would do that. When you buy a new TV, you compare it to models from the last few years, not a black-and-white RCA tube screen from the 1950s.
 
You might want to check monitor prices. That tsunami of 4K displays hit earlier this year; they're quite affordable now. $700 for a monitor is ridiculous. Try half that or less. Also keep in mind that VR eats up a lot of pixels: people looking for 4K-class performance might be running a pair of demanding 3K or 2K per-eye VR displays. Decent VR now starts at $300 too. GPU prices have been pushed up way too far relative to other system components.

I'm waiting until 2021 to get the Acer XB23QK, but right now, anything that's 4K, 32", <1 ms response time at 144 Hz on an IPS panel?
You got what, this?
https://www.amazon.com/ASUS-XG27UQ-...ywords=4K+Monitor+144Hz&qid=1606933952&sr=8-5

or the Acer XB273K, or the Acer Nitro...

You can settle for a lesser 4K display, but I'm talking about the meat and potatoes of the 4K displays that everyone is typically after.
 

bigdragon

Distinguished
Oct 19, 2011
1,111
553
20,160
I'm waiting until 2021 to get the Acer XB23QK, but right now, anything that's 4K, 32", <1 ms response time at 144 Hz on an IPS panel?
OK, my mistake. I wasn't looking at anything above 100 Hz. You're right that the 144 Hz 4K monitors are stupid expensive. I don't play competitive multiplayer shooters, so top-tier monitor refresh rates don't matter to me.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
So we are just left comparing everything to the overpriced/unpopular RTX 20 series.
We get 2070 Super performance a year and a half later for a <$100 discount? Wowee. How generous of Nvidia...
But why are they trying to sell it to people who would have never bought a 2070 Super, even at $400? And why aren't more reviewers calling them out for it?

No. The 3060 Ti is faster than a 2080 Super, which was a $700 card. So you're getting a $300 discount, over 40%. The 3060 Ti is also over 20% faster on average than a 2070 Super.
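(Quick sanity check on that percentage, using the MSRPs above:)

```python
# $700 (2080 Super MSRP) down to $400 (3060 Ti MSRP).
discount = (700 - 400) / 700
print(f"{discount:.0%}")  # 43%, i.e. "over 40%"
```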


 
Last edited:
OK, my mistake. I wasn't looking at anything above 100 Hz. You're right that the 144 Hz 4K monitors are stupid expensive. I don't play competitive multiplayer shooters, so top-tier monitor refresh rates don't matter to me.

I could've been more specific, haha! But I've read that with the newest console generation, these high-refresh-rate 4K displays should become cheaper and more available next year, because I really am not trying to pay the price of my GPU for my monitor. Now, HALF the price of my GPU?
Here's hoping companies double down for these consoles
 
  • Like
Reactions: bigdragon
FWIW, the categories of GPU prices have changed over time. Here's my take for today's market:

<$150: Don't bother. Seriously, skip it. I mean, if you want a previous gen GPU that will be okay, fine, but best-case scenario you get a GTX 1650 for that much money.
$151-$200: The new 'budget' range. GTX 1650 Super is probably the best option, RX 5500 XT 8GB as a runner up, but I'd still look to the next tier.
$201-$275: Lower mainstream. Yup, there are tiers of mainstream now. GTX 1660 Super if you can find it for $230 is good, RX 5600 XT at $250-$275 is better.
$276-$400: Upper mainstream. The RTX 3060 Ti lands at the top of this range. The thing is, if you're looking at any other GPU in this price bracket, it will be worse. RTX 2060 for $300? That's 35-40% slower for 25% less money. $390 for an RX 5700 XT would be even worse.
$401-$550: Lower high-end. Yeah, this gets messy. RTX 3070 or RX 6800 are the best options, and I'd lean toward the 3070 because of DLSS and RT, but whatever.
$551-$800: High-end. RTX 3080 gets my pick here, maybe RX 6800 XT if you can find it at MSRP.
$1000+: Enthusiast/Extreme. There's a gap in pricing, but everything $1000 and up is the domain of the enthusiast with deep pockets. Not generally recommended for most gamers.

As far as having a GPU cost more than any other component, for gaming it's by far the most important component so that's fine by me. CPUs have largely stagnated in their importance, unless you're doing 3D rendering (and even then a good GPU can help).
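If you wanted those tiers as a quick lookup (purely illustrative; the cut-offs and picks are just my opinions from the list above, not data):

```python
# Purely illustrative encoding of the GPU tiers described above.
TIERS = [
    (150, "skip it / previous-gen (GTX 1650 class)"),
    (200, "budget: GTX 1650 Super or RX 5500 XT 8GB"),
    (275, "lower mainstream: GTX 1660 Super or RX 5600 XT"),
    (400, "upper mainstream: RTX 3060 Ti"),
    (550, "lower high-end: RTX 3070 or RX 6800"),
    (800, "high-end: RTX 3080 or RX 6800 XT"),
]

def recommend(budget: float) -> str:
    for ceiling, pick in TIERS:
        if budget <= ceiling:
            return pick
    return "enthusiast/extreme: deep pockets required"

print(recommend(400))  # upper mainstream: RTX 3060 Ti
```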
 
  • Like
Reactions: DMAN999
Why does this keep getting repeated? ... The only thing we can really fault Nvidia for here is altering the naming scheme with the 20-series to help disguise the mediocre performance gains that generation.
You answered your own question - NVIDIA's naming implies it to most people. ;)

This isn't true, especially when adjusting for inflation.
I guess it depends on where you are in the world. In September 2014, the price for an EVGA GTX 970 SC was £280, and a GTX 980 was around £480 (I can't remember exactly how much, as I decided it was too expensive, and got the 970) - that's equivalent to £321 [970] and around £550 [980] today (using RPI, which is an inaccurate and inherently inflationary measure of inflation). Looking at the prices for the 3000 series on Overclockers (3060 isn't listed):
  • 3060 Ti: £380-470 (both Palit models)
  • 3070: £519-699 (Inno3D & Gigabyte models)
  • 3080: £729-£950 (EVGA & Inno3D models)
I can't remember the exact prices, but the 1000-series and 2000-series saw price increases over the previous generation when comparing model numbers (i.e. comparing a 970 to a 1070 to a 2070). Obviously, Sterling has declined against the Dollar over this period, but that doesn't account for all of the increases.
 
  • Like
Reactions: nitrium
Because without a 3060, the "it's a little better than a 3060" branding doesn't mean anything.

As I said, it's just a name. At best, it tells us there will be a slower GPU called the RTX 3060 in the future. The GTX 660 Ti also existed before the GTX 660. And look at what Nvidia did with the 500 series: we had GTX 560 > GTX 560 Ti > GTX 570. Toward the end of the 500 series' life cycle, Nvidia decided to sell the defective GF110 chips that couldn't even make GTX 570 spec, and ended up calling them the GTX 560 Ti 448 Cores. Don't get too hung up on the naming scheme. This is more or less like when people argue that Nvidia's Gx104 should be their x60 card instead of their x80. In the end, none of that matters to us as consumers.

But why are they trying to sell it to people who would have never bought a 2070 Super, even at $400? And why aren't more reviewers calling them out for it?

What for? Nvidia is not a charity. They can sell their x60 for $600 if they want to, just like they ended up selling the RTX 2080 Ti at $1,200 instead of $700 like the GTX 1080 Ti. If you don't like it, simply vote with your wallet, but don't force reviewers to agree with what you think.

Meanwhile, let me know when somebody launches a "mainstream" gaming card that doesn't cost more than a console, or every other component in a (well balanced gaming) PC combined - and I say that as somebody who has never spent less than $350 on a graphics card.

Then just buy a console? Isn't it that simple? PC is all about options. You can build a complete PC for the price of a console, but don't expect the raw performance to be similar, since console makers have other considerations when pricing their hardware: even if they sell at or below cost, they usually hope to recoup the loss through the subscriptions and services tied to the console. The only issue I see with PC gaming is that some people let themselves fall into the trap, created by hardware makers like Nvidia/AMD/Intel, called "high-end gaming," and then get angry when they have to spend big money to play their games on such a setup.