At $400 it is far too expensive.
On a personal basis (i.e., "I can't afford this"), sure. For the value? No, it isn't.
At $400 it is far too expensive.
My suggestion would be to forget about 4K, since at typical viewing distances the relatively minor differences in sharpness will likely be difficult to perceive while gaming, and are arguably not worth the significant reduction in frame rates.
Yeah, I'm not sure why a PC gaming enthusiast would want their PC to be less capable than current-gen consoles now!! Forget about 4K?? This is the new trend. All my friends upgraded to 4K but two... and with HDR only "perfect" for 4K panels, you don't know what you are talking about!
Also, with LG OLED TVs now supporting G-Sync, they are the way to go.
Another paper launch. How many millions of dollars' worth did Nvidia sell to miners this time? Or is it "not powerful enough"... The tech industry is becoming a joke.
Forget about 4K?? This is the new trend. All my friends upgraded to 4K but two... and with HDR only "perfect" for 4K panels, you don't know what you are talking about!
Also, with LG OLED TVs now supporting G-Sync, they are the way to go.
Using a 4K screen is fine (as it's more or less all you can find in televisions now), but that doesn't mean the extra pixels will make a significant difference to one's viewing experience. As far as watching video goes, aside from perhaps 4K Blu-ray discs, most video sources are compressed so heavily as to make the extra resolution over 1440p largely redundant. Any 4K streaming service will appear a bit blurry on a 4K screen when viewed up close due to the low bitrates and their resulting artifacts, with the amount of data being transferred typically lower than even a 1080p Blu-ray, and as a result such content tends to look rather similar on a 1440p display.
Yeah, I'm not sure why a PC gaming enthusiast would want their PC to be less capable than current-gen consoles now!!
The new consoles will be making heavy use of upscaling in most games to output a "4K" signal at a reasonable performance level, just as the prior-gen "4K" consoles did. You might occasionally get native 4K in some ports of games designed for older hardware, but the actual rendering resolution of most titles will likely be around 1440p or lower, with framerates targeting 60 or even 30 fps in some cases, especially as the hardware ages and more-demanding games are developed.
I still have absolutely no idea why Nvidia is using the "Ti" branding for this product.
That branding made some sense when they were cutting down a Titan X to make a 1080 Ti... What is it even supposed to mean at this point, when there's no "Titan" card and no base RTX 3060?
Just a name. Why does the 3060 Ti need a base card like the 3060 to exist first? So far the naming only indicates a card's position in the stack; it was never meant to mark the card as a refresh or anything like that. Heck, Nvidia could even have called it the 3060 Super if they wanted to. A 3060 does not need to exist before a 3060 Ti or 3060 Super.
So you plan to use a 60-80" LG screen for your PC?
Why?
Eventually, high-refresh 4K panels with OLED (or similar-performing technology) will make sense for PC gaming, but I don't think we're quite there yet.
If you look at historic prices for GPUs (Tom's should do some homework), you'll find I'm right, even taking inflation into account. GPUs have simply had a recent shift in their pricing hierarchy, where (we're supposed to believe) the current "mainstream" pricing is the old "high-end" pricing.
This isn't true, especially when adjusting for inflation. Go back 10 years, GTX 480 was $500 at launch, $600 adjusted for inflation. Go back 20 years to the GeForce 2 Ultra: MSRP was $500, which is over $750 adjusted for inflation. Go back even further, to 1998: the Voodoo2's launch price was $300. That's $480 adjusted, and you still needed a 2D video card to pair with it.
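For anyone who wants to sanity-check that math, here's a rough Python sketch; the inflation multipliers are approximations back-derived from the figures above, not values from an official CPI table:

```python
# Rough sketch: adjusting historical GPU launch prices for inflation.
# adjusted_price = launch_price * (price_level_now / price_level_then).
# The multipliers below are approximations implied by the post's own
# numbers, not official CPI data.
INFLATION_MULTIPLIER_TO_2020 = {
    1998: 1.60,  # Voodoo2 era
    2000: 1.51,  # GeForce 2 Ultra era
    2010: 1.20,  # GTX 480 era
}

def adjusted_price(launch_price: float, launch_year: int) -> float:
    """Convert a historical launch price into approximate 2020 dollars."""
    return launch_price * INFLATION_MULTIPLIER_TO_2020[launch_year]

for name, price, year in [
    ("Voodoo2", 300, 1998),
    ("GeForce 2 Ultra", 500, 2000),
    ("GTX 480", 500, 2010),
]:
    print(f"{name}: ${price} in {year} is roughly ${adjusted_price(price, year):.0f} today")
```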
Go back 10 years, GTX 480 was $500 at launch, $600 adjusted for inflation.
Are you getting more performance and features for your $400 today than at any point in history?
"The RTX 3060 Ti is affordable" and "for Only $399"?
$400 is not affordable by any means; maybe $200-250 is, but not $400.
The equivalent card from that "performance tier" in this generation, the RTX 3080, launched at $699. So the price of that tier is now higher than before after adjusting for inflation, which was the point nitrium was making.
I can see where you're coming from, but you could make the same argument even if it was $700, as it beats the 2080 Super. I think that would be a hard sell for a card in the middle of the product lineup though. We should generally expect to get more for our money than in the past. Otherwise the price would be growing along with the performance and features, and even budget cards would require a personal loan now.
From a consumer perspective, the only logical way to compare value is to pick price points and compare what you get now for that cost vs what you got in the past. Whether you're shopping for a $100 card or a $5000 card, the same process applies. It makes no sense to try and compare based on product names arbitrarily picked by a company.
Can't we all agree that affordability is subjective? A 4k card for ~700 USD is what you're paying for when you have a 4k monitor that costs almost the same price as the card.
You might want to check monitor prices. That tsunami of 4k displays hit earlier this year. They're quite affordable now. $700 for a monitor is ridiculous. Try half that or less. Also keep in mind that VR eats up a lot of pixels. People looking for 4k performance might be running a pair of VR 3k or 2k eyeball displays that can be demanding on GPUs. Decent VR is now starting at $300 too. GPU prices have been pushed up way too far relative to other system components.
Just a name. Why does the 3060 Ti need a base card like the 3060 to exist first? So far the naming only indicates a card's position in the stack; it was never meant to mark the card as a refresh or anything like that. Heck, Nvidia could even have called it the 3060 Super if they wanted to. A 3060 does not need to exist before a 3060 Ti or 3060 Super.
From a consumer perspective, the only logical way to compare value is to pick price points and compare what you get now for that cost vs what you got in the past.
That is actually usually the least logical way to compare value when it comes to technology. If everybody took that linear perspective, then we would be paying $15,000 for computers that are exactly 10x more powerful than a Commodore 64.
Almost any argument breaks down when taken to the extreme. When was the last time you bought an electronic device and determined whether you were getting good value by comparing it to something that's almost 40 years old from the same category? No one with common sense would do that. When you buy a new TV, you compare it to models from the last few years, not a black-and-white RCA tube screen from the 1950s.
Which is OK, because video games would have died anyway when publishers started trying to charge microtransactions for features that used to be free.
You might want to check monitor prices. That tsunami of 4k displays hit earlier this year. They're quite affordable now. $700 for a monitor is ridiculous. Try half that or less. Also keep in mind that VR eats up a lot of pixels. People looking for 4k performance might be running a pair of VR 3k or 2k eyeball displays that can be demanding on GPUs. Decent VR is now starting at $300 too. GPU prices have been pushed up way too far relative to other system components.
I'm waiting until 2021 to get the Acer XB23QK, but right now, anything that's 4k, 32", <1ms response time at 144hz on an IPS panel?
Ok, my mistake. I wasn't looking at anything above 100hz. You're right that the 144hz 4k monitors are stupid expensive. I don't play multiplayer competitive shooters, so top-tier monitor refresh rates don't matter to me.
So we are just left comparing everything to the overpriced/unpopular RTX 20 series.
We get 2070 Super performance a year and a half later for a <$100 discount? Wowee. How generous of Nvidia...
But why are they trying to sell it to people who would have never bought a 2070 Super, even at $400? And why aren't more reviewers calling them out for it?
Why does this keep getting repeated? ... The only thing we can really fault Nvidia for here is altering the naming scheme with the 20-series to help disguise the mediocre performance gains that generation.
You answered your own question: Nvidia's naming implies it to most people.
This isn't true, especially when adjusting for inflation.
I guess it depends on where you are in the world. In September 2014, the price for an EVGA GTX 970 SC was £280, and a GTX 980 was around £480 (I can't remember exactly how much, as I decided it was too expensive and got the 970). That's equivalent to £321 [970] and around £550 [980] today (using RPI, which is an inaccurate and inherently inflationary measure of inflation). Looking at the prices for the 3000 series on Overclockers (the 3060 isn't listed):
Because without a 3060, the "it's a little better than a 3060" branding doesn't mean anything.
But why are they trying to sell it to people who would have never bought a 2070 Super, even at $400? And why aren't more reviewers calling them out for it?
Meanwhile, let me know when somebody launches a "mainstream" gaming card that doesn't cost more than a console, or more than every other component in a (well-balanced gaming) PC combined - and I say that as somebody who has never spent less than $350 on a graphics card.
At $400 it is far too expensive.