Question 4070/4060Ti/4060 Thoughts

Ar558
Dec 13, 2022
Given the disaster that is the 4070 Ti, I was thinking the cards lower down the stack are surely going to be just as bad in terms of spec and value. If we remember that the 4070 Ti was meant to be an 80-class card, surely the 60 Ti/60 will have only a 128-bit bus and 8GB of RAM at best. I'm sure Nvidia will ensure the clocks are enough to marginally beat their predecessors overall, but given the MSRPs are likely to be at least 20% higher than the 30-series equivalents, any gains will fall below the 1% performance per 1% price-increase level, making them woefully bad value. Hopefully AMD will actually punish them, but as it stands Nvidia is single-handedly trying to kill PC gaming.
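To illustrate that value argument, here is a minimal sketch of the performance-per-dollar comparison. The prices and gains below are hypothetical placeholders, just to show the break-even logic the post describes:

```python
# Hypothetical numbers to illustrate the "1% performance per 1% price increase" break-even point.
# A new card is only better value if its performance-per-dollar ratio beats the old card's.

old_price, old_perf = 399.0, 100.0   # assumed 30-series MSRP and normalized performance
new_price = old_price * 1.20         # assumed 20% MSRP increase for the successor
new_perf = old_perf * 1.15           # assumed 15% performance gain

old_value = old_perf / old_price     # performance per dollar, old card
new_value = new_perf / new_price     # performance per dollar, new card

print(f"Old: {old_value:.3f} perf/$, New: {new_value:.3f} perf/$")
print("Better value" if new_value > old_value else "Worse value than the card it replaces")
```

With these assumed numbers the gain (15%) is smaller than the price increase (20%), so the newer card comes out worse on performance per dollar.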
 
As Steve from Gamers Nexus has said, this entire lineup is a disaster, and I'm going to lean on that since he has a little more insight into this than I do. The walkout by EVGA is now starting to make sense, since it seems that Nvidia wants us and their board partners to follow them into insanity. Yeah, the heating issue with AMD cards is not helping, but at the very least I know this is one generation of GPUs I'm probably dodging until the next one comes along.
 
My thought is that IC manufacturing has reached a point where the golden age of getting performance for free or cheaply because you moved onto the next process node is long gone. Everyone on the bleeding edge is going to continue charging a premium on top of premium because getting things to work at the scale they're at now is getting much harder. And you can't expect engineers and all that to put in more effort for the same amount of pay.

NVIDIA is taking advantage of their position, but from an as-neutral-as-possible perspective, they're ahead of the pack. Their ray tracing tech is better, their upscaling tech is better, there are probably a few other features that put them in a better position but don't get a lot of press, and they were able to push something out the door first.

AMD isn't immune to taking advantage of their position either. People looked at them funny when they launched Zen 3 at higher price points than their previous Zen chips because AMD believed they were now a premium chip supplier. And they bumped up the prices again for Zen 4 (how do you justify charging $300 more for the flagship product over the last generation?).

If something is going to kill PC gaming, at least at the high end, it's not going to be a single company or a single decision. It's simply a multitude of decisions and factors that led up to this point.
 
The GTX 970 was $329 MSRP. The 4070 Ti is $799. Even if the 4070 is only $679, it's still double what the GTX 970's MSRP was. The worst of it is that Nvidia is extremely scummy for flatly lying about their performance gains over the previous gen. They have always done this. I don't mean stretching the truth or even a big exaggeration; I mean straight up, flatly, objectively lying.

Yes, MSRPs are only going up and up, and the second-hand market will follow. You're worried about scalpers/miners? Lol. Nvidia IS the scalper right now. It's terrible. We as consumers enable this behaviour by continuously purchasing next-gen because we have to have the latest. The only way to stop it is to vote with our wallets. But everyone will keep lining up at Microcenter like cattle every time the latest gen comes out, so it will never change. Hopefully enough 3090s and 3080s come onto the second-hand market at low enough prices to compel Nvidia to stop being greedy.
 
Given the disaster that is the 4070 Ti, I was thinking the cards lower down the stack are surely going to be just as bad in terms of spec and value. If we remember that the 4070 Ti was meant to be an 80-class card, surely the 60 Ti/60 will have only a 128-bit bus and 8GB of RAM at best. I'm sure Nvidia will ensure the clocks are enough to marginally beat their predecessors overall, but given the MSRPs are likely to be at least 20% higher than the 30-series equivalents, any gains will fall below the 1% performance per 1% price-increase level, making them woefully bad value. Hopefully AMD will actually punish them, but as it stands Nvidia is single-handedly trying to kill PC gaming.

I wouldn't say that the whole lineup is a disaster. According to several very qualified reviewers the 4000 series is actually very good when it comes to performance, features, efficiency, power, design, and (FE) build quality. BUT it is expensive, and Nvidia went kind of nuts with the pricing. Still, you can't just say the cards are bad because of the price. The cards are not bad, the price is. Steve from Gamers Nexus in particular is on some kind of "only price and value matter, so I won't say anything good about these cards anymore no matter how good they actually are" trip. At least other reviewers (Igor, PC Games Hardware, Jay, Hardware Unboxed and others) acknowledge the actual quality, features, and performance, even though they all complained about the insane pricing. And pretty much all of them said "very good product, but we don't recommend buying it at this price."

Also, AMD is not much better. Their new cards, even though they lack a lot of the features and offer basically similar performance to the 4080 and 4070 Ti, are also priced incredibly high. If they wanted to be the savior of PC gaming they should have launched those cards for at least $200 less. But they are as money-hungry as Nvidia and went as high as they could. Good for them, because they probably need all that cash to replace the thousands of 7900 XTX cards with faulty coolers...

Back to Nvidia's high pricing (and AMD's following along): yeah, it's nuts. If the GPU shortage of the last few years and the pandemic taught them one thing, it is that gamers are willing to pay ridiculous prices for GPUs. But the problem was that a big portion of those profits went into the pockets of scalpers. My feeling is that Nvidia and AMD expected very high street prices for the 4000 and 7000 series cards and just didn't want to give up any of the profit margin to scalpers. So they increased the MSRP to scalper price levels and kept the cash. It kind of worked out for them with the 4090, not so much with the 4080. I think prices will eventually come down a bit over the next few months, but never back to pre-pandemic levels. They have probably also noticed that they went too far pushing prices that high, but they probably won't admit to the mistake. I think we will see more "price reductions" on new models down the product stack this year, and they might rethink all this with the next gen two years from now. But I don't think they will change the MSRP on already-released products.

Yes, those high prices include quite a bit of extra profit for Nvidia and AMD, but not all of it is profit. Don't forget that in 2021 TSMC increased their prices by 30% and eliminated volume pricing for large customers altogether. So even if AMD or Nvidia use smaller dies, the chips might not actually be cheaper to make.

One last thought: what good would lower MSRPs have done? The RTX 3080 launched two years ago with an MSRP of $699. I tried to buy one two years ago and kept trying for two months until I gave up. I hadn't seen one even close to MSRP until recently. I don't think normal gamers would have been able to buy one at MSRP if the 4080 had launched with a $799 or $899 price tag; it would have been scalper's paradise again. For gamers buying cards, nothing has actually changed: we still pay high prices. The only thing that changed is who gets most of the profit from those high prices. And being able to buy a card directly from a retailer rather than a scalper, and actually finding one right away without refreshing the browser for weeks on end, is actually a win for me...
 
GN showed pretty clearly that the 3080-class cards could trade blows with the 4070 Ti, and even outperform it in 4K.

You can still get a 3080 for about $800, and a 3080 Ti for $850 new. So a used one is going to be more like what ohio_buckeye paid. From that perspective, I would probably go 4080 before getting a 4070 Ti at that price. Or wait for the 4070 at hopefully $700? With no Founders Edition cards, that doesn't seem likely.

I can't even conceive of what the 4060 and below cards might be. $500? They've already stripped down to the 3060 level with the 4070 Ti. Surely there is no more room to chop parts off, and with the 3060 still available for $350 they've left themselves no room. And that is the chip they will have way over-produced.

I suppose they could make a 4060 8GB, but it might perform a lot worse than the 3060 12GB.
 
There were a lot of comparisons made to the 20 series launch. I think they've stuck their foot in it, which is why they are all canceling fab time. Increased prices with a downturn in people's ability to pay for it, coupled with a good chunk of their market having already upgraded in the previous generation.

They'll either need to increase the time between launches or actually provide a compelling reason to upgrade. Nvidia bet on ray tracing, which hasn't really paid off.

There hasn't been a major new Direct3D launch in a while either (currently about 7.5 years).
 
I wasn't happy spending $600 on an RTX 2070 Super in 2019 either. It has been, and continues to be, a good 1080p/1440p GPU. I was hoping to upgrade, but I'm not sure when that will happen now. I understand that modern GPUs are nothing like those from 10 years ago, but when a GPU rivals the cost of the PC build itself? It just plain old sucks.
 
They'll either need to increase the time between launches or actually provide a compelling reason to upgrade. Nvidia bet on ray tracing, which hasn't really paid off.

This is a great point. I was thinking about this the other day. We are currently at a point where GPUs (high-end GPUs especially) are so powerful that CPUs have a hard time keeping up. So with this generation we have basically reached the point where pretty much all GPUs on the market have enough power to run games at the highest presets at 1080p and 1440p (even with ray tracing in a lot of cases), and mid- to high-end cards handle even 4K well enough. Nobody needs 8K anytime soon. Why not bring out a new gen every 3 to 4 years and deliver worthwhile generational performance or feature upgrades of at least 74%? Most people only upgrade every other generation anyway. Also, we are getting to a point again where future game development in terms of graphics quality is held back by the current console generation... what are they rocking now, 12 TFLOPS or so? Current high-end cards are between 40 and 80 TFLOPS...

Actually, I think ray tracing has paid off: in single-player games where you have time to enjoy the landscape, city, or whatever, it is really impressive when implemented right. And it's going that way because ray tracing takes tons of workload off game developers' tables. Believe it or not, ray tracing is much easier for programmers than "conventional" lighting...
 
I wasn't happy spending $600 on an RTX 2070 Super in 2019 either. It has been, and continues to be, a good 1080p/1440p GPU. I was hoping to upgrade, but I'm not sure when that will happen now. I understand that modern GPUs are nothing like those from 10 years ago, but when a GPU rivals the cost of the PC build itself? It just plain old sucks.

The Supers were well priced compared to the launch prices.

The 2070 was $599 and the 2070 Super was $499, which made the partner cards about $100 cheaper than at launch. (I should add there were 2070s at $500 at launch, but they were all basic blower cards, if I recall.)
 
Funny you say that when AMD, Intel, and ARM all make GPUs with ray-tracing acceleration. And then there's the fact that film and television industries use NVIDIA's tech for CGI (https://blogs.nvidia.com/blog/2022/03/10/oscars-best-vfx-rtx/). And multiple game engines have a ray tracing rendering pipeline.

Yeah, seems like it hasn't paid off yet.

I'm not saying their competitors aren't also doing it and that it doesn't have use cases. But the expectation that people will actually use ray-tracing, or purchase it specifically for gaming hasn't become a thing. Most people don't have ray-tracing capable GPUs to start with.

I've had a 3080Ti for a while, still haven't used ray-tracing. Not sure I have any games that support it at the moment, and even if I did I wouldn't want the performance hit at 1440p.

I vaguely recall a survey not too long back in which 82% of people responded that they hadn't used ray tracing on their GPUs. FPS is more important than visual quality for most gamers, it seems.
 
The Supers were well priced compared to the launch prices.

The 2070 was $599 and the 2070 Super was $499, which made the partner cards about $100 cheaper than at launch. (I should add there were 2070s at $500 at launch, but they were all basic blower cards, if I recall.)

To clarify, it was an ASUS ROG STRIX GeForce® RTX 2070 SUPER, a touch over $600 at Micro Center at the time. No complaints with it to this day; great GPU.
 

LOL... probably the 63% who don't own a ray-tracing-capable RTX or recent AMD card. If my card couldn't handle it I would turn it off, too. Very scientific poll...

I just glanced at the recent Steam hardware survey and the majority of gamers still have cards I wouldn't use ray tracing with either (1650s, Nvidia 9 and 10 series, 2060s, tons of 3050 laptop GPUs, and pre-7000 AMD cards). So yeah, the poll makes sense. Much more interesting would be a poll exclusively of users with a 3070 or better (ray-tracing wise).

Since DLSS 2.0 I have ray tracing always on when available. And with the 4080 I don't even need DLSS anymore, even though I still use it because it reduces AA flickering in most games...
 
"Within the last week" is still 13%. That is probably in line with the percentage of people who have a 3080/4080 or higher. But that 13% represents only a few percent of something like the Steam hardware survey.

18.23% of people with ray tracing capability.
3.28% with an RTX 3080.
~1% with an RTX 3090.

If 13% of those turn on ray tracing weekly, you are looking at about 0.5% of people using it.

Just not a mainstream thing yet, and won't be until probably the NEXT console generation is on the shelves.
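For what it's worth, here is a quick back-of-the-envelope sketch of that arithmetic, assuming the 13% weekly figure from the poll applies to owners of 3080/3090-class cards and using the Steam survey shares quoted above:

```python
# Rough back-of-the-envelope estimate, not real survey methodology.
# Assumption: the "used ray tracing within the last week" share (13%) applies
# to owners of cards where turning RT on is actually worthwhile.

rtx_3080_share = 0.0328   # Steam hardware survey share quoted above
rtx_3090_share = 0.01     # approximate share quoted above
weekly_rt_users = 0.13    # poll respondents who used RT in the last week

worth_it_share = rtx_3080_share + rtx_3090_share   # ~4.3% of all gamers
estimate = worth_it_share * weekly_rt_users        # ~0.56% of all gamers

print(f"Estimated share of all gamers using RT weekly: {estimate:.2%}")
```

Which lands at roughly half a percent, in line with the figure above.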
 
Across 3 generations of RT capable cards, that many don't have one of them?

Yeah, just look at the Steam hardware survey. The majority of gamers don't have a good ray-tracing-capable card: https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

In my opinion it becomes worth turning on starting with the RTX 3080, maybe the 3070, and the RX 6900, depending on the game and resolution. Measured across all gamers, not many people have cards like that. That's why that Linus poll is stupid. Also, Linus Tech Tips is probably one of my least favorite hardware YouTube channels... too clickbaity for my taste...
 
I'm not saying their competitors aren't also doing it and that it doesn't have use cases. But the expectation that people will actually use ray-tracing, or purchase it specifically for gaming hasn't become a thing. Most people don't have ray-tracing capable GPUs to start with.
Most people don't purchase hardware for a specific feature anyway. And I would argue most PC gamers don't even really care about what's in their computer as long as it plays the games they want.

As far as the adoption rate goes, new technology takes a long time to become "mainstream", whatever that means for you. A lot of people hold onto their hardware for 4-5 years, or at least two generations' worth of video cards. I mean, it was like what, 6-7 years before DX10 finally dropped off as a requirement? Which implies that, according to game developers' market research, enough people had DX11-capable cards to consider dropping support for DX10.

I mean heck, imagine it's 2009 and you find out AMD/ATI has pushed hardware-accelerated tessellation for the third generation with nobody supporting it. Are you going to say "ATI bet on tessellation, but it hasn't really paid off"?

I vaguely recall a survey not too long back in which 82% of people responded that they hadn't used ray tracing on their GPUs. FPS is more important than visual quality for most gamers, it seems.

I'm sure people want visual fidelity; they just want it to be a free lunch.

I recall looking into the evolution of SSAO and people's opinion of it over time. It literally went from "don't enable it, the performance hit isn't worth it" to "enable it, it looks ugly otherwise".

EDIT: To provide some evidence of the above

From https://www.nvidia.com/en-us/geforce/news/battlefield-3-tweak-guide/#11
If you need more FPS, Ambient Occlusion is one of the key settings to disable.

From https://www.nvidia.com/en-us/geforce/news/witcher-2-tweak-guide/#8:
Add to that the fact that it is a subtle effect, and the end result is that for many, SSAO should be one of the first things to disable

From https://www.nvidia.com/en-us/geforce/news/deus-ex-human-revolution-tweak-guide/#6
SSAO is the single biggest performance killer, almost halving frame rates when set to High. As a very subtle effect, it should only be enabled on high-end systems where you have plenty of FPS to spare.

From: https://www.nvidia.com/en-us/geforc...e/#tom-clancys-the-division-ambient-occlusion
For consoles and low-power PCs, the efficient in-house Ambient Occlusion technique is an excellent option.

From: https://www.nvidia.com/en-us/geforc...tweaking-guide/#titanfall-2-ambient-occlusion
NVIDIA HBAO+ is the highest-quality Ambient Occlusion technique of its type, and is applied throughout Titanfall 2, enhancing every scene and moment. It does cost 9 frames per second, but given its far-reaching benefits we feel this is a cost well worth paying on any mid-to-high range GPU.

From: https://www.nvidia.com/en-us/geforc...ormance-guide/#watch-dogs-2-ambient-occlusion
With the reduced cost of HBAO+ in [Temporal filtering] mode, and the generally improved level of performance, HBAO+ and SSBC are far cheaper to enable, making them suitable for virtually all systems.

From: https://www.nvidia.com/en-us/geforce/news/call-of-duty-ghosts-graphics-and-performance-guide/#4
If your system simply can't cope with HBAO+, attempt to at least enable Low Ambient Occlusion – any level of AO is far better than none whatsoever.

So again, it's interesting how it went all the way from "this should be the first thing to disable" to "at least have some kind of AO".
 
And since developers/publishers want their games to be accessible to the masses... it's not really taking off.

Actually, developers really want ray tracing to take off as quickly as possible, because it will make their work so much easier. For a developer it is much less time-intensive to implement ray tracing than traditional lighting. They save a lot of time on faking light and shadow effects, don't have to spend rendering time pre-baking shadows just to see how they look, and don't have to re-render many times to make adjustments. With ray tracing you see your changes during development in real time, and you don't have to spend much time figuring out how to fake and optimize the lighting...

On the contrary: it is taking off. Pretty much every AAA game coming out recently or in the near future has ray tracing or is getting it. Even older games are now getting next-gen ray tracing support. Ray tracing is here to stay and it is getting more and more important. And the games do look better, sometimes more subtly, other times very obviously. In Cyberpunk 2077 especially it really looks great, where each light reflects off nearby walls and illuminates them slightly in the color of the light. If you compare a scene with and without ray tracing, say one at night with a lot of neon advertising lights, that gets very obvious. The scene looks nice without ray tracing, but once you have experienced the ray-traced version, you don't want to go back... Also reflections on glass and how light breaks through glass are just insane. Of course you need to play a game where you actually have time to appreciate all this. If you play a very fast competitive shooter you will not have time to take anything in; in that case only FPS counts...
 
Most people don't purchase hardware for a specific feature anyway. And I would argue most PC gamers don't even really care about what's in their computer as long as it plays the games they want.

As far as the adoption rate goes, new technology takes a long time to become "mainstream", whatever that means for you. A lot of people hold onto their hardware for 4-5 years, or at least two generations' worth of video cards. I mean, it was like what, 6-7 years before DX10 finally dropped off as a requirement? Which implies that, according to game developers' market research, enough people had DX11-capable cards to consider dropping support for DX10.

I mean heck, imagine it's 2009 and you find out AMD/ATI has pushed hardware-accelerated tessellation for the third generation with nobody supporting it. Are you going to say "ATI bet on tessellation, but it hasn't really paid off"?


I'm sure people want visual fidelity; they just want it to be a free lunch.

I recall looking into the evolution of SSAO and people's opinion of it over time. It literally went from "don't enable it, the performance hit isn't worth it" to "enable it, it looks ugly otherwise".

I think some of the statistics recently posted will help with the mainstream question. More games are produced without it than with it, and no one has launched a game where ray tracing is a minimum requirement. Ray tracing is already 4 years old. Current-gen consoles are 2 years old; in another 4 or 5 years, I would expect the GPUs in consoles to support ray tracing properly. And then, yes, mainstream.

I myself held on to my last GPU for over 5 years, and that is the main reason (outside of cost) that most people don't have a ray tracing GPU. Not sure anyone needed that pointed out.

Of course people want games to look good. For PC gamers, that is usually secondary to FPS.