News Vendor Confirms RTX 4070 Ti is a Resurrected RTX 4080 12GB

It's still built like an RTX 3060 replacement. It's a cut-down 12GB/192-bit card.
Even the measly 3060 Ti had a 256-bit memory interface.
The naming doesn't matter. At the end of the day, any MSRP above $329 will show Nvidia's wholesale rejection of gamers as a mainstream customer base.

Gamers' paychecks didn't triple over the last 2 years. Cost of living didn't decrease. Comcast didn't send me an email today about how they are going to lower their prices and improve their internet service. So why should we triple our entertainment budget?

Nvidia doesn't make gaming products anymore. I'll live with it. If they want to restructure their business around selling 200 luxury cards per year to celebrities and billionaires, then whatever. It's not my problem anymore. I'll care about Nvidia in the same way I care about whatever Rolls Royce watch Tom Cruise was paid to wear to the Oscars, or whatever.
They think they're too good for us. They don't want our dirty peasant money, so I'll happily spend it elsewhere.
 
I'd hardly call a 192-bit memory interface "generous" on a -70 class part, let alone a -70 Ti part. Realistically, this should be a 4070 at best with the performance it offers. Tom's needs to quit fondling Nvidia's marketing drivel, and consumers need to quit funding Jensen's leather jacket fetish with these outlandish prices.
 
Some were suggesting it was a 4060 because of the 192-bit memory bus; the last three generations of xx60 cards used 192-bit buses.
Just like how the 4080 is a 4070/Ti at best, because... you guessed it: over five generations of the 70 tier used a 256-bit bus.
But memory bus isn't everything, now is it...
It is not just the bus that is cut down.
The specs alone show the RTX 4080 12GB only has 7680 CUDA cores and a 192-bit memory interface, down from the 9728 CUDA cores and 256-bit interface of the 16GB model.
 
Now, everyone complaining that it's 192-bit, I get you, but it's still on par with a 3090 Ti, people 😛 Yeah, we were expecting the 4070 base to cost $500ish and deliver this, maybe $550 because of reasons, but not $900. And to be honest, I smell a scam with Portal RTX running at 20 fps on a 4090. This whole RTX 4000 series smells like a scam! Go AMD!
 


I agree: 30% slower would be 70-class, not 70 Ti.
 
Honestly, NVIDIA should have launched the 4090 as the "TITAN RTX", priced it at $2000, and launched the 4080/4070 by Christmas at the average gen bump, maybe $50 to $100 more than usual. Then no one would be complaining. They could even blame scalpers when the cards went out of stock and turned up on eBay for $1000-plus. Instead they gave us this nonsense pricing and performance madness.
 
Let's be real. The 4080 should have been the 4070 Ti, the 4070 Ti listed in the article should have been the 4060 Ti (not the 60, like so many have said in here, as it falls just behind the old 3090 Ti), and Nvidia should have slotted in a 4080 SKU with another 15% of performance over what we got (a further cut-down AD102 die below the 4090). And the 4070 should be a cut-down AD103 die (the current 4080 die). The last time we saw this level of die manipulation was with the GTX 600 series.
 
One doesn't need to be a genius to figure out that the supposed RTX 4080 12GB is going to be renamed the RTX 4070. It's not like Nvidia will go back to the drawing board to create a new chip, so it's just playing around with the model name. As long as the price is unchanged, I feel it's going to get the same reception as the unpopular RTX 4080 16GB.
 

I think I'm going to keep my 6800 XT for a very long time. Speak of the gerbil: my ISP did email me the other day, telling me that their rate is going up and I have no choice but to accept...

The thing is, though, I feel bad for those who just jumped into the world of PC gaming. Sooner or later the stock of 3xxx-series or even 2xxx-series cards will run out, and they'll have to swallow the new pricing. I know this too well: one of my relatives is planning to build a PC for gaming, and after checking the current prices he concluded that PC gaming is some high-end hobby meant only for those flush with cash, not for us "regular peasants". It's unfortunate.
 

No, memory bus isn't everything, so let's look at the whole architecture.
The full AD102 is 76.3B transistors giving 18432 shaders on a 384-bit bus for 96.76 TFLOPS.
The 4070 Ti* is 35.8B transistors giving 7680 shaders on a 192-bit bus for 40.09 TFLOPS.

So let's look at the relative amount of the architecture compared to what the previous three generations of 70s gave.

4070 Ti* - 47% transistors giving 42% the shaders on 50% the bus for 41.5% the TFLOPS.
3070 - 61.4% transistors giving 54.8% the shaders on 66.6% the bus for 51% the TFLOPS.
2070 - 58% transistors giving 50% the shaders on 66.6% the bus for 49% the TFLOPS.
1070 - 61% transistors giving 50% the shaders on 66.6% the bus for 53% the TFLOPS.

The last three generations of 70 non-Ti cards were pretty consistent: they gave you half the architecture for half the TFLOPS of performance, while the 4070 Ti* gives significantly less, so it's not even a 70 non-Ti, let alone a Ti. There wasn't a 1060 Ti or a 2060 Ti, but there was a 3060 Ti, and what did that give you... 45% the cores for 40.5% the performance... hey, that's like the 4070 Ti*!

Now I wonder if they have a GPU which does give you what the previous 70 non-Tis gave you... oh, they do, let's have a look, it's:
60% transistors giving 52.5% the shaders on 66.6% the bus for 50.5% the TFLOPS.
Hey, that GPU fits in EXACTLY with the previous three 70 non-Tis: same transistor share, same shader share, same bus, same performance. I wonder what card they'll end up putting that in... oh wait, it's already out, cos it's the 4080 16GB! The 4080 16GB is LITERALLY what the last three 70 non-Ti cards were, except... you know... fkn $1199 instead of $500!
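
If you want to sanity-check those ratios yourself, here's a quick Python sketch using the numbers quoted above. The 4080 16GB's transistor count and TFLOPS aren't stated in this thread, so the 45.9B / 48.74 TFLOPS figures are my assumed spec-sheet values, not something from the article:

```python
# Check the "fraction of the full flagship die" percentages above.
# Spec tuples: (transistors in billions, shaders, bus width in bits, peak FP32 TFLOPS).
# Note: peak FP32 TFLOPS = 2 ops/clock * shaders * boost clock (GHz) / 1000.

FULL_AD102 = (76.3, 18432, 384, 96.76)  # full Ada flagship die, as quoted above

CARDS = {
    "4070 Ti (AD104)":   (35.8, 7680, 192, 40.09),
    # 45.9B transistors and 48.74 TFLOPS are assumed spec-sheet figures,
    # not numbers from this thread; the cores and bus are from the article.
    "4080 16GB (AD103)": (45.9, 9728, 256, 48.74),
}

labels = ("transistors", "shaders", "bus", "TFLOPS")

for name, specs in CARDS.items():
    pct = [100 * c / f for c, f in zip(specs, FULL_AD102)]
    summary = ", ".join(f"{p:.1f}% {l}" for p, l in zip(pct, labels))
    print(f"{name}: {summary}")

# 4070 Ti (AD104): 46.9% transistors, 41.7% shaders, 50.0% bus, 41.4% TFLOPS
# 4080 16GB (AD103): 60.2% transistors, 52.8% shaders, 66.7% bus, 50.4% TFLOPS
```

The 4080 16GB lands right in the ~50-60% band the last three 70 non-Tis occupied, while the 4070 Ti sits well below it, which is the whole point above.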

People are up in arms about the 4080 being pushed to $1199, when it's really the 4070 being pushed to that price and called an 80 to hide just how much they're trying to push up the price tiers.

Seeing how cut down the 4080* and 4070 Ti* are (since they should be the 4070 and 4060 Ti), and they still have the 4070* and 4060 Ti* to come, what the hell are they gonna put in the 4060*? At this rate it's only gonna match the 3060's performance but cost 40% or something more :|
 
I like your analysis. I think what happened here is that the 4090 came out of the oven too powerful for what a flagship used to be. It really should have been marketed as the Titan; they didn't, so as to create the illusion that the 90 series is the $2000 tier and the next in line should be somewhere near $1500.

Every gen bump has always been like a tier level-up, because each card becomes ~50% better: 780×1.5 = 980, then 980×1.5 = 1080, then 1080×1.5 = 2080, then 2080×1.5 = 3080, and finally 3080×1.5 = 4080. By looking at the Witcher 4K benchmark at guru3d you can see this trend clearly; it has always been like this, so the 4080 this time really is the 4080. This means every tier always rose a level to beat the one previously above it: the 80-series performance gets beaten by the new 70 series, and so on. Example: 990-series fps, now 1080 fps+, now 2070 fps+, now 3060 fps+, now 4050 fps+. So in three generations (six years) someone got to play on a "cheap" 3050 that is slightly better than a 980 that used to cost way more (3050 > 2060 > 1070 > 980). Again, if you look at the Witcher 4K benchmark you'll see that's the case. So we could expect the 4080 to beat the 3090, the 4070 to be around its performance, and so on. That's what's happening, as usual, EXCEPT Ngreedia is now charging almost double the usual for each tier, because AMD somehow got locked out of the RT scheme.

People fell for the excuse that Portal RTX, for example, is "fully path traced, that's why it's so demanding", but no, it's actually some kind of programming made PURPOSELY to be really difficult for older gens to run, even Nvidia's own cards. The 6950 XT can't even get 5 fps in Portal RTX at 1080p; you want me to believe that's because the card can't calculate lights AT ALL? Or you're telling me a 3090 Ti, which launched at $2000 just last year, can't run it past 10 fps at 4K because it really sucks at RT? It's a game inside a room, fgs. The title isn't demanding; it's made like this on purpose to lock AMD cards out and market the 4000 series as "too superior, hence the price". It's all artificial; it's a scam. The 4080 is just 50% better, like it has always been, and it never cost more beyond inflation rates. I really think there's something like a patent or cryptography holding back AMD hardware performance when it comes to RT. It's kinda like what they did with The Witcher 3 when it launched and called it "HairWorks", which was really hard for an AMD GPU to run. Now they came up with "Ray Tracing". If it really was just a matter of "tensor cores", all AMD had to do was add those to the 7000 series... but will they? Can they? If the 7900 XT can't run Portal RTX at all, we'll know the answer right there.

Btw, the 7900 XT should beat the 4080 just like the 6900 XT beat the 3080 all the way across the board. It won't, but it will closely match it, and it will be launched at $900 instead of the original $1000, which is now devoted to the XTX, which is really almost a 4090 at 6950XT×1.5 level. This is really interesting... will the 7700 XTX then be launched at the 6700 XT MSRP of $500 but with 6750×1.5 performance? That would make it more of a 6950 than a 6900, and finally, at around $400, the 7600 XTX would have 6800 performance. AMD claims up to 75% performance gains in RT for the XTX; that would match Nvidia this gen as well, which is pretty good if you ask me.
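
To make that ~1.5×-per-generation rule concrete, here's a toy Python sketch. The 1.5 factor and the 80-class ladder are this post's rule of thumb (eyeballed from the guru3d Witcher 4K chart), not measured data:

```python
# Toy model of the claim above: each new 80-class card is ~1.5x the old one,
# so a fixed performance level slides down one tier per generation
# (roughly what a 980 did, a 1070, then a 2060, then a 3050 matches).
# The 1.5x factor is this post's rule of thumb, not measured data.

GEN_SCALE = 1.5
EIGHTY_CLASS = ["780", "980", "1080", "2080", "3080", "4080"]

perf = 1.0
for card in EIGHTY_CLASS:
    print(f"{card}: {perf:.2f}x the 780")
    perf *= GEN_SCALE

# 780: 1.00x ... 3080: 5.06x, 4080: 7.59x -- so a 4080 that is ~1.5x a 3080
# is exactly the historical cadence; only the price is out of line.
```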
 
Vote with your wallet, folks. Just don't buy NGREEDIA; then they will learn. Simple as that. FWIW, the average card on Steam is in the 1660-3060 range. These are perfectly affordable and are what game developers will test against and optimise their software for. The law of diminishing returns is alive and well. No one except whales and youtubers cares about the ridiculously expensive crap. Calm down. You don't need it.
 
Goes to show that the only GPU worth buying from Nvidia is the 4090, because of its brute power; nothing else comes close. Everything below it so far seems terribly priced. Although AMD could bring something to the table, I'll wait for real results, not marketing/rumors.