News Nvidia RTX 40-series Super models revealed — 4070 Super coming Jan 17 at $599

Status
Not open for further replies.
Assuming 4070 Ti Super is 15% more powerful than 4070 Ti, and 4080 Super is 5% more powerful than 4080, 4080 Super would be about 12% ahead of 4070 Ti for a 25% higher price. Not the worst scaling I've seen, but not drool-inducing either.
 
Assuming 4070 Ti Super is 15% more powerful than 4070 Ti, and 4080 Super is 5% more powerful than 4080, 4080 Super would be about 12% ahead of 4070 Ti for a 25% higher price. Not the worst scaling I've seen, but not drool-inducing either.
Yes indeed, even at $999 the 4080 Super is still overpriced relative to performance scaling gen-on-gen. I would maintain that down the SUPER stack as well.

Nvidia is too greedy, and AMD does not/cannot compete well enough to create meaningful price pressure on Nvidia, so here we are. I will give Nvidia a small compliment on the $200 price reduction.

Looking forward to Intel's developments with Battlemage and further, but it may be years before we see a 3080-type moment again. Given supply chain issues and inflation/fab complexity, I would find $799 appropriate for a x80 class processor.
 
I'm in a tough spot with this lol. I'm shopping for a new card, and $1000 is about the ceiling of my budget. Before I learned of Supers coming, I flip-flopped between the $800 4070 Ti and the $1000 7900 XTX, and ultimately decided to go with the 4070 Ti due to lower latency while using framegen.

The 4070 Ti Super for the same price and better performance seems like a godsend. Now I just can't decide if I wanna spend more and get 4080 Super instead to go even further beyond, even if it would be less efficient in terms of money/performance.

Guess I'll have to see some benchmarks while vendors catch up - don't wanna go for the Founders Edition where applicable.
 
Jarred - an error in your chart, I think? It shows 64MB L2 cache for the 4070 Ti Super, but Nvidia's slide says 48MB.

I'm really looking forward to that card. I had my eye on the regular 4070 or Ti, but have been hesitating, because I want some room to upgrade to a 4K screen this/next year, and am worried those cards could age really quickly at that resolution. Glad I decided to hold off for a couple of months to see what Super brings, because the 4070 Ti Super looks like it would be perfect for me. : )
 
Before I learned of Supers coming, I flip-flopped between the $800 4070 Ti and the $1000 7900 XTX, and ultimately decided to go with the 4070 Ti due to lower latency while using framegen.
I saw you comment on another thread that you didn't think AMD had a latency-reducing tech when using frame gen. As far as I'm aware Anti Lag works with FSR3 frame gen. If I were you I'd do more investigating before buying an $800+ GPU.
 
These new Super cards look like a step in the right direction -- a small step. I'm glad Nvidia didn't create yet another elevated price tier. The 4070 Ti Super is the most interesting option to me...although it is set at nearly the same price I paid for a 3090 in late 2022.
 
I saw you comment on another thread that you didn't think AMD had a latency-reducing tech when using frame gen. As far as I'm aware Anti Lag works with FSR3 frame gen. If I were you I'd do more investigating before buying an $800+ GPU.
I did look into it. As I understand it, Anti Lag is an entirely software solution, and is inferior to Reflex, though it does have the advantage of working in all reasonably modern DirectX games.

Also I just don't like FSR tech in general; in most respects it seems like a helter-skelter attempt by AMD to catch up to trends set by nVidia, rather than an effort to come up with something of their own.

I'm not fanboying over nVidia, I have a lot of gripes with them actually, but I want the best and I'm fine with paying more to get it. Yes, I'm the average victim of capitalism.
None of you are interested in buying AMD, no matter what they offer.
Well, so far they seem to offer a lot of VRAM and slightly better raw rasterization performance at resolutions below 4k. Which is fine, just not something I'm personally interested in as a gamer, because I expect performance-taxing games to have at least some upscaling tech available, and maybe even framegen, and nVidia just wins that hands down.

AMD might have shot itself in the foot by allowing nVidia cards to take advantage of AMD upscaling / framegen features, particularly when using a mod/patch that allows older RTX cards to combine DLSS upscaling and Reflex latency reduction with AMD framegen.

It was a noble move for sure, but for some nVidia users it means they have fewer reasons for upgrading from their older RTX cards [to newer AMD cards].

The sad reality is that AMD is behind nVidia in the GPU market, not just in raw performance, but in technological presence as well.

Have to give credit where it's due, AMD's 7800X3D looks great for gaming. Despite being an Intel user for over a decade, I'm considering switching to AMD CPU.
 
I am very happy with my 7900XTX and do not regret getting it over the 4080. The XFX version I have is a 4 slot card with a very beefy and capable cooler, and a Z bracket to ensure proper support of the card.

Honestly it does everything I need it to do, and more. You should really look at AMD before deciding to go with Nvidia. The Adrenalin software package is way ahead of the Nvidia offering.
 
We know, people want price pressure on Ngreedia, so they can buy Ngreedia.

None of you are interested in buying AMD, no matter what they offer.

I honestly wish that AMD would just drop out of PC gaming GPUs, just to see how much all of you are willing to pay for Ngreedia's GPUs.

Then again, we will all be forced to do that anyway, since all of their proprietary tech keeps taking away our options, and instead of fighting them, we are demanding it (Starfield DLSS, anyone?).


Hahahah! "Ngreedia!" That's soooooo clever and funny! Are you a professional comedian? You certainly should be with a wit like that.

Man. heheheh. Ngreedia. How do you come up with stuff like that? I haven't heard material of that level since like, the third grade playground.

Kudos. Kudos.
 
I'm having a real hard time understanding the point of these.

It seems like an excuse to make a minor modification in the manufacturing process to keep particular plants going. It's like they are releasing these just because they can.
 
We know, people want price pressure on Ngreedia, so they can buy Ngreedia.

None of you are interested in buying AMD, no matter what they offer.
If there weren't concessions made with some AMD products I'd purchase them over Nvidia, and I have in the past. It's strange you're giving people grief for picking favorites considering you obviously have a bias as well.

I'm having a real hard time understanding the point of these.

It seems like an excuse to make a minor modification in the manufacturing process to keep particular plants going. It's like they are releasing these just because they can.
I think it's more an attempt to address some of the criticism the 40 series has received. I think it's a half-measure, personally, but at least the 70-class parts might be marginally more compelling in the mid-range. I'm not holding my breath.
 
I did look into it. As I understand it, Anti Lag is an entirely software solution, and is inferior to Reflex, albeit it does have the advantage of working in all reasonably modern DirectX games.
Err, isn't Reflex also an "entirely software solution"? I think there was a Latency Analyzer feature released alongside Reflex that required specific monitors/mice, but as far as I can tell that does nothing to improve latency, it just makes measuring it easier.
 
Err, isn't Reflex also an "entirely software solution"? I think there was a Latency Analyzer feature released alongside Reflex that required specific monitors/mice, but as far as I can tell that does nothing to improve latency, it just makes measuring it easier.
As I understand it, the hardware part is the Optical Flow Accelerator, which is used for DLSS stuff too.
I'm having a real hard time understanding the point of these.

It seems like an excuse to make a minor modification in the manufacturing process to keep particular plants going. It's like they are releasing these just because they can.
Well, there are two ways to look at it, pessimistic and optimistic.

Pessimistic is that nVidia first sells clearly overpriced cards to those who would buy them at that price point, and once the income starts drying up, they re-release mostly the same cards at lower price so that people who considered it too expensive before would buy them now.

Optimistic is that nVidia has optimized the manufacturing process and is now ready to sell better GPUs at same price / same GPUs at lower price.

The truth is probably somewhere in the middle. Plus it never hurts to curbstomp the competition.

Also, the 4070 Ti Super in particular offers the meaningful improvement of more VRAM.
 
I'm having a real hard time understanding the point of these.

It seems like an excuse to make a minor modification in the manufacturing process to keep particular plants going. It's like they are releasing these just because they can.
It looks like an optics move to me. If the Super parts are indeed nearly as powerful as the next tier up, then they are effectively reducing prices without actually cutting the prices of their old parts. They could simply have lowered the prices of their existing parts, but now they get to make a big deal of it and market "new" things.

*EDIT* If I recall correctly, they did this with the overpriced original 2000 series as well.
 
The RTX 4070 Super will launch on January 17, with an MSRP of $599, taking over the price point of the RTX 4070. To make room for the newcomer, the RTX 4070 MSRP will drop to $549, which doesn’t seem like quite enough, and if you check current retail prices, you’ll see RTX 4070 cards already starting at $549 and sometimes less. We expect performance to land relatively close to the RTX 4070 Ti, which means you get a substantial improvement in bang for the buck, as the 4070 Ti MSRP currently sits at $799.

Yet it's extremely easy to say "No" to it at that price because of the 12GB VRAM. I would also put a giant asterisk after that section in bold because of the VRAM, and even if it does get close to the 4070 Ti, it's still a 2560x1440 75 fps card, which isn't exactly the best for a mid-range GPU in 2024.

 
I don't mind these, but it really reinforces that anyone who doesn't want to spend over $400 has gotten the short end of the stick this generation, period. Here's hoping Battlemage comes out this year and can deliver power/performance competition with better pricing.
I'm having a real hard time understanding the point of these.

It seems like an excuse to make a minor modification in the manufacturing process to keep particular plants going. It's like they are releasing these just because they can.
It's a price drop without dropping the price. I can't remember the last time Nvidia officially dropped the price on one of their products, but it happened early in GeForce's life and Jensen probably has a long memory.
 
Now just to wait for real benchmarks to decide between the 4070 Ti Super and the 4080 Super. FSR is just too far behind DLSS for me, not to mention raytracing.
 
Here I was thinking that we might have a simplification of the current crop of cards (lower prices). Instead we get more variations, and this goes across the board with Intel and AMD too... Uggh 😕
 
Rolling in here with my 1070 and memories of what cards used to cost...
Is it me or is everyone just numb to the insane pricing?
Can someone explain to me what happened like I'm 5?
Inflation, pandemics, AI, AAA games, 4K gaming on the same sized screens, and rAyTrAcInG don't add up to Nvidia's price points.
 
Rolling in here with my 1070 and memories of what cards used to cost...
Is it me or is everyone just numb to the insane pricing?
Can someone explain to me what happened like I'm 5?
Inflation, pandemics, AI, AAA games, 4K gaming on the same sized screens, and rAyTrAcInG don't add up to Nvidia's price points.
It depends on how you look at it.

For some time, the hardware manufacturers rolled out superior hardware at the same price point from time to time. I.e. the pricing policy was "midrange GPU for 1080p gaming costs $250".

Then around the RTX 20xx series, the pricing policy changed. Now you paid extra to get new features and additional performance. I.e. "X teraflops costs $250. Pay more to get more teraflops and newer tech".

And since newer generation GPUs can dish out more teraflops, "it's only natural" that you have to pay more to get them.

But that's all academic. GPUs are getting more and more complicated, and new tech is expensive to develop and manufacture. All of the other things you mentioned likely account for at least some of the price increase.

But most likely, a big part of raising the price is just to see if they can get away with it to make more money, and evidently they can.

It's also worth mentioning video games themselves: poor game and game engine optimization due to big suits pushing for deadlines. And they are obviously in cahoots with GPU manufacturers. It's quite a symbiotic relationship.

People buy GPUs to get good framerates in new games while playing with visual enhancement features that tank the framerate. Raytracing is just the newest kid on the large block. Previously it was Hairworks, and before that it was PhysX. And probably more stuff I don't remember.

And then people buy new games to justify spending tons of money on new GPUs.

Like Control, the first "AAA" game to feature raytracing. I tried playing it, but the gameplay seemed pretty shallow and overall the game is extremely forgettable, more of a tech demo than an actual game, a primitive console-like title. But for a good bit you basically had to buy it to experience the raytracing tech.
 