News Nvidia's RTX 4070 Ti Gets Official: RTX 4080 12GB Resurrected for $799

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
$799 is crazy expensive to me. A 285-watt TDP is also very high considering electricity prices here.

I will have to pass on this GPU generation once more due to cost.

I also feel less need to upgrade my GPU anymore. With so many games that run on a potato, and so many indie games, I have less and less need for an expensive GPU.

Nvidia has been hard at work trying to convince people to get excited about the RTX raytracing stuff, but a reflective puddle of water does nothing for me.

When I look at games on Steam, few titles, even AAA ones, require a high-end GPU; most games run fine on a 1060 at 1080p.
 
Last edited:

ManDaddio

Reputable
Oct 23, 2019
99
59
4,610
Someone needs to tell Nvidia the monopoly money... er, crypto boom is over and gamers are tired of being raked over the coals. Vote with your wallets, people.
The RTX 4080 is at least 25% faster than the RTX 3090, which launched at $1,499. The RTX 4080 is $300 cheaper and it's faster. The name of the card doesn't mean anything. The RTX 4080, despite the price, is a better bargain.
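
As a quick back-of-envelope check (a sketch only: it uses the cards' launch MSRPs of $1,499 and $1,199, and treats the ~25% uplift above as a given rather than a measured benchmark):

Code:
# Rough performance-per-dollar comparison using launch MSRPs and the ~25% uplift above.
cards = {
    "RTX 3090": {"price": 1499, "relative_perf": 1.00},
    "RTX 4080": {"price": 1199, "relative_perf": 1.25},  # the ~25% figure is the post's claim
}

for name, card in cards.items():
    perf_per_dollar = card["relative_perf"] / card["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf units per $1,000")

# RTX 3090: 0.67 perf units per $1,000
# RTX 4080: 1.04 perf units per $1,000  (roughly 55-60% more performance per dollar)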

And if you go in the other direction with the 4070 Ti, taking inflation into account, that also is not a terrible deal if it performs about the same as an RTX 3090/3090 Ti.

There's no written rule in the universe that says you have to be 30% faster every generation.
 
  • Like
Reactions: coolitic and Why_Me

BX4096

Reputable
Aug 9, 2020
167
312
4,960
Someone needs to tell Nvidia the monopoly money... er, crypto boom is over and gamers are tired of being raked over the coals. Vote with your wallets, people.
I'm pretty sure that the kind of people who read Tom's Hardware are not the target demographic for this overpriced toy. Try TikTok or YouTube.

Personally, I've waited for so long that I can easily wait another year or so for the pricing (not to mention the supply) to come down to earth. It's not even that I can't afford it. It's just that I'd rather not waste my money on something that's not worth the price. You'd be surprised how much useful stuff you can buy once you stop wasting an unnecessary hundred or two on every major purchase.
 

Giroro

Splendid
$799 would be way too much money, even if this were actually built like an x070 Ti part. But it doesn't even rise anywhere close to that level.
12GB of memory on a 192-bit bus using a <300 mm² die doesn't describe the RTX 3070 Ti, or even the 3060 Ti. That describes the RTX 3060.

Even if this GPU were an RTX 4070, a ~$500 pricing tier would still be beyond optimistic, IMHO.
 

JamesJones44

Reputable
Jan 22, 2021
665
597
5,760
A 285-watt TDP is also very high considering electricity prices here.

There are a few issues with this statement.

  1. TDP is a worst-case scenario; how often are you running your GPU at 100%? Pretty rarely, even when gaming. So unless you are training AI models, you will not be drawing 285 watts continuously (a rough cost sketch follows this list).
  2. The 980 Ti and 780 Ti were 250-watt TDP cards and were considered reasonable for their performance at the time of release.
  3. Nothing that pushes 4K at decent frame rates is under 200 watts TDP. If you're not interested in 4K gaming, then there is no reason to buy a 4070 Ti, or any other high-end GPU released in the last two years, and worry about its power draw.
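
To put rough numbers on what that draw actually costs (a sketch only; the average wattage, gaming hours, and electricity price below are all assumptions you should swap for your own):

Code:
# Rough electricity-cost estimate for gaming on a 285 W TDP card.
# All three inputs are assumptions; adjust them for your own situation.
avg_gpu_watts = 200       # assumed average draw while gaming, well under the 285 W TDP
hours_per_week = 20       # assumed gaming time
price_per_kwh = 0.34      # assumed price in GBP per kWh (roughly the late-2022 UK cap)

kwh_per_week = avg_gpu_watts / 1000 * hours_per_week
weekly_cost = kwh_per_week * price_per_kwh
print(f"{kwh_per_week:.1f} kWh/week -> {weekly_cost:.2f} GBP/week, about {weekly_cost * 52:.0f} GBP/year")
# 4.0 kWh/week -> 1.36 GBP/week, about 71 GBP/year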
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
The 980 Ti and 780 Ti were 250-watt TDP cards and were considered reasonable for their performance at the time of release.

How is this relevant today? What does it matter what a 980 Ti used back then? Who cares.

Today, in the UK, some people spend half their income on just electricity and heating.

But even in better-off families, people pay close attention to their electric bill. This includes computer usage.

If Nvidia, AMD, and Intel want to keep selling CPUs and GPUs in the half of the world where electricity prices have skyrocketed, they had better do something about their out-of-control power usage.

This is what happened with electricity prices:

[Chart: electricity price increases]
 
Last edited:
  • Like
Reactions: PEnns

Giroro

Splendid
The RTX 4080 is at least 25% faster than the RTX 3090, which launched at $1,499. The RTX 4080 is $300 cheaper and it's faster. The name of the card doesn't mean anything. The RTX 4080, despite the price, is a better bargain.

And if you go in the other direction with the 4070 Ti, taking inflation into account, that also is not a terrible deal if it performs about the same as an RTX 3090/3090 Ti.

There's no written rule in the universe that says you have to be 30% faster every generation.

The 4080 is a better bargain compared to what, exactly? And by what standard?
No part of either the 4080 or the 3090 has ever been a bargain, by any standard.
Maybe the RTX 4080 would have looked like a better deal if it had cost $1,200 two years ago and competed with the RTX 3090 during peak crypto, but it's not two years ago anymore.

A 20 MHz Intel i386 cost $599 in 1985 ($1,660 with inflation). Does that mean buying a modern Core i3 would be a great bargain at $100,000 because it performs 1,000x better at only 60x the price? That's not how the generational improvement of technology works.
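
To spell out the arithmetic behind that hypothetical (all of the figures here are the ones quoted above, not real benchmark data):

Code:
# Sanity check of the i386 comparison, using only the figures from the post.
inflation_adjusted = 1660              # the quoted "$1,660 with inflation" for the $599 i386
perf_multiple = 1000                   # the hypothetical "1,000x better"
price_multiple = 60                    # at "60x the price"

hypothetical_price = inflation_adjusted * price_multiple
perf_per_dollar_gain = perf_multiple / price_multiple
print(f"Hypothetical price: ${hypothetical_price:,}")            # $99,600, i.e. the "$100,000"
print(f"Perf per dollar improves ~{perf_per_dollar_gain:.0f}x")  # ~17x, yet nobody would call that a bargain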

Not that any of this crap matters, because there aren't any games that require anything higher than an RTX 2060. Humans are physically incapable of perceiving the difference between 8K 240 Hz Ultra and 4K 120 Hz High, so what even is the point? We're entering the $20,000 solid-gold audiophile-grade power-cable territory of pointless GPU snobbery. There are no good new games out; ray tracing is dead tech. Who cares anymore, man?

Games take three years to develop, and people went back to work six months ago. I'll check back in another 2.5 years.
 
  • Like
Reactions: PEnns

DSzymborski

Curmudgeon Pursuivant
Moderator
Performance requires power; there are plenty of low-power parts. But low power and state-of-the-art performance are likely going to be mutually exclusive. There have been great strides in power consumption: for example, a GTX 1650 is just as powerful as an HD 7970 yet uses a third of the power, while an RX 6600 destroys it on less than half.
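
Roughly, in perf-per-watt terms (a sketch only: the TDPs are the official figures, while the relative-performance numbers, including the ~2x figure for the RX 6600, are rough assumptions):

Code:
# Rough perf-per-watt comparison behind the examples above.
# TDPs are official figures; the relative performance numbers are rough assumptions.
cards = {
    "HD 7970":  {"tdp_w": 250, "relative_perf": 1.0},
    "GTX 1650": {"tdp_w": 75,  "relative_perf": 1.0},   # roughly on par with the 7970, as stated above
    "RX 6600":  {"tdp_w": 132, "relative_perf": 2.0},   # assumed ~2x a GTX 1650
}

baseline = cards["HD 7970"]["relative_perf"] / cards["HD 7970"]["tdp_w"]
for name, card in cards.items():
    perf_per_watt = card["relative_perf"] / card["tdp_w"]
    print(f"{name}: {perf_per_watt / baseline:.1f}x the perf/watt of the HD 7970")
# GTX 1650: ~3.3x, RX 6600: ~3.8x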
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
there are plenty of low-power parts.

Plenty? The last 75-watt Nvidia GPU that didn't use a dedicated power connector was the 1650, from 2019.

The current Nvidia offering with the lowest TDP is the 3050, still using 130 watts, which is almost double what a 1650 used.

Where is the 4050? Where are the GPUs that an average person can afford? The cheapest GPU is now $800? PC gaming is now only for the rich? What fantasy world do these GPU makers live in?

The 1050 Ti and 1060 are still among the most used GPUs on Steam. GPUs from 2016.
 
Last edited:
  • Like
Reactions: PEnns

DSzymborski

Curmudgeon Pursuivant
Moderator
Plenty? The last 75-watt Nvidia GPU that didn't use a dedicated power connector was the 1650, from 2019.

The current Nvidia offering with the lowest TDP is the 3050, still using 130 watts, which is almost double what a 1050 used.

Where is the 4050? Where are the GPUs that an average person can afford? The cheapest GPU is now $800? Jesus Christ. What fantasy world do these GPU makers live in?

The 1050 Ti and 1060 are still the most used GPUs on Steam.

Yeah, because you can still use them effectively. The reason that every generation doesn't have a lot of 75 W GPUs is that there's little demand for them. If there were really an upswell of people who needed cards like that, they would exist, so there's no need for the plea to faux-populism. A GPU is an entertainment product for the vast majority of people buying one.

There are a vast number of 1030s and 1050 Tis and 1650/Supers and RX 6500s out there. They already serve the people who really need a low-power GPU, as do the rapidly improving integrated graphics solutions. There's no such thing as starving children dying in an alley because they can only play Cyberpunk at 1080p instead of 1440p.
 
  • Like
Reactions: Why_Me

DSzymborski

Curmudgeon Pursuivant
Moderator
Little demand? 75-watt GPUs are consistently the most sold GPUs.


Yeah, it's well served by existing GPUs. People almost always buy these because they're cheap, not because of the 75 W figure. Of course there's always going to be a market for something that's cheap.

If you haven't noticed, there has been a bottleneck in semiconductor fabrication. Using fab capacity just to make a new 75 W GPU, when the current 75 W GPUs serve that market so well, and making less money in the process, makes zero sense from Nvidia's point of view.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
If you haven't noticed, there has been a bottleneck in semiconductor fabrication

You're a year behind the news cycle. Nvidia has reduced chip orders due to lower RTX 40 series GPU demand (maybe it has something to do with charging $800-$2,000 for a GPU).

And TSMC and Samsung both warned of chip production overcapacity.

 

DSzymborski

Curmudgeon Pursuivant
Moderator
You're a year behind the news cycle. Nvidia has reduced chip orders due to lower RTX 40 series GPU demand (maybe it has something to do with charging $800-$2,000 for a GPU).

And TSMC and Samsung both warned of chip production overcapacity.


Umm, it's not like buying a bag of chips at the store. There's a significant amount of lead time involved, even before considering development time.

In any case, I know the punchline here; you Want to Be Super Angry, so everything will be a reason that you're Super Angry.
 

JamesJones44

Reputable
Jan 22, 2021
665
597
5,760
How is this relevant today? What does it matter what a 980 Ti used back then? Who cares.

Today, in the UK, some people spend half their income on just electricity and heating.

But even in better-off families, people pay close attention to their electric bill. This includes computer usage.

If Nvidia, AMD, and Intel want to keep selling CPUs and GPUs in the half of the world where electricity prices have skyrocketed, they had better do something about their out-of-control power usage.

This is what happened with electricity prices:

[Chart: electricity price increases]

Then don't buy a 4K monitor for gaming.

You're asking for something that never existed and, for what you want, can't exist; that's the relevant point of my comment. High-end graphics cards haven't been sub-200 W TDP in almost 20 years. There are plenty of low-powered options if you are that worried about draw; just don't expect to play games on anything higher than 1080p on ultra. Expecting someone to make a magic semiconductor at peak silicon that is somehow going to give you 120 FPS at 4K while drawing only 150 watts is simply unrealistic.
 
  • Like
Reactions: Why_Me

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
Expecting someone to make a magic semiconductor at peak silicon that is somehow going to give you 120 FPS at 4K while drawing only 150 watts is simply unrealistic.

It is clearly more than possible to get higher performance at lower power than the 1650 from 2019, or even the 3050. The RTX A2000 has shown this.

I don't know where you get the idea that better silicon only benefits the high end.

It is Nvidia that does not want to update cheaper cards and expects everyone to fork out unprecedented amounts of money for a GPU.

View: https://www.youtube.com/watch?v=AvRDI1z-hh8
 
There are a few issues with this statement.

  1. TDP is a worst-case scenario; how often are you running your GPU at 100%? Pretty rarely, even when gaming. So unless you are training AI models, you will not be drawing 285 watts continuously.
  2. The 980 Ti and 780 Ti were 250-watt TDP cards and were considered reasonable for their performance at the time of release.
  3. Nothing that pushes 4K at decent frame rates is under 200 watts TDP. If you're not interested in 4K gaming, then there is no reason to buy a 4070 Ti, or any other high-end GPU released in the last two years, and worry about its power draw.
I'll say that my GPU is almost always at 100% when gaming and usually runs near 30% over its base TDP.
 

watzupken

Reputable
Mar 16, 2020
1,030
522
6,070
Objectively, the MSRP of the RTX 4070 Ti is more palatable than that of the RTX 4080 when compared to the MSRPs of the cards they are replacing, i.e. the RTX 3070 Ti and 3080. Nonetheless, given the market conditions, it is hard to shell out for a GPU starting at USD 799 before any tax. And for the price, this card is mainly targeting 1440p, while the RX 7900 XT is 4K-worthy with beefier specs, a larger cache, and significantly higher memory bandwidth for 100 bucks more.
 

JamesJones44

Reputable
Jan 22, 2021
665
597
5,760
It is clearly more than possible to get higher performance at lower power than the 1650 from 2019

You're making my point. The 1650 Super isn't even equal to a 980 Ti, which was released almost five years before the 1650 Super.

It's about use cases. The 1660 Super and the 3050 are solid for 1080p gaming on any setting; they are entry-level cards that solve the specific problem of an entry-level gaming rig. To get that same level of performance at 1440p you need to get up to 2080 Ti performance. Given the minimal node shrinks we've had over the last 10 years, I don't think you are going to get anywhere near that for under 150 watts TDP. Even the referenced 3050 is 130 watts, and it only matches a 1070. Remember, the 980 was on a 28 nm node and the 1650 Super was on a 12 nm node; that's a 16 nm difference. The 3050 is on an 8 nm node and only matches a 1070 in performance, so do you really think a 4 nm drop in size is going to get you to a 2080 Ti at 150 W TDP (if it's really even 4 nm, because TSMC describes their 4 nm node as more like an enhanced 5 nm node)? There is only so much that can be done with CPU design; the rest must come from smaller nodes. Intel proved this with their 14 nm-plus-infinity run.
 

Co BIY

Splendid
There is only so much that can be done with CPU design; the rest must come from smaller nodes. Intel proved this with their 14 nm-plus-infinity run.

I thought that Intel on 14 nm+ showed that sustainability/profitability sits at an optimization point involving the market, the competition, and the technology, and is not always on the bleeding edge.

Nvidia's highly successful and profitable 30XX cards were not on a leading node.