News Nvidia Shares GeForce RTX 4060 Performance Numbers

Here we can see why MS Flight Simulator is getting such a huge perf boost: frame generation tech.

And because the 2060/3060 don't support frame generation, the new RTX 40 series "frame gen" tech shows a lot of disparity in results.

[Image: NVIDIA GeForce RTX 4060 announcement slide]
 
nope, hence why i'm still running a 1060. as i said before, all cards, radeon or geforce, are anywhere from 500 at the lower end to 1000 at the top end too high. but people keep buying them, so the prices won't change.

Gotta give you props for that. Speaking with your wallet. Seriously. 👍
 
When nv starts pushing how much they'll save you in electricity costs as a selling point, you know how bad a deal the product is. They could just price the products more reasonably and perhaps more people would buy them.

My light bill went up $30 this month. Not sure if I need to blame the A/C or the 4090. 😆 😆
 
not at all, the mortgage, car payment, food, etc. come 1st before something like this.
the 4090 you bought, that is a mortgage and car payment for 1 month.
sorry to say, but those 2 things are more important than a vid card

I guess that depends on how much money you make... I bought a 4090 and still paid the car payment and the mortgage.

Life is too short to worry about being the richest person in the graveyard.
 
yes but the 70/80/90 are actual improvements of a new generation... the 60s are 50-tier hardware wearing the 60-tier name.
No, the 90 is the only one that is an improvement; every other card, including the 80, is named a tier above what it functionally is.

The Ada architecture tops out at 18176 cores, giving 95.42 Tflops of performance.
The Ampere architecture tops out at 10752 cores, giving 40 Tflops of performance.
Relative to those full-die specs, each architecture's cut-down cards give you the following:

4080 - 53.5% of the cores giving 51.1% of the Tflops
3070 - 54.8% of the cores giving 50.8% of the Tflops

4070ti - 42.2% of the cores giving 42% of the Tflops
3060ti - 45.2% of the cores giving 40.5% of the Tflops

4070 - 32.4% of the cores giving 30.5% of the Tflops
3060 - 33.3% of the cores giving 31.9% of the Tflops

4060ti - 23.9% of the cores giving 23.1% of the Tflops
3050 - 23.8% of the cores giving 22.7% of the Tflops

4060 - 21.1% of the cores giving 20.4% of the Tflops
3050 - 23.8% of the cores giving 22.7% of the Tflops

Functionally, each Ada card gives you the same share of the architecture's performance that a full tier below it gave you with Ampere; the 4060ti is even less than a full tier below!
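The tier math in the list above is simple division: each card's core count and TFLOPS over the full die's maximum for its architecture. A quick sketch reproducing a few of the percentages (the core counts and TFLOPS figures below are commonly cited spec-sheet numbers and should be treated as approximate):

```python
# Fraction of the full die each SKU offers, per the comparison above.
# Core counts and FP32 TFLOPS are approximate spec-sheet figures.
FULL_ADA = (18176, 95.42)     # full AD102: cores, TFLOPS
FULL_AMPERE = (10752, 40.0)   # full GA102: cores, TFLOPS

cards = {
    "4080":   (9728, 48.74, FULL_ADA),
    "3070":   (5888, 20.31, FULL_AMPERE),
    "4070ti": (7680, 40.09, FULL_ADA),
    "3060ti": (4864, 16.20, FULL_AMPERE),
}

for name, (cores, tflops, (full_cores, full_tflops)) in cards.items():
    print(f"{name}: {cores / full_cores:.1%} of the cores, "
          f"{tflops / full_tflops:.1%} of the TFLOPS")
```

The output matches the percentages quoted above to within rounding.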
 
4090 is the only 4000 series card worth the purchase. With no other card do you get a 60% performance increase over previous gen for only a 6% ($100) markup.

So many call it overpriced... I call it an amazing deal.

You couldn't be more right. Your posts on Tom's Hardware influenced me towards buying a 4090, and I haven't regretted it since.

If paying for "overpriced" GPUs is inevitable, I'd rather buy an overpriced technological masterpiece than an overpriced piece of trash that I'll be looking to sell in a few weeks.

People who love gaming and are looking to buy a new GPU, but don't have an abundance of money to spend, will do well to save their cash for the top-class player, which is the 4090, in order to ensure the longevity of their rig.

I know a lot of people won't agree with that, but I've come to realize that this is the best way to save money in the long term.

Trying to do your job with an insufficient graphics card will leave you unsatisfied. I've been down that road.

Compromises cost dearly.
 


I don't get the 11W video playback power... that's too much by today's standards. Mobile SoCs can play back every format at much less power; moreover, the 1060 was 8W. There is no excuse for this.
 
More like enough for a 1630 replacement lol

8GB is perfectly fine for most if not all modern games, provided you are using 2048x2048 textures. Most people go in and just slam the slider to "Ultra Death Metal", then complain when the stuttering begins. Instead, set it to High/Ultra, then manually dial down the texture slider, and suddenly everything works just fine; you won't notice any difference unless you're playing at 7680x4320. The bus interface speed is the real culprit: they deliberately nerfed the 4060 series with a slow bus to prevent it from competing with the higher SKUs, and they similarly nerfed those to prevent them from competing with the 4090. Basically, the entire 40 series product lineup is there to funnel you into buying the extremely inflated 4090.
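To put rough numbers on the texture-size point: VRAM cost grows with the square of texture resolution, so stepping the slider up one notch quadruples the memory per texture. A back-of-the-envelope sketch, assuming uncompressed RGBA8 at 4 bytes per pixel (real games use block compression, which cuts these figures by roughly 4-8x):

```python
# Back-of-the-envelope VRAM cost of one texture, uncompressed RGBA8
# (4 bytes/pixel). A full mip chain adds roughly one third on top,
# per the geometric series 1 + 1/4 + 1/16 + ... = 4/3.
def texture_mib(size):
    base = size * size * 4      # bytes for the top mip level
    with_mips = base * 4 / 3    # add the rest of the mip chain
    return with_mips / 2**20    # convert to MiB

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size):.0f} MiB with mips")
```

Doubling from 2048x2048 to 4096x4096 takes one texture from roughly 21 MiB to roughly 85 MiB, which is why an 8GB card runs out of room fast at the top texture setting.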

And the 4090 isn't anything new; previous generations had the same style of card, only it went by the name Titan, and we all made fun of those folks due to how insanely overpriced it was. We all commented on how much better value the lower-tier SKUs were. nVidia saw that and, coupled with the crypto price inflation, decided the best move was to remove the better-value cards and market the inflated Titan as the only real product.
 
I will say this again.

No point in counting the DLSS 3 performance boost here.
Agreed. IMHO, DLSS 3 is an entirely irrelevant metric for games, as it generates frames without the user having the ability to interact with them. So, what are smooth graphics worth if response latencies are skyrocketing?

DLSS 1/2 is another thing entirely, as it just upscales resolution without adding latency.
 
I guess that depends on how much money you make...
i can afford a 4090, but when the price of the card is 1000 over what it should be, it's not worth the price for me, regardless of the performance it gives. as i have said, these cards start at 2200 and top out at 2900. that's too much for a 4090; this is the price you should be paying for a 4090ti, or a titan.

to quote myself:
keep in mind, nvidia seems to have moved every tier up a level, and charges for that tier. the only card that seems to be named correctly is the 4090; the rest take the x in the 40x0 name and drop it down one level.
but at the same time the 4090 is priced as a 4090ti or a titan, so the whole rtx 40 series has been moved up a tier. IF nvidia were to release a 4090ti or titan, how much would those be?
 
You couldn't be more right. Your posts on Tom's Hardware influenced me towards buying a 4090, and I haven't regretted it since.

If paying for "overpriced" GPUs is inevitable, I'd rather buy an overpriced technological masterpiece than an overpriced piece of trash that I'll be looking to sell in a few weeks.

People who love gaming and are looking to buy a new GPU, but don't have an abundance of money to spend, will do well to save their cash for the top-class player, which is the 4090, in order to ensure the longevity of their rig.

I know a lot of people won't agree with that, but I've come to realize that this is the best way to save money in the long term.

Trying to do your job with an insufficient graphics card will leave you unsatisfied. I've been down that road.

Compromises cost dearly.

Indeed they do.

Appreciate the comments. You're welcome. :)
 


I don't get the 11W video playback power... that's too much by today's standards. Mobile SoCs can play back every format at much less power; moreover, the 1060 was 8W. There is no excuse for this.
Without knowing what was tested, the comparison may not be apples to apples. All it says is that it was tested using the AV1 codec, but it doesn't say at what resolution or whether all of the cards were tested using the same content.

As for mobile SoCs: if you're watching something in 720p and testing at 720p, it's not an equal comparison if the desktop GPU was tested using 4K/8K. There is a lot more data to decode at higher resolutions. You need an apples-to-apples comparison to claim that mobile SoCs use less power for video playback.
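The resolution gap is easy to quantify: decode work scales roughly with pixels per frame times frame rate. Illustrative arithmetic only:

```python
# Pixels per frame at common playback resolutions; decode work scales
# roughly with pixel throughput (pixels per frame x frame rate).
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "4K": (3840, 2160), "8K": (7680, 4320)}

base = 1280 * 720  # 720p as the reference point
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} px/frame ({w * h / base:.0f}x 720p)")
```

4K is 9x the pixels of 720p and 8K is 36x, so a desktop card decoding 4K/8K AV1 at 11W is not directly comparable to a phone decoding 720p.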
 
i can afford a 4090, but when the price of the card is 1000 over what it should be, it's not worth the price for me,

I personally don't think a Lamborghini should cost $250,000... but they do.

The $699 1080 Ti was 6 years ago... there is no way a top-tier card like the 4090 would cost $699 in 2023, given the way the market and the world economy have gone over the last 6 years.

Big pipe dream you got going on there.

As I said upthread... premium PC gaming should command a premium price. Even in a perfect world, the 4090 should be at minimum a $1200 card IMHO... but the world isn't perfect.
 
I still don't know why anyone turns DLSS on. Gaining 30% more frames in exchange for artifacts and a hefty increase of input lag / latency is NOT a good trade.
This is factually incorrect, partly because you're conflating two different things. And that's because Nvidia is making it easy to do so!

DLSS upscaling improves performance usually by around 30–50 percent, assuming the game is not limited. Along with the increased FPS comes lower latency. I turn on DLSS in games that support it all the time, because I've done real testing, with and without, and in most cases DLSS (especially at 1440p or 4K) looks roughly the same as native and performs a lot better.

DLSS 3 frame generation is the more controversial subject. I have played quite a few games now, with and without frame generation. Most of the time, it feels like a wash, even if the FPS counter suggests performance is 50% higher — and in a few cases it might even be 100% higher. If you go from 60 to 90 fps via framegen, it will still feel like 60 fps and you have added latency. If you go from 30 to 60 fps, it's even worse, because you get worse than native 30 fps latency. It's very noticeable IMO.

The problem is that Nvidia marketing always uses DLSS framegen and upscaling combined to promote the performance improvements of the RTX 40-series. Naturally, since RTX 30-series and earlier don't support framegen, that gives the 40-series a marketing advantage. But then you also get stuff like 4K using performance mode upscaling (1080p to 4K), plus frame generation. You absolutely don't get the same image fidelity with rendering 1/8 of the pixels and using AI to fill in the missing gaps, certainly not now and perhaps not ever. But you do get a lot more frames to the screen, if that's all you're after.
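The latency argument above can be put in rough numbers. Interpolated frames sit between two real frames, so the pipeline has to hold a finished real frame back roughly one base-frame interval before display, on top of the base render time. A deliberately simplified model (it ignores render queues, Reflex, and display timing, so treat the values as illustrative):

```python
# Deliberately simplified latency model for frame interpolation:
# one base frame time to render, plus one extra frame of hold time
# when frame generation waits for the *next* real frame to
# interpolate against.
def naive_latency_ms(base_fps, framegen=False):
    frame_time = 1000 / base_fps
    return frame_time * (2 if framegen else 1)

print(naive_latency_ms(60))         # native 60 fps: ~16.7 ms
print(naive_latency_ms(60, True))   # 60 -> ~120 "fps": ~33.3 ms
print(naive_latency_ms(30, True))   # 30 -> ~60 "fps": ~66.7 ms
```

Under this toy model, 30 fps boosted to "60" shows twice the frames but worse latency than native 30, which lines up with the subjective description above.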
 
i can afford a 4090, but when the price of the card is 1000 over what it should be, it's not worth the price for me, regardless of the performance it gives. as i have said, these cards start at 2200 and top out at 2900. that's too much for a 4090; this is the price you should be paying for a 4090ti, or a titan.

to quote myself:

but at the same time the 4090 is priced as a 4090ti or a titan, so the whole rtx 40 series has been moved up a tier. IF nvidia were to release a 4090ti or titan, how much would those be?

That's just it, the 4090 is a Titan, only with its name changed to dodge the "lulz overpriced" label that got attached to it during previous releases.


There was never a 2090, just the 2080ti then Titan RTX. The 30 series used a 3090 as the top end card instead of a Titan.


The Titan / xx90 halo products have always been outrageously priced, with some of the worst price/performance in history. They were the equivalent of the gold iPhone, for people who wanted to brag about how much they spent on their rig. What is shocking about the 40 series is how bad every other product is, almost like they are maliciously placed to make the golden iPhone look like a reasonable buy.



Model to model comparison shows how everything except the Titan/4090 got shifted up an entire product bracket. 4050 got labeled 4060 and so forth.
 
That's just it, the 4090 is a Titan, only with its name changed to dodge the "lulz overpriced" label that got attached to it during previous releases.

Hahah... yep.

What is shocking about the 40 series is how bad every other product is, almost like they are maliciously placed to make the golden iPhone look like a reasonable buy.

As said previously, the 4090 is the only 4000 series card worth the purchase: 60% performance improvement over the 3090... for a 6% ($100) increase in price.

What's the alternative? The lesser 4000 series cards that in some cases don't even outperform their 3000 series counterparts?

Call it whatever you want.... the 4090 still flat out demolishes 4K Ultra RT while not even breaking a sweat and that is exactly why I bought it.
 
Hahah... yep.



As said previously, the 4090 is the only 4000 series card worth the purchase: 60% performance improvement over the 3090... for a 6% ($100) increase in price.

What's the alternative? The lesser 4000 series cards that in some cases don't even outperform their 3000 series counterparts?

Call it whatever you want.... the 4090 still flat out demolishes 4K Ultra RT while not even breaking a sweat and that is exactly why I bought it.
The 3090 was $1200 MSRP. That's 60% more performance for a 33% increase in price.

You are thinking of the 3090 Ti at $1500
 