Discussion: NVIDIA Ampere RTX 3000 Series Discussion Thread (Updated Launch Specs)

Page 6 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

larsv8

Distinguished
Jan 24, 2013
The RTX 3080 outperforming the RTX 2080 Ti by a good 25-30% on average using launch drivers is no joke. A ~70% average generational jump from the RTX 2080 is what we expected, and it's what we got. Good. Really good.

Let's see how the RTX 3090 performs.
Yeah, can't wait to see the upgrade from my 1080 to a 3090.

It's going to be a monster performance jump, and I have been impressed with my 1080.

I really recommend everyone wait two generations to upgrade. Good bang-for-the-buck path.
 
You should probably just go with a 3080 if you care about getting the most "bang for the buck". Going by their specifications, the 3090 will likely be less than 20% faster than a 3080, but costs over twice as much.

That is, assuming you can actually find any of these cards at their advertised prices, since it's rumored that they will be in extremely short supply for months following their launch.

Edit: Fixed model number. >_>
 
If he is going for 4K, then I highly recommend going for the RTX 3090. For 1440p or 1080p, the RTX 3080 will do fine.
 
Finally we are getting 4K to the mainstream. Until now there was no card that could truly perform at 4K without compromise. The RTX 3090 is the GPU made to provide an uncompromised 4K experience.
I don't think a $1500+ card can in any way be considered "mainstream". It's a "Titan" going by a different product name. And I doubt that extra small bit of performance is going to keep the card relevant much longer than a 3080. It will already need to rely on DLSS upscaling to manage around 60fps in last-year's Control, and future games will undoubtedly get more demanding still. Whatever next-generation $700 card Nvidia releases a couple years from now will undoubtedly be faster than a 3090, making paying that premium now probably not the best option for anyone seeking value.
 
Well, the RTX 2080 Ti selling from $1,200-$1,500 and even higher was considered mainstream. Why not the RTX 3090? What's wrong with it? It is $300 more expensive than the RTX 2080 Ti's base price, and it's named into the RTX 3000 family. Why not consider it mainstream?
 
The RTX 4080 will definitely be faster than the RTX 3090, but that will be two years from now. In the meantime, the RTX 3090 will offer nearly the same experience. Even as its value depreciates over time, it will be worth the expenditure, because the owner gets that experience two years in advance of what the majority will be experiencing.

Purchasing an RTX 2080 Ti in 2020 was a bad move. But all the people who purchased one in 2018 or 2019 and are using it for 1440p gaming, which was a meaningful resolution to pick, did not overspend. They spent to get that smooth experience while the rest of the world was lagging behind. Do they need to upgrade now? No, they do not need to upgrade their GPU, unless they have it paired with a 4K monitor (a foolish move). RTX 2080 Ti users who spent high are fine keeping it, and can upgrade when the RTX 4090/4080 Ti rolls out.
 

larsv8

Distinguished
Jan 24, 2013
Bang for the buck is good advice for others, but not something I had in mind for my build. I've been saving for a while to build a juggernaut 4900x / 3090 system.

I am personally not satisfied with the 3080's performance. I'm going to be using an LG OLED CX 48, so I want to make sure I can do 120 Hz at 4K with all the extras like ray tracing.

I'll probably go with the 3090 Strix, assuming it reviews well. I'm somewhat concerned about that, since it has seemingly been pushed back a bit along with the 3080 variant.

I think it will give me 2+ years of top-of-the-line performance. The only thing that would majorly piss me off is if there ends up being a 3090 Ti.
 
May have to rethink the 3090 if this leak is true:

https://videocardz.com/newz/nvidia-geforce-rtx-3090-gaming-performance-review-leaks-out

Only 8-10% faster. I was hoping for 20-30%.

Is it possible that 3090 partner cards or overclocking provide a greater % increase than on the 3080s?
I would recommend waiting until official benchmarks are out, which will be before the GPU's release.

NVIDIA is marketing it as a GPU that can run 8K@60Hz. Unless it is at least 25-35% better in performance, they wouldn't say that.

Wait for official benchmarks.
 
Is it possible that 3090 partner cards or overclocking provide a greater % increase than on the 3080s?
It's possible that partner cards could get a little more performance out of the 3090. However, the same could be said about overclocked models of the 3080 as well, so the relative performance difference between them might not be too different. Assuming the 3080's 10GB of VRAM are not getting topped out in some scenario (which seems unlikely in today's games, even at 4K), the maximum possible performance difference shouldn't exceed 20%, since both cards use the same graphics chip, just with 20% more cores enabled on the 3090.
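That 20% ceiling can be sanity-checked with quick back-of-the-envelope math (a rough sketch using the published launch specs; real-game scaling will generally land below this theoretical bound):

```python
# Rough theoretical FP32 throughput for each card, from launch specs:
# CUDA cores x boost clock (GHz) x 2 FLOPs per core per cycle.
def fp32_tflops(cores, boost_clock_ghz):
    return cores * boost_clock_ghz * 2 / 1000

rtx_3080 = fp32_tflops(8704, 1.71)   # ~29.8 TFLOPS
rtx_3090 = fp32_tflops(10496, 1.70)  # ~35.7 TFLOPS

uplift = rtx_3090 / rtx_3080 - 1
print(f"Theoretical uplift: {uplift:.1%}")  # Theoretical uplift: 19.9%
```

So even on paper, with the 3090's slightly lower boost clock factored in, the gap tops out just under 20%.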

One thing potentially holding the 3090 back might be power delivery though, as it only has a 10% higher TDP than the 3080. With only 10% more power to work with, the 3090 likely isn't able to clock its cores quite as high under load, and there isn't much room to increase that without running into the limits of two 8-pin power connectors (or Nvidia's 12-pin, which is pretty much the same thing). So, I don't see at least the Founders Edition having any significant overclocking headroom. Apparently some partner cards will be utilizing three 8-pin connectors though, allowing them to draw over 400 watts of power, which could probably get a little more performance out of the card, though again, probably still less than 20% more performance than an overclocked 3080.
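For reference, the connector math works out like this (a minimal sketch using the PCI Express power limits: 75 W from the slot plus 150 W per 8-pin connector):

```python
# Maximum board power for a given connector configuration, per the
# PCIe electrical limits: 75 W from the slot + 150 W per 8-pin plug.
SLOT_W = 75
EIGHT_PIN_W = 150

def max_board_power(num_eight_pins):
    return SLOT_W + num_eight_pins * EIGHT_PIN_W

print(max_board_power(2))  # 375 -- little headroom above a 350 W TDP
print(max_board_power(3))  # 525 -- room for 400+ W partner cards
```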

And it's probably also worth pointing out that dumping that much heat into a case could potentially cause cooling issues. Even a stock 3090 has a 100 watt higher TDP than a 2080 Ti, putting out around 40% more heat. Your current GTX 1080 draws around 200 watts under load, so an overclocked 3090 could be putting out roughly twice as much heat.

NVIDIA is marketing it as a GPU that can run 8K@60Hz. Unless it is at least 25-35% better in performance, they wouldn't say that.
8K is four times the resolution of 4K, so it's clearly not doing native 8K at a playable performance level, unless perhaps we are talking about games from 10 years ago. Even if a card had double the performance of a 3080, that would not be enough to maintain native 8K at 60Hz in today's games at high settings. It sounds like Nvidia was referring to using DLSS upscaling to achieve "8K" output for games actually being rendered at resolutions around 1440p to 4K.
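The pixel math behind that (just resolution arithmetic; the fps projection at the end naively assumes performance scales linearly with pixel count, which real games only roughly follow):

```python
# 8K has exactly four times the pixels of 4K.
pixels_4k = 3840 * 2160   # 8,294,400
pixels_8k = 7680 * 4320   # 33,177,600
print(pixels_8k / pixels_4k)  # 4.0

# Naive projection: a card averaging 60 fps at 4K would manage only
# ~15 fps at native 8K if performance scaled linearly with pixels.
print(60 * pixels_4k / pixels_8k)  # 15.0
```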
 
They did say 8K@60Hz with DLSS enabled. I seriously am not expecting any GPU for at least another three generations, or even more, to run 8K@60Hz stably. But if they are claiming that, then I expect the RTX 3090 to be a level above the RTX 3080.
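For context, the per-axis render scales commonly reported for DLSS 2's modes show how far below 8K the game is actually rendered (the scale factors below are the widely cited values, not an official API; treat this as an illustrative sketch):

```python
# Approximate internal render resolution for each DLSS 2 mode at an
# 8K output, using the commonly reported per-axis scale factors.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

# Nvidia's "8K gaming" pitch leans on Ultra Performance mode:
print(render_resolution(7680, 4320, "Ultra Performance"))  # (2560, 1440)
```

In other words, "8K with DLSS" on this mode means rendering at roughly 1440p and upscaling 9x by pixel count.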
 

Phaaze88

Glorious
Ambassador
After looking at Linus's 8K gaming RTX 3090 video, I cannot accept that the RTX 3090 is just a 10-20% performance improvement over the RTX 3080. The performance jump cannot be that small.
I believe some reviewers already explained this. It's related to why the 3080 is amazing at 4K, 'just ok' at 1440p, and a real joke at 1080p:
View: https://youtu.be/csSmiaR3RVE?t=1637

(copy-pasted around the time Steve discusses this)
 
Yes. Unless that 10-20% increase is at 1080p, in which case why would anyone recommend pairing an RTX 3090 with a 1080p monitor? Well, 1440p makes a bit more sense, and to be honest the RTX 3090 is the GPU I recommend pairing with a 4K monitor. Finally, a SINGLE GPU that can push 4K in any game at Ultra (NO COMPROMISE).
 

Phaaze88

Glorious
Ambassador
Hey, if someone has the money, is only after the absolute best, and doesn't care about cost per frame, etc., no one can tell them how to spend it otherwise. I believe you've mentioned this a few times yourself.

As long as those folks refrain from trying to compare performance afterwards, because I'm just going to laugh.
 
Well, yes, performance/$ doesn't always scale well. But for what the top of the line offers, it may be worth spending on if the use case demands that extra performance. Like I said, the RTX 3090 is only for those who really want a NO COMPROMISE 4K experience over a longer period of time. Even though many would say the RTX 3080 is more than enough for 4K, it really isn't for practical, no-compromise, long-term usage without needing to upgrade from time to time.

RTX 3090 > 4K@60Hz
RTX 3080 > 1440p@120Hz
RTX 3070 > 1440p@60Hz
RTX 3070 > 1080p@120Hz
RTX 3060 > 1080p@60Hz

I am saying this with respect to demanding titles, not competitive titles like Fortnite, CS:GO, etc.
 
RTX 3090 8K is a joke. Even now, in a few limited games, it can hardly manage decent FPS. In the long run it will be useless. You won't be able to enjoy games for what they are.

I would prefer 4K Ultra settings over 8K Medium settings. Visually, 4K has more to offer than 8K in this case.
 

larsv8

Distinguished
Jan 24, 2013
If that is the case, then NVIDIA is screwing over a lot of consumers.

And most importantly, giving Big Navi a good chance to take the performance crown just like that.
I was really aiming to max out 4K 120 Hz with RT, etc., and the 3080 just doesn't get there, so I was planning on going 3090, paying the premium, and having the bee's knees for two years while maxing out my display.

But this just feels dumb.

This feels like I drop the cash for a 3090, stomach it for six months, then get totally butthurt when a 3080 Ti gets released.

I'm just praying the aftermarket cards end up being amazing, because this is really disappointing.
 
Let's see how reviews end up.
 

Phaaze88

Glorious
Ambassador
I'm just praying the aftermarket cards end up being amazing, because this is really disappointing.
With what's been released so far, the aftermarket 3080s don't have all that much going for them; they appear to be constrained by power limits.
IMO, all they offer is peace-of-mind lower thermals and varying aesthetics, but they cost more than the FE.
The FE produces perfectly acceptable thermals, plus its cooler design is slightly more efficient (dumping a little less heat inside the chassis), but the looks aren't going to be to everyone's taste. It's really competitive this time around.
 
