[SOLVED] If I find a 3070 that is available, should I suck it up and just buy it instead of hoping for a 3080? And a question about monitors

TornAsunder

Distinguished
Jan 30, 2015
11
0
18,510
My current system is using a GTX 980 Ti from about 6 years ago when I first built it. It's lasted me a while, but now I feel I need to upgrade everything entirely. My current struggle is that, like a lot of other people I'm sure, I want a new computer as soon as possible so that I can play Red Dead Redemption 2 and other games like Control and DOOM Eternal on ultra settings at 60+ FPS. I have yet to play many of the games released this year and last year because I didn't want to spoil the experience of playing them for the very first time on what is basically outdated hardware now, which mostly manages medium to some high settings in the games I currently play, on a 1080p monitor.

I know of somewhere that actually has RTX 3070s available, but I was really hoping to snag an RTX 3080, or perhaps wait until the end of May to see if any information comes out about the RTX 3080 Ti. But I'm sure a 3080 Ti, should Nvidia actually be producing them and planning to release them some time this year, would sell out quickly as well. I try to go for a computer on the higher end of things that I won't have to worry about upgrading for 4+ years, which is why I got the GTX 980 Ti back in 2015 when it released. Plus, I don't just play video games; I also like to occasionally mess around in DAZ3D to do (amateur) 3D rendering.

My questions for those who are more familiar with this stuff are as follows:
1. Is an RTX 3070 good for "future proofing" a computer so that I don't have to upgrade it for 4-6 years while still being able to play games at higher graphical settings with a 1440p monitor? Is it recommended that I upgrade to a 1440p monitor from my 1080p monitor once I finally get my new computer with an RTX card? Is an RTX 3070 reliable for ultra graphics gaming such as for some of the aforementioned video games?

2. Is an RTX 3090 "overkill" for gaming? Would it need one of those 4k monitors or would a 1440p monitor be fine? Because as of right now, I wouldn't even mind paying the extra for an RTX 3090 if I can even manage to find one of those available for MSRP.

3. How important is monitor resolution for gaming, anyway? Let's say I have a 1080p monitor and I'm playing a game on mostly ultra settings, and then I go to play that same game on a 1440p monitor where I have to lower a lot of the settings to maybe high and medium. Does the 1440p resolution still make the game look better graphically compared to 1080p, despite the graphical settings being set lower? I've seen how much some of those 1440p and especially 4K monitors cost, and I'm trying to understand why they are worth that price. It seems like playing a game on a 4K monitor would mean having a lot of settings set to medium, compared to a 1440p monitor which can run much higher graphics settings.

4. If Nvidia does release an RTX 3080 Ti, do we anticipate it being much better over a regular 3080 or are those Ti versions of the 980, 1080, 2080, etc., not actually that much different in the bigger picture of things?
 
Last edited:
Solution
It all depends on the settings you want to use and at what average frame rate.

Considering you're relating this to the newest and most demanding games:

If you want to continue to reach high frame rates (>=90 FPS) with highest settings and RT options enabled at high resolutions (>=1440p), then you are going to need at least an RTX 3090 or RX 6900 XT.

If you are okay with lowering some settings and disabling RT options, then an RTX 3080 or RX 6800 XT should be fine for the next few years at least.

The RTX 3070 is similar to the RX 6700 XT, and it already cannot do high FPS at >1440p with high settings,
and definitely cannot do even decent FPS with high settings and RT options enabled.

Bassman999

Prominent
Feb 27, 2021
481
99
440
You can use a 3090 on any monitor you want.
Just keep in mind most 4K monitors are 60 Hz, and you're still stuck at 60 Hz even at 1080p on them, unless you buy a really expensive monitor that goes to 120 Hz or higher.

The 3080 Ti will sit in the middle between the 3080 and 3090, with 12 GB of memory compared to the 10 GB of the 3080 and 24 GB of the 3090, and it will have more CUDA cores than the 3080.

Monitor resolution is for eye candy, and for reading text better on a larger screen. Also, a 32" 1080p monitor at 2 ft away on a desk won't look very pleasing, but would look fine at 6+ ft away.
For FPS gaming, refresh rate is generally more important than resolution.
Decide what size screen you need, then choose resolution based on that and on whether you need higher frame rates or better looks.
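To put those resolution jumps in numbers, here's a quick back-of-the-envelope sketch, using the rough rule of thumb that GPU rendering load scales with the number of pixels drawn per frame (real games don't scale perfectly linearly, so treat this as an approximation):

```python
# Pixel counts for common gaming resolutions; GPU load scales
# roughly with pixels rendered per frame (rough rule of thumb).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x the work of 1080p")
# 1440p works out to ~1.78x the pixels of 1080p, and 4K to exactly 4x.
```

So going from 1080p to 1440p asks the GPU for roughly 78% more work per frame, and 4K asks for 4x, which is why the same card ends up dropping settings as resolution goes up.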

The 3070 is a higher midrange card in the current line-up.
Yes, it will still work in 6 years, but it won't be getting 120 FPS at 1440p on the latest games by then. You will have to drop settings and likely go down to 1080p to keep frame rates over 100.
 
It all depends on the settings you want to use and at what average frame rate.

Considering you're relating this to the newest and most demanding games:

If you want to continue to reach high frame rates (>=90 FPS) with highest settings and RT options enabled at high resolutions (>=1440p), then you are going to need at least an RTX 3090 or RX 6900 XT.

If you are okay with lowering some settings and disabling RT options, then an RTX 3080 or RX 6800 XT should be fine for the next few years at least.

The RTX 3070 is similar to the RX 6700 XT, and it already cannot do high FPS at >1440p with high settings,
and definitely cannot do even decent FPS with high settings and RT options enabled.
 
  • Like
Reactions: Bassman999
Solution

andypouch

Prominent
Oct 16, 2019
64
9
535
My current system is using a GTX 980 Ti from about 6 years ago when I first built it. It's lasted me a while, but now I feel I need to upgrade everything entirely. My current struggle is that, like a lot of other people I'm sure, I want a new computer as soon as possible so that I can play Red Dead Redemption 2 and other games like Control and DOOM Eternal on ultra settings at 60+ FPS. I have yet to play many of the games released this year and last year because I didn't want to spoil the experience of playing them for the very first time on what is basically outdated hardware now, which mostly manages medium to some high settings in the games I currently play, on a 1080p monitor. I know of somewhere that actually has RTX 3070s available, but I was really hoping to snag an RTX 3080, or perhaps wait until the end of May to see if any information comes out about the RTX 3080 Ti. But I'm sure a 3080 Ti, should Nvidia actually be producing them and planning to release them some time this year, would sell out quickly as well. I try to go for a computer on the higher end of things that I won't have to worry about upgrading for 4+ years, which is why I got the GTX 980 Ti back in 2015 when it released. Plus, I don't just play video games; I also like to occasionally mess around in DAZ3D to do (amateur) 3D rendering.
RDR2 is a good game. But you sound like you're a game fanatic :ROFLMAO:. Get a life if you're too indulged in PC gaming; it's not good for your eyes in the long run to be in front of a monitor for 3 to 4 hours every day. This advice is for your own good. You're very picky too, but that's typical of hardcore gamers, it is understandable. My friend's newly built 5600X with an outdated 1050 Ti is still going strong, and he has no intention to upgrade. You should keep your GTX 980 Ti.

My questions for those who are more familiar with this stuff are as follows:
1. Is an RTX 3070 good for "future proofing" a computer so that I don't have to upgrade it for 4-6 years while still being able to play games at higher graphical settings with a 1440p monitor? Is it recommended that I upgrade to a 1440p monitor from my 1080p monitor once I finally get my new computer with an RTX card? Is an RTX 3070 reliable for ultra graphics gaming such as for some of the aforementioned video games?
There is no future-proofing in the hardware world. Today's GPUs are built for games 2 to 3 years old. Take GTA V for instance: you can now run it well over 60 FPS at 2K with the latest 3000 series of cards as well as AMD's 6000 series of GPUs. Based on this trend, any hardware, no matter whether 3070, 3080 or even 3090, will face at least one game it can't handle at a decent 60 FPS, and that could feel like the end of the world for some gamers; there are always such enthusiasts. They can be understood. But again, we're talking about a hobby: video gaming. Worst comes to worst, switch to consoles. Or else, switch to other hobbies like mining, which is what everyone seems to be doing these days. The difference is that with the latter you are gaining $$$, while with the former you are spending BIG $$$ :ROFLMAO:.
The 3070 is tested to suit 1440p gaming if you're not ultra picky about FPS. In fact, 60 FPS is perfect, and 29~30 is natural. By the way, do you know that movies run at around 24 FPS? That's also sometimes regarded as console-class performance. But if you're well-heeled and have lots of money to spend, you can buy a 3090; I'm sure Nvidia will welcome such a recommendation of mine. :LOL:

2. Is an RTX 3090 "overkill" for gaming? Would it need one of those 4k monitors or would a 1440p monitor be fine? Because as of right now, I wouldn't even mind paying the extra for an RTX 3090 if I can even manage to find one of those available for MSRP.
You're inviting envy from people, saying things like that! :LOL: Myself included. If ever I had the money to buy a 3090, I would not worry about anything at all! It has 24 GB of VRAM, dude; on a 1440p monitor it can last for years, just like the 2080 Ti and even the Titan did, and that's no exaggeration. Go for the RTX 3090 close to MSRP. There are still many rich folks out there! Maybe I should find more part-time jobs! :LOL::LOL:

3. How important is monitor resolution for gaming, anyway? Let's say I have a 1080p monitor and I'm playing a game on mostly ultra settings, and then I go to play that same game on a 1440p monitor where I have to lower a lot of the settings to maybe high and medium. Does the 1440p resolution still make the game look better graphically compared to 1080p, despite the graphical settings being set lower? I've seen how much some of those 1440p and especially 4K monitors cost, and I'm trying to understand why they are worth that price. It seems like playing a game on a 4K monitor would mean having a lot of settings set to medium, compared to a 1440p monitor which can run much higher graphics settings.
You shouldn't be worrying about such things unless you're considering a 3060 Ti or 3070. The RTX 3080 and RTX 3090 can do a pretty good job at 2K and 4K, especially the 3090, which you said you can afford, so just go for it, man!

4. If Nvidia does release an RTX 3080 Ti, do we anticipate it being much better over a regular 3080 or are those Ti versions of the 980, 1080, 2080, etc., not actually that much different in the bigger picture of things?
Mind you! If you have read the news, tensions are building up in the Far East. Over there, Britain, France, Germany, Australia, Japan and the US are having a hot party like Christmas in Piccadilly in London; naval ships are swarming the region, and a hot war might become a reality, you never know. Have you watched "The Day After" (1983)? If the allies go to war with China over the South China Sea disputes or Taiwan Strait unification, the whole landscape will change overnight. Taiwan will be burnt to ashes along with the TSMC plants. We will have no cards to play then. The RTX 3080 Ti is better than the 3080, but you never know what will happen when the sun comes up, do you? So until it is actually released, you should not assume it will be available for purchase.

If you ask me, I think now is not the best time to purchase any of the 3000 series or 6000 series cards. Intel is working on a new GPU; let's see how that goes. Also, the Nvidia 3000 series lacks sufficient VRAM for future games, except for the 3090. AMD provides more VRAM, which is good, but it is not very efficient at ray tracing, which handicaps its popularity among potential buyers. So if you could wait a little longer, that would be the wisest thing to do for now. A piece of advice: PC gaming is not a necessity, and it's becoming more and more of a luxury too. Tomorrow's hardware, if not produced in China, will get more and more expensive. The good old days when we enjoyed cheaper products are history. So it's time to turn the page and find another hobby. Just my two cents.