Question RTX 3070 - A bad purchase?

Diegoalc

Reputable
Jun 22, 2016
Hello there,

Last week I upgraded from a GTX 1070 to a Gigabyte RTX 3070 8G, and I was excited about it.

But.. I've seen a lot of comments claiming that 8 GB is enough for today's gaming.. but what about future releases?

I'm a bit concerned about it because the card cost around 1,200 USD in my country (Peru), and I upgraded because I wanted to max out future releases at 1080p ultra, with no plans to move to 1440p. Should I be worried? I'm enjoying titles like Modern Warfare at 1080p with all settings maxed, but will I have to upgrade my GPU sooner than I expected (4-5 years)?

By the way, I'm playing on an i7-8700K @ 4.9 GHz, 2x8 GB RAM, a 750 W Cooler Master Bronze PSU, and an Asus Z370-E motherboard.

Sorry for my bad English.

Thank you so much!
 

Diegoalc

Reputable
Jun 22, 2016
No one can predict 4-5 years out.
Either the game requirements, or your requirements.

Your current 3070 is near the top of the current GPU realm.

Use it, enjoy it.
Replace it when it no longer meets your needs.

And you'll likely need a replacement for the i7-8700 before a replacement for the GPU.
I think you're right. I only hope my GPU will be capable of handling future titles. Thank you for your answer!
 

Bassman999

Upstanding
Feb 27, 2021
If you don't mind nudging details down so present and future games can manage with 8GB, 8GB should be fine for quite a while. 8GB is only going to be an issue for people who refuse to compromise.
Yeah, it depends on your expectations.
Lowering details, and even dropping to 1080p, can keep you on the same card for a good while.
If you can't stand lowering your settings, you'll be upgrading every other year at least, no matter what card you buy.
 

Diegoalc

Reputable
Jun 22, 2016
If you don't mind nudging details down so present and future games can manage with 8GB, 8GB should be fine for quite a while. 8GB is only going to be an issue for people who refuse to compromise.
I don't mind playing with RTX on anyway, and I can turn other settings off too.. but is the AMD counterpart going to perform better because of the extra VRAM? If so, should I swap it for an AMD card?
 

USAFRet

Titan
Moderator
Mar 16, 2013
I don't mind playing with RTX on anyway, and I can turn other settings off too.. but is the AMD counterpart going to perform better because of the extra VRAM? If so, should I swap it for an AMD card?
You're obsessing over what happens 5 years from now with a device you've had for a week.
And, if I may add, a device that is in short supply and high demand.

Your 3070 is fine.
 

andypouch

Prominent
Oct 16, 2019
Your concern is well understood. You're not the only one asking this question, and there are quite a few answers to it.:D

Quite frankly, I envy you: you paid 1,200 for a graphics card, and that's nearly my monthly salary. I'm working part-time in a salon because of the pandemic.

You said, "But.. I've seen a lot of comments that claim that 8gb is enough for today gaming.. but how about future releases?" Where exactly did you read those comments?
Everyone does some research before buying a product that expensive. Most people google the information, some watch YouTube, others read forum posts, and a lot of gamers study reviews from sites like PCGamers.com. KOLs (key opinion leaders) on YouTube may make recommendations based on their personal experience or their so-called analysis. But mind you, some of them are part of the products' sales network, so their opinions should be taken carefully and weighed against your own assessment of their accuracy. Some KOLs recommend against buying the 3070 because of the VRAM limitation. Do they present you with hard figures, statistics, or benchmarks? Textual reviews from reputable websites are very often more authoritative and objective; their recommendations, if any, tend to be more rational and informative, and you're more likely to find the answers to your questions there.
I did, at one point, research which new 3000-series card I should get. That was before the scalping and chip shortage took hold, so I have some information I can share, and then I'll present my views.
There is one title I can think of that requires more than 8GB of VRAM: the game is called Ghost Recon: Breakpoint. (I wish to attach a screenshot, do you know how?) Anyway, the source I learned this from is a YouTube video. Skip to 08:45 and you will see the 3070 only yields 28 FPS at 4K. So, if you run it at 4K, that's the FPS you're getting. I considered buying an RTX 3070 not long after it was released, so I researched everywhere for the information I needed. It shows that higher resolutions demand more VRAM, which is quite simple to perceive.

You said, "I upgraded because I wanted to max out future releases @1080p ultra with no plans to upgrade to 1440p." Since you have no plans to upgrade to 1440p, and given that the above game only degraded on the 3070 at 4K, you're quite "safe" if you stick to 1080p. In fact, if you're willing to compromise, your 3070 can play many, many games well over 60 FPS at 1080p.
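To illustrate why higher resolutions demand more VRAM, here is a back-of-envelope sketch (my own illustration, not from any review). It only counts the swap-chain render targets at 4 bytes per pixel; real games allocate far more (G-buffers, depth, textures), so treat the numbers as a scaling argument, not a measurement:

```python
# Back-of-envelope framebuffer cost per resolution (RGBA, 4 bytes/pixel).
# Real games keep many more buffers plus all the textures, so actual VRAM
# use is far higher -- this only shows how cost scales with pixel count.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of a triple-buffered swap chain in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB")
```

4K has exactly four times the pixels of 1080p, so every per-pixel buffer costs four times as much, and texture budgets tend to grow alongside it.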

BUT, if you turn on ray tracing, you will have to budget for a certain percentage of FPS loss. You can use DLSS 2.0 to compensate for the loss, though; that's the flagship selling point of all RTX 3000-series cards. You may see some degradation of image quality, like blurriness. Since you've bought the card already, you can test it out and see if you're comfortable with the image quality it produces.

If I were you, I would wait for prices to come down, and also for other vendors. Have you heard that Intel is working on a dedicated GPU? The RTX 4000 series may be even better, though one will have to wait for it.

Since the whole 3000 product line currently provides only 8GB~10GB of VRAM on the majority of models (except the 3060, which gives us 12GB), games that demand more VRAM will expose these cards' deficiency. AMD seems able to foresee this trend and produces 12GB and 16GB models like the 6800 series and 6900 XT, as well as the latest 6700 series. Regrettably, they are not as competent as Nvidia in ray-tracing technology, so if ray tracing is your thing, you may rule AMD out when making a purchase decision.

In the next 4 to 5 years, what games will be coming out? You may refer to this list of ray-tracing games: https://en.wikipedia.org/wiki/List_of_games_with_ray_tracing_support

But in the PC gaming world there is more to games than ray tracing, so it is really hard to say. As many have pointed out in their replies to your OP, you can turn down the texture details of a game; much of the VRAM is consumed by intensive texture processing in modern titles. In my view, Nvidia is trying to see if VRAM speed can compensate for inadequate VRAM capacity across their product line, most notably the 3080 and the forthcoming 3080 Ti. I personally reckon the 3080 Ti is truly future-proof: it will be equipped with 12GB of GDDR6X VRAM, which should at least handle 1440p with plenty to spare.
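The "speed vs. capacity" trade-off above can be made concrete with a quick peak-bandwidth calculation: effective data rate per pin (Gbps) times bus width (bits) divided by 8 gives GB/s. The spec numbers below are the published figures for these cards; this is just arithmetic for illustration, not a benchmark:

```python
# Peak memory bandwidth: data rate per pin (Gbps) x bus width (bits) / 8.
# Published specs for each card; capacity in the comments for contrast.
def peak_bandwidth_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

cards = {
    "RTX 3070 (GDDR6)":   (14.0, 256),  # 8 GB
    "RTX 3080 (GDDR6X)":  (19.0, 320),  # 10 GB
    "RX 6800 XT (GDDR6)": (16.0, 256),  # 16 GB (plus Infinity Cache)
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(rate, bus):.0f} GB/s")
```

The 3080's GDDR6X pushes far more bytes per second than the 16GB Radeons, which is exactly the "fast but small" bet described above: bandwidth helps streaming, but it cannot stop a game from overflowing the capacity.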

There are many people who need just USD 120, one-tenth of USD 1,200, to spend on food and travel to work for the whole family each month, especially during this pandemic. If you still have the mood to play games and buy a USD 1,200 graphics card, it wouldn't be too difficult for you to pass the card on to family members or friends should you find it inadequate for future games.

If you ask me, I don't think this is the best time to buy either an Nvidia 3000 card or an AMD 6000 card. Both sides have unaddressed issues. For instance, AMD's answer to Nvidia's DLSS is not yet known, while Nvidia's cards lack sufficient VRAM to handle more texture-intensive games at 1440p and 4K. AMD's 6000 cards' raw power seems attractive, but if you think for a moment, you'll find them less capable in the ray-tracing department. So waiting for the next generation of cards would be my present choice.
 

andypouch

Prominent
Oct 16, 2019
In the worst-case scenario you will need to drop the texture quality from ultra to high, and the difference in visuals won't be noticeable.
Well said. Some well-made AAA games provide many options in the Video section for gamers to fine-tune performance. They usually come as a bundle of "Ultra" and "High" presets, and unless one inspects the image with a magnifying glass, one can hardly tell the difference.

Some games, however, are quite sensitive. It really depends on the game you're into. That's why I usually research how well the card I'm going to buy handles the game I'm going to buy before making a final decision. The process is painful and time-consuming but well worth it; buying both items hastily easily leads to regrettable mistakes.

But as you said, if worst comes to worst, dropping the texture quality will always save the day!:D
 

Eximo

Titan
Ambassador
Some of the Ultra features can even be distracting depending on the game type. I recall an old Crysis 2 issue that would make certain terrain types reflective and shiny when they certainly weren't supposed to be.
 

InvalidError

Titan
Moderator
Well said, some well-made AAA games provide many features within the Video section for gamers to fine-tune the game performance.
Nearly all PC games regardless of "A-ness" have graphics options that can be tweaked for taste and performance. Many browser and even idle games do too.

As far as I am concerned, I play games with whatever GPU I happen to have at the moment and sacrifice whatever details I need to in order to make them playable. If I like a game enough to want to see it again in higher detail, I'll play through it again with most settings nearly maxed out once I get my next PC.
 

Macif

Honorable
Oct 19, 2014
If you only plan to stick to 1080p, I'd argue you're pretty safe. Of course, nobody can predict what happens in the next 4-5 years, but as others have pointed out, you can always tweak the settings as needed.

I'm a 1080p guy myself, and my last computer used a GTX 970 for around 6 years. I didn't really notice anything bad about my computer's performance until late 2020, and that was a card with only 4GB, just to give you a comparison. To be fair, though, I had some other components that could take the blame there. I've got a 3070 myself now, in a new computer.

I'd personally argue there is a difference between necessity and enthusiasm. Some people are content using a card until it can't provide the results they expect anymore, while others consider it the end of the world if a game drops by 5-10 FPS, or feel the need to buy every new card that comes out. I recall a thread on another forum where some people argued that 8GB wasn't enough for a card in 2020 unless you were willing to game at 1080p. At the end of the day, to each their own, as they say.

And as stated before, ray tracing will drop the FPS a bit, so it also depends on what features you consider a «must» for your enjoyment. Personally I don't see the big deal with ray tracing, but again, that depends on the person.

For instance, I recently played RE7: Biohazard, and on one level I noticed brief freezes when I was in a dark area with shadows. After looking it up, I found out the game used a special setting to render shadows, and a few people reported similar issues. The solution was to disable it, since it apparently used up the card's memory. Did it change the shadows or the game for the worse? No, it felt exactly the same, except without the brief freezes. Is the 3070 a bad card because it couldn't keep up? I don't feel that way. Instead, I don't understand why the game had such a setting to begin with, when the regular shadow option worked just as well.

If I remember correctly, at 1080p the computer depends more on the CPU than the GPU to begin with. So if the 3070 works, I'd say just keep it. It ain't easy to come by the new cards, and the prices ain't doing anybody a service either.
 

InvalidError

Titan
Moderator
If my memory isn't wrong, at 1080p the computer depends more on the CPU than the GPU to begin with.
It always depends on the game. Lightweight games like CSGO reach crazy frame rates as far as the CPU is concerned; they only get their "CPU intensive" tag because competitive players are obsessed with 300+ FPS for faster response times, which would explode most games' CPU requirements. Conversely, pushing graphics settings to ludicrous levels can drag high-end GPUs to their knees even on a relatively low-end CPU.

The only absolutes are your minimum playable performance and graphics quality requirements.
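The CPU-bound vs. GPU-bound point above can be sketched with a toy model (my own simplification, not a profiler): each frame takes roughly as long as the slower of the two stages, so whichever stage is slower caps the frame rate.

```python
# Toy model of the CPU/GPU bottleneck: a frame is ready only when both
# stages finish, so fps is capped by the slower one.
def fps(cpu_ms, gpu_ms):
    """Frame rate when CPU work and GPU work take the given ms per frame."""
    return 1000 / max(cpu_ms, gpu_ms)

# Lightweight game: the GPU finishes quickly, so the CPU sets the cap.
print(f"{fps(cpu_ms=3.0, gpu_ms=1.0):.0f} fps (CPU-bound)")
# Same CPU, heavy graphics settings: now the GPU sets the cap.
print(f"{fps(cpu_ms=3.0, gpu_ms=12.0):.0f} fps (GPU-bound)")
```

Raising resolution or detail only grows the GPU term, which is why the same machine can be CPU-bound in CSGO at 1080p and GPU-bound in a heavy AAA title at 4K.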
 

Macif

Honorable
Oct 19, 2014
It always depends on the game. Lightweight games like CSGO reach crazy frame rates as far as the CPU is concerned; they only get their "CPU intensive" tag because competitive players are obsessed with 300+ FPS for faster response times, which would explode most games' CPU requirements. Conversely, pushing graphics settings to ludicrous levels can drag high-end GPUs to their knees even on a relatively low-end CPU.

The only absolutes are your minimum playable performance and graphics quality requirements.
Good to know; I've learned something more about the subject myself. And I couldn't agree more with your last point.
 

andypouch

Prominent
Oct 16, 2019
Macif: " It ain't easy to come by the new cards and the prices ain't doing anybody a service either. "

The OP is very fortunate, as are those who own a 3070 at this difficult time when crypto mining is booming. I saw a retailer shop that had put together a custom rig with six MSI 3080s in place! I don't know exactly how much profit they're making apart from their core business, but that's where you can see how crazy the trend is. Also, TSMC has delayed its deliveries to AMD and possibly Nvidia (not fact-checked yet), so the availability of graphics cards is not clear yet. TSMC has decided to invest 1,000 billion to rebuild its production chain across the globe, which will definitely take some time to complete. Also unclear is how the US will react to China's threats against Taiwan. In the event of a war that burns the island to ashes, we might not have any graphics cards to play with.:unsure:

So unless you can no longer bear the performance of your 3070, better to keep it. As long as you stay with your 1080p preference, you will be quite happy with what you are getting. In fact, getting used to a certain resolution is simply a matter of 3 weeks of using it; once you do, you will love it, and it opens up more options too. For some time I thought about getting a 3080. I am using a 1080p monitor; in fact, I just purchased a BenQ EX2510 last week, a 25". Moving from a 22" to a 25" at the same resolution is a great, great experience. Add the emulated HDR, a pretty decent 8-bit color gamut, and a fast 48~144Hz response, and I have no worries about games staying above 60FPS, except a few titles like Watch Dogs: Legion. Given the power of the 3080, I thought it would be an overkill choice, so I gave it up.

If both the 3070 and 3080 can't handle a game, the coding is to blame. Crysis 2 and Crysis 3 repaint all the ground polygons on every refresh; that's why even with a modern GPU like the 3080, you still won't get very high FPS figures. But look at The Division 2: my current rig can produce a fluent 60 FPS without any issues or artifacts. It all comes down to the coding and how optimized the game is at the end of the day. Today's games still use the CPU to calculate the scene and construct each frame, while the GPU "paints" the final image. Given this chain of processes, you don't really need a 3070 to play at 1080p; in fact, a 3060 Ti may very well be the sweet spot. A high-end 3060 Ti can be overclocked to 3070 performance, like the ROG STRIX version, which has a very high power limit and therefore headroom for extreme overclocking.

But as I said before, the 3000 series still lacks the right VRAM configurations, and AMD has its problems with RT, so it's not the best time to purchase either Nvidia or AMD cards at this point. But as you already own one, keep it for at least 5 to 6 years. Maybe after 5 years your interest in PC gaming will diminish. You might become a home miner:ROFLMAO:

My two cents.
 
