[SOLVED] Can we talk about the 3070 RTX and 8GB of VRAM for 3440x1440 res gaming in 2020?

Dec 16, 2020
Hello!

I am specifically looking for information and opinions from people who monitor VRAM usage in 2020 titles... Preferably on Ultrawide 3440x1440 resolution. Would you say the 8GB offered on the 3070 is enough for 1440p Ultrawide gaming going forward?

I have a 3440x1440 monitor, which is not your typical 1440p setup - the pixel count is about 34% higher than standard 2560x1440. The 3070 was marketed as a perfect "1440p card", which of course I took at face value, not really minding the fact that I am running ultrawide 1440p and not standard 1440p.

The problem is that I am afraid the 3070's 8GB of VRAM will not be future proof at all for this resolution. What is the point of having this card if I cannot use its full potential because the VRAM fills up with the larger textures this resolution demands? Am I just panicking here and dead wrong, or am I actually correct that this is a limiting factor? What are the maximum VRAM requirements for 3440x1440 in current games?

I understand games like RDR2 use only ~6.2GB of VRAM at 4K - but that is an older title already, even if the visual fidelity is actually insane. I am not going 4K... but 3440x1440 still requires a lot, more so than regular 1440p.

I didn't monitor VRAM usage in RDR2, as I did not have any problems, but I did notice problems in Cyberpunk 2077. How can we even talk about future proofing if I can ALREADY see my card capping out in this game at 7.9GB, at which point GPU usage suddenly spike-drops and I get stutters for 1-2 seconds?
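(In case anyone wants to reproduce this kind of logging, here is a minimal sketch of my own using the NVML bindings - it's just one way to sample VRAM use, the one-second interval and GPU index 0 are example choices, and it reports total allocated VRAM on the GPU, which is not exactly the same as what the game strictly needs.)

```python
# Minimal sketch (my own example, not anything from the game): poll dedicated
# VRAM usage once per second so you can see where it peaks during a session.
# Needs the NVIDIA driver plus the NVML bindings: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0; change the index if you have several

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes used/total for the whole GPU
        print(f"{time.strftime('%H:%M:%S')}  {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

The same thing can be done from a terminal with nvidia-smi --query-gpu=memory.used --format=csv -l 1, which is what I was watching while playing.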

I understand Cyberpunk is poorly optimized, etc. But I sincerely doubt they will somehow optimize it to use less VRAM. It seems to me that, plain and simple, this card is mostly a 1080p gaming card, or at most standard 1440p.

So... 8GB... Is it actually enough, as people were saying when the cards were revealed? Personally, I just dismissed the people who criticized the 8GB figure as being elitist. But is that really the case? Honestly, the VRAM size didn't change from my 1080, which also has 8GB. Naturally, I am running more GPU-intensive settings on a 3070 than on a 1080, which is why more VRAM is needed... right?

Honestly, I did not expect this to be an issue, since I didn't think Nvidia would be so bold as to skimp (intentionally or not) on VRAM for a card that is capable of running modern titles at 3440x1440 quite well on higher settings. Yet, if the VRAM caps out... what's the point of that power?

Am I just being too demanding of this card? I honestly had no other option to upgrade my GTX 1080 going into new-gen titles, as the only cards available were 3070s... I haven't seen any 3080s in stock yet, but there are plenty of 3070s in my area.

I could perhaps sell my 3070 for around the same price I bought it for, or even a tad more. Yet I have no way of getting a 3080 right now if I were to replace my 3070. I wasn't really that much of an elitist about going for the 3080, since I don't attempt 4K and I can deal with somewhat lower frames given the 3070 is half the price in my area...

Perhaps I should look at a used 2080 Ti in that case? The 11GB of VRAM would probably be more than enough for my needs... and the performance is about the same as a 3070? I do like RT on, though. From what I understand, the 30 series cards have better RT performance...?
 
Solution
According to rumor, the 3080 Ti will have an MSRP of $999, which will most likely drop the price of the 3080, and that's IF they don't drop the 3070/3080 altogether in favor of 3070/3080 Super models, the way they did with the 2070/2080 models.

Trying to predict the future is a fool's game at any level. I'd not have bought my 2070 Super if I had known that in just 8 short months, with the release of the 3070, I'd get a clearly superior card for the same price, basically going from a 2070S to a 2080 Ti-level card.
Dec 16, 2020
I guess I'll answer my own question after some research:

No, most likely not, as it is already not enough for certain titles, as shown in some benchmarks.
 
A lot comes down to how important it is to you to max out game settings. I'm running a 3080 at 1440p 144Hz, but I don't use max settings in many games so I can average 120+ FPS. I don't think I've seen VRAM usage over 6GB in my games. I find turning down a few settings makes nearly no difference to visuals but a noticeable difference to performance.
 
Dec 16, 2020
A lot comes down to how important it is to you to max out game settings. I'm running a 3080 at 1440p 144Hz, but I don't use max settings in many games so I can average 120+ FPS. I don't think I've seen VRAM usage over 6GB in my games. I find turning down a few settings makes nearly no difference to visuals but a noticeable difference to performance.
Are you 3440x1440 though? It's about 34% more pixels than regular 1440p, which I imagine draws more VRAM as well. I did turn down a lot of the settings in Cyberpunk, but it seems that no matter what, having RT on draws like 1-2GB of VRAM on its own (if not more).

I've seen similar reports in benchmarks for other titles too; look at Hardware Unboxed, who Nvidia even tried to spank hard for talking some spicy stuff about them.
 
just don't use ultra. problem solved. i see some people who want to use ultra just for the sake of it rather than being practical/realistic about it, like "what's the point of playing at 4k if i did not max out everything?" or "i did not buy this hardware for x amount of money so i can turn down some settings". and then sometimes i see people really try to push the graphics real hard so the game will use a massive amount of VRAM to prove their point. in fact the game can look just as good as ultra using high or very high.
 
Dec 16, 2020
Well, the only options for textures are low/medium/high. I am not unwilling to compromise on other settings, but I did follow the Digital Foundry/Tech City optimal settings from their analyses, so everything is not cranked to max. I suppose I should try the medium preset + RT on.
 
Maybe it is too little, maybe it's enough. In the end, since you already bought it, there is no point discussing it. If you are happy with the performance, you just need to accept that someday you will have to turn down more than one setting. If you are not... sell it and wait for the next release with better performance and more VRAM.
 
Dec 16, 2020
Well, I am simply discussing the implications of a 3070 at 3440x1440. It's clear to me that the card was advertised as a 1440p card, but it is not. By starting this discussion, I hope other people will avoid the same mistake I made.

But whatever. It's not like you can get these cards easily atm anyway. I might go for a 3070 Ti once they come out.
 
It is a 1440p card and a very good one. Cyberpunk is a VERY unoptimised game and should not be considered the norm.
The fact that games don't currently need more than 8GB of VRAM for 1440p doesn't mean that future AAA games won't need more. It's like back when the 1060 came out and Nvidia claimed it was perfect for 1080p gaming. It was. It's not anymore.
 
Are you 3440x1440 though? It's about 34% more pixels than regular 1440p, which I imagine draws more VRAM as well. I did turn down a lot of the settings in Cyberpunk, but it seems that no matter what, having RT on draws like 1-2GB of VRAM on its own (if not more).

I've seen similar reports in benchmarks for other titles too; look at Hardware Unboxed, who Nvidia even tried to spank hard for talking some spicy stuff about them.
I'm running regular 1440p, not 3440x1440, and I'm aware of the uplift in pixels, but it doesn't directly translate into increased VRAM use, although there will be some increase. My point, however, is that you can usually reduce game settings with minimal impact on image quality while significantly reducing GPU resource requirements.

The thing with Hardware Unboxed seemed to have nothing to do with VRAM; it was more that they predominantly use games without ray tracing in their reviews. It seemed Nvidia wanted them to push RT and other features which still don't apply to the majority of games.

Is 8GB enough for 3440x1440? My guess is that if you want the highest possible settings at 60fps, then yes, I can believe you are finding it limiting. If you had been aiming for 100-120fps at medium settings in most games, you would more likely have no issues. I do think Nvidia have been a little tight on VRAM, but adding another 2GB while maintaining margins could have put an extra £50/$50 on the RRP. It may also have worsened supply problems if each card needed 25% more VRAM.
 
Dec 16, 2020
I'm running regular 1440p, not 3440x1440, and I'm aware of the uplift in pixels, but it doesn't directly translate into increased VRAM use, although there will be some increase. My point, however, is that you can usually reduce game settings with minimal impact on image quality while significantly reducing GPU resource requirements.

The thing with Hardware Unboxed seemed to have nothing to do with VRAM; it was more that they predominantly use games without ray tracing in their reviews. It seemed Nvidia wanted them to push RT and other features which still don't apply to the majority of games.

Is 8GB enough for 3440x1440? My guess is that if you want the highest possible settings at 60fps, then yes, I can believe you are finding it limiting. If you had been aiming for 100-120fps at medium settings in most games, you would more likely have no issues. I do think Nvidia have been a little tight on VRAM, but adding another 2GB while maintaining margins could have put an extra £50/$50 on the RRP. It may also have worsened supply problems if each card needed 25% more VRAM.

You may be right, and I can see the devil's advocate position from that POV. But coming from a GTX 1080, also with 8GB, I find it a bit concerning that a current-gen card is released with 8GB. In the EU we are already paying a lot for this card. From Nvidia's POV the 8GB decision makes sense... from the consumer's POV? It's quite harsh. I would let it slide, but it's not even the faster memory found on the 3080/3090. I do tune the settings down, but it's way too easy to push this card over its limit IMO, especially since it's VRAM, something you can't boost/overclock no matter how much you want to.
 
You may be right, and I can see the devil's advocate position from that POV. But coming from a GTX 1080, also with 8GB, I find it a bit concerning that a current-gen card is released with 8GB. In the EU we are already paying a lot for this card. From Nvidia's POV the 8GB decision makes sense... from the consumer's POV? It's quite harsh. I would let it slide, but it's not even the faster memory found on the 3080/3090. I do tune the settings down, but it's way too easy to push this card over its limit IMO, especially since it's VRAM, something you can't boost/overclock no matter how much you want to.
I do see your point and to some extent agree.
 
Seeing as there are many different opinions on this issue, and I'd rather have more than I need instead of not enough or just barely enough (and who knows, maybe I have more money than sense), I have put off buying an RTX 3080 and am patiently waiting for the not-yet-officially-announced (but quite likely) RTX 3080 Ti with 20GB of VRAM. I play MSFS 2020 with VRAM-hungry mods at 4K, so a GPU with a lot of VRAM makes a lot of sense for someone like me.

But to answer the broader question you asked, it's up to you, your needs and your experiences. Is your GPU making you happy the way you want to use it? If not, are you willing to make quality compromises to get better performance? If not, then you should upgrade: wait for an RTX 3080 to become available and then sell your RTX 3070 for a minor loss.
 
Dec 16, 2020
Seeing as there are many different opinions on this issue, and I'd rather have more than I need instead of not enough or just barely enough (and who knows, maybe I have more money than sense), I have put off buying an RTX 3080 and am patiently waiting for the not-yet-officially-announced (but quite likely) RTX 3080 Ti with 20GB of VRAM. I play MSFS 2020 with VRAM-hungry mods at 4K, so a GPU with a lot of VRAM makes a lot of sense for someone like me.

But to answer the broader question you asked, it's up to you, your needs and your experiences. Is your GPU making you happy the way you want to use it? If not, are you willing to make quality compromises to get better performance? If not, then you should upgrade: wait for an RTX 3080 to become available and then sell your RTX 3070 for a minor loss.

We'll have to see. I suppose 8GB should be enough until a 3070 Ti comes out? I am wondering if that will even happen. We know for sure a 3080 Ti will happen, but that will be twice the price of the 3070. I doubt I will slap that much money on a video card, tbh. Though a 3080 or 3070 Ti would be neat. I know for a fact I could sell my 3070 for probably even more than I bought it new... but I won't get a decent 3080 deal anytime soon, I bet. I'll just be on the lookout.

Also, there seem to be some memory leaks in Cyberpunk, as I found in some more testing. It may be that 8GB is enough for that title at the settings I'm running. Either way, it's clear my setup needs more than 8GB of VRAM. Does anyone have an idea of how much VRAM titles eat up at 3440x1440 compared to regular 1440p or 1080p? Are the differences huge?
 

Karadjgne

When it was first released, the 1080 was basically top of the line for enthusiast-class mainstream cards, back when VRAM usage was chump change and 1080p would run all day long on 3GB in any game.
The 3070 isn't top of the line, it's below the middle: 3060 Ti, 3070, 3080, 3090... and that's not counting the possible release of a 3080 Ti.

The 3060 Ti is a good 1440p card. The 3070 is a better 1440p card, roughly equivalent to a 2080 Ti, so it can also handle 4K. The 3080 is as good as it gets for 1440p, but also does better than well at 4K. The 3090 is best, but not worth the upgrade from a 3080.

The issue with presets like ultra is that they don't just contain GPU settings, they also contain CPU-bound settings like viewing distance, grass detail, lighting effects, etc. Those settings affect the placement of objects, AI for NPCs, mechanics, physics and so on.

So set ultra if you wish, that'll for sure maximize GPU details, meshes, textures, maps etc., but also customize in-game settings to lower the CPU-bound ones, which gets you the best fps at the best visuals.
 
Dec 16, 2020
When it was first released, the 1080 was basically top of the line for enthusiast-class mainstream cards, back when VRAM usage was chump change and 1080p would run all day long on 3GB in any game.
The 3070 isn't top of the line, it's below the middle: 3060 Ti, 3070, 3080, 3090... and that's not counting the possible release of a 3080 Ti.

The 3060 Ti is a good 1440p card. The 3070 is a better 1440p card, roughly equivalent to a 2080 Ti, so it can also handle 4K. The 3080 is as good as it gets for 1440p, but also does better than well at 4K. The 3090 is best, but not worth the upgrade from a 3080.

The issue with presets like ultra is that they don't just contain GPU settings, they also contain CPU-bound settings like viewing distance, grass detail, lighting effects, etc. Those settings affect the placement of objects, AI for NPCs, mechanics, physics and so on.

So set ultra if you wish, that'll for sure maximize GPU details, meshes, textures, maps etc., but also customize in-game settings to lower the CPU-bound ones, which gets you the best fps at the best visuals.

I would agree with you, but the 1070 and 1080 were both 8GB cards back in the day... Do you mean to tell me that the 1070 = 3060 Ti now, or what? The 3090 is the new Titan, the 1080 still maps to the 3080 (there was also a 1080 Ti), and the 3070 = 1070. But as I've said, the VRAM is holding this card back; I can clearly get into game scenarios where it becomes a limiting factor, as shown here for example:



You can see in the benchmark what I mean. The 2080 Ti has more VRAM, and it doesn't have that problem. What is seen in the video is exactly what I recreated as well.

This card is vastly limited by the VRAM, especially if you want to game at 1440p Ultrawide or 4K (but let's be honest, people are going 4K with the 3080 at least). Still, as you can see, if the card treads the VRAM limit for some titles already... it's not future proof at all. I will definitely be selling it at the first opportunity.

And how would you explain the rumors of the 3060 having 12GB of VRAM? This isn't confirmed, and I sincerely doubt that's the case; it's probably 8GB like the other cards. But I don't want to live in a world where the 3060 has more VRAM than the 3070. That would be completely bonkers.
 
The problem is that you expect too much from a GPU that is meant to be mid-tier. If there were no 3000 series and you bought a GPU from the 2000 series with the same amount of money you spent on the 3070, what would you get?

I do agree that the 3070 and 3080 should each have 2GB more VRAM, but they do not. You knew that before you decided the 3070 was the GPU to go for.
 
Dec 16, 2020
The problem is that you expect too much from a GPU that is meant to be mid-tier. If there were no 3000 series and you bought a GPU from the 2000 series with the same amount of money you spent on the 3070, what would you get?

I do agree that the 3070 and 3080 should each have 2GB more VRAM, but they do not. You knew that before you decided the 3070 was the GPU to go for.

IMO I'm not expecting too much from this card - especially not if the card is literally throttling itself due to not having enough memory to work with. What will you say if the 3060 does have 12GB of VRAM, if the rumors are true (albeit slower memory)? That Nvidia is providing too much memory for such a lower-tier card?

I chose this card because it was the only option in my region. There was literally no choice other than maybe the 3060 Ti. It was that or stay with my 1080. I chose to get a new card because I can resell it later (I wouldn't need to if it didn't have a laughable amount of VRAM).

I am just pointing out that this card has a severe flaw if you game above 1080p. It's not about me or about what I want. It's about the card being limited by the VRAM in some scenarios, because by design the card is doomed for future titles that require 7-8GB+ of VRAM; there has to be at least some headroom to work with. It's absolutely bonkers if the card is capable of running titles at 50-60 fps - which is quite acceptable if you aim for fidelity at 60fps rather than performance closer to 144 - but then stutters and drops frames once the VRAM runs out... It just means you're being deprived of the potential to run the game in a stable manner because of the VRAM. Whether you tune down settings or deal with it another way, it doesn't change the fact that this shouldn't be the case for such a card in 2020. Not by Nvidia standards.

All these years, for the longest time, Nvidia gave lots of leeway on VRAM - the 1070 and 1080 both had 8GB. You know the 1070 and 1080 were basically a choice between cards for people who want more FPS; there was no difference in VRAM. The 1080 Ti was the high-tier one. The difference is, those cards don't have enough raw power to actually run out of VRAM, as the FPS will probably become unplayable first. That isn't the case with the 3070.

Hell, if you look at benchmarks, even some older titles push these VRAM limits at 3440x1440 (which is technically still 1440p).

For the 3070, you will basically need to tune down settings to find a sweet spot so that the card isn't limiting itself. I guarantee there will be situations where you need to hunt for the right settings and potentially even drop textures to lower settings, all because Nvidia skimped on the memory. I guarantee that if there were more VRAM headroom on the 3070, it would not have the stutters shown in the benchmark I linked, for example.

The stutters will be BAD in big, vast open-world titles such as Cyberpunk. I've already tested it with settings tuned DOWN for this title, and it can easily go over 8GB of VRAM at 3440x1440. But there are also memory leaks, so I cannot confirm this 100%; it may happen regardless of hardware if the memory is indeed leaking. I am not denying that 8GB could be standard and playable for optimized titles in 2021. But I sincerely doubt you will run out of performance first, rather than memory, on this particular card...

All in all, I don't mind having this card personally. I can be satisfied until I find a more suitable and future-proof option. I do mind when people yell that you're being elitist, that games won't need that much VRAM for at least 1-2 years, etc., when it's factually incorrect. And I do care when Nvidia becomes more and more bold with these kinds of moves on new-gen GPUs. I don't believe there has been a generation yet where the memory treads the line this hard on the 70/80 cards. I would have made the shift to AMD this generation if not for the ray tracing advancements by Nvidia. Funnily enough, RT settings often eat even more VRAM, so this card is literally better off running with RT off... even though the 30 series is marketed and praised for its RT/DLSS. Laughable!
 
I chose this card because it was the only option in my region.
No one forced you to buy the card at that moment, though. You could have waited like almost everyone else did and gotten a 3080, 3080 Ti or 3070 Ti in a few months, with better performance and more VRAM.

What will you say if the 3060 does have 12GB of VRAM, if the rumors are true (albeit slower memory)?
I highly doubt that it will have 12GB of VRAM, but even if it does, it's not like I agree with Nvidia. They made whatever choice they made. It's up to you to decide what you will buy. I am neither a representative nor a fanboy of either company.

Hell, if you look at benchmarks, even some older titles push these VRAM limits at 3440x1440 (which is technically still 1440p).
The benchmarks were done on very unoptimised games, I believe we can agree on that. Almost every Ubisoft game is crap when it comes to optimisation, just like Crysis is and now Cyberpunk. I recently tried Far Cry 3 on my PC (it was a present from a friend) and it ran worse than Far Cry 5, which I own. That was at real 1440p, not ultrawide. That is my other point: I don't think ultrawide 1440p is still 1440p. It has 34.375% MORE pixels than normal 1440p. It's like saying that two 1080p monitors with a total resolution of 3840x1080 are still 1080p.

For the 3070, you will basically need to tune down settings to find a sweet spot so that the card isn't limiting itself. I guarantee there will be situations where you need to hunt for the right settings and potentially even drop textures to lower settings, all because Nvidia skimped on the memory. I guarantee that if there were more VRAM headroom on the 3070, it would not have the stutters shown in the benchmark I linked, for example.
I agree, but again, that's what I have been saying from the start. You could have waited and seen how things were at your resolution with that card instead of just buying whatever was available at the time. It was your choice. Nvidia has not given the 3070 enough VRAM, but NO ONE is forced to buy it and, of course, not everyone will use it to play at 1440p ultrawide.

I do mind when people yell that you're being elitist, that games won't need that much VRAM for at least 1-2 years, etc., when it's factually incorrect.
You can just ignore them and think whatever you like, be it wrong or right.

And I do care when Nvidia becomes more and more bold with these kinds of moves on new-gen GPUs.
That's why AMD has gone extra generous with VRAM in the 6000 series. Nvidia will follow.

Funnily enough, RT settings often eat even more VRAM, so this card is literally better off running with RT off... even though the 30 series is marketed and praised for its RT/DLSS. Laughable!
You mean just like first-gen Ray Tracing and DLSS? These features are only just getting started and, like it or not, to me the 3000 series buyers are still early adopters, not just the 2000 series buyers.
 
What I forgot to say in the above post is that EVERY card has a limitation. In the case of the 3070 it's MOSTLY the VRAM. The 2080 Ti has power and thermal limitations. Others are limited by the architecture, shader cores, RT cores and so on. Again, I agree that it has been falsely advertised as the perfect card for 1440p, but it's a damn good one.
 
Dec 16, 2020
No one forced you to buy the card at that moment, though. You could have waited like almost everyone else did and gotten a 3080, 3080 Ti or 3070 Ti in a few months, with better performance and more VRAM.


I highly doubt that it will have 12GB of VRAM, but even if it does, it's not like I agree with Nvidia. They made whatever choice they made. It's up to you to decide what you will buy. I am neither a representative nor a fanboy of either company.


The benchmarks were done on very unoptimised games, I believe we can agree on that. Almost every Ubisoft game is crap when it comes to optimisation, just like Crysis is and now Cyberpunk. I recently tried Far Cry 3 on my PC (it was a present from a friend) and it ran worse than Far Cry 5, which I own. That was at real 1440p, not ultrawide. That is my other point: I don't think ultrawide 1440p is still 1440p. It has 34.375% MORE pixels than normal 1440p. It's like saying that two 1080p monitors with a total resolution of 3840x1080 are still 1080p.


I agree, but again, that's what I have been saying from the start. You could have waited and seen how things were at your resolution with that card instead of just buying whatever was available at the time. It was your choice. Nvidia has not given the 3070 enough VRAM, but NO ONE is forced to buy it and, of course, not everyone will use it to play at 1440p ultrawide.


You can just ignore them and think whatever you like, be it wrong or right.


That's why AMD has gone extra generous with VRAM in the 6000 series. Nvidia will follow.


You mean just like first-gen Ray Tracing and DLSS? These features are only just getting started and, like it or not, to me the 3000 series buyers are still early adopters, not just the 2000 series buyers.


I kind of agree with your points. Those are indeed some unoptimised games. I guess time will tell whether Cyberpunk gets the memory leak fixed and whether the game is actually capping my VRAM or not. I am just disappointed to have also been one of those "8GB is enough!" elitists. My expectation was that for my resolution it would be enough, since 3440x1440 is 4.95 Megapixels and 4K is 8.29. That's almost double, so I was sure it would be enough.

It's a damn great card, that's for sure. But it would do much better if I ran it at 1080p. If the pricing situation were different I would definitely have gone for a 3080, but that's a very big maybe. The 3080 is MUCH more in demand than the 3070, from what I've heard from a few friends who work in the industry and handled ordering of the cards for some online e-shops in my country. Even then, the cheapest they could offer me for my 3070 was around 670 euros. Via people I "know". Can you imagine??? Also, what price would you expect a 3070 Ti to come in at? And what would actually be a fair price to overpay for a 3080, do you think? I am from the EU, btw.

In any case, even if it's a great card, it's just weird that it only has 8GB. Look at various benchmarks of older titles even, using cards that have more than 8GB of VRAM at 1440p. Some games literally gobble up the whole 8GB, give or take an extra 500MB or so. It's just so wrong, and I think for this fact alone the 3070 may be the worst choice among the 30 series cards.
 

Karadjgne

Well, much of this is based on current assumptions. If you follow what Nvidia has done in the last 2 generations, releasing Ti and/or Super versions and discontinuing the base model, then we may very well see a 3070 Ti or a 3070 Super that does have extended VRAM of 10 or 12GB. The 3080 Ti is supposed to come with 20GB of VRAM according to rumor, so who knows.
 
Dec 16, 2020
I am looking at possible options to replace the 3070: sell it for around the same price I bought it for and purchase a 3080. I bought the Gigabyte Gaming 3070 for around 690 euros. It seems the 3080 is sometimes available, but the price is around ~1k euros. Would you say it is worth paying a whopping ~300 euros extra just for the 3080? I am simply weighing my options here; I am not in a rush, but I suppose the longer I keep the 3070, the less value it will retain. Right?
 
since 3440x1440 is 4.95 Megapixels and 4K is 8.29
It's not double, it's about 2/3rds more, or to be precise, 67.44% more. But yeah, I get your point.
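If anyone wants to double-check those figures, a quick throwaway script like this (purely illustrative, nothing more) prints the pixel counts and the two percentages:

```python
# Throwaway check of the pixel counts being compared in this thread.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "UW 3440x1440":      (3440, 1440),
    "4K (3840x2160)":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name:20s} {count / 1e6:.2f} MP")

# ~34.38% more pixels going from regular 1440p to 3440x1440
print(f"3440x1440 vs 1440p: {pixels['UW 3440x1440'] / pixels['1440p (2560x1440)'] - 1:.2%} more pixels")
# ~67.44% more pixels going from 3440x1440 to 4K
print(f"4K vs 3440x1440:    {pixels['4K (3840x2160)'] / pixels['UW 3440x1440'] - 1:.2%} more pixels")
```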

It's a damn great card, that's for sure. But it would do much better if I ran it at 1080p.
I am not a big fan of going down to that lower resolution, but at least for the games that are unoptimised, you can try the "normal" 2560x1440 and see how they perform.

Also, what price would you expect a 3070 Ti to come in at? And what would actually be a fair price to overpay for a 3080, do you think? I am from the EU, btw.
Strictly personal opinions following:
I believe the 3070 Ti will be priced fairly close to the 3080, like $629-$649, or if we are lucky, exactly in between the two at $599. That's MSRP though; expect it to go for much more than that in shops. As for overpaying, I am not the guy to ask, as I tend to buy only when I don't pay too much over what something is worth. So no, €1000 for a 3080 is not a good price for me, not even by a long shot. Were it somewhere around the 800-850 mark, then yeah.