GeForce RTX 4060 Ti Retailer-Listed Specs Look Worse Than RTX 3060 Ti

I'm getting so tired of PC review outlets looking only at specific numbers and ignoring the reasons behind the changes. Memory amount, memory bus width and core count mean very little number-for-number when you're comparing two entirely different architectures. On paper the 4080 trails the 3090 Ti in every single one of those categories, and it absolutely stomps on it in performance. Why? Because the cores are a newer generation, the architecture packs far more transistors, and it's much more energy efficient. The bus doesn't need to be massive on the newer architecture because everything is better optimized. It's like comparing zip compression 25 years ago to now: the bus may be narrower, but thanks to optimization it moves just as much data, and possibly more, than it did on older generations. The GPU as a whole is also clocked faster, so roughly 400 fewer cores doesn't mean much when they run almost 30% faster than the previous generation's. In most of these models a wider bus would mostly just mean higher temperatures, because they don't actually need a massive bus when things are properly optimized. There's a clear cause-and-effect relationship here regardless of the delivered performance: optimizing the components lets the required bus width be smaller.
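To put rough numbers on the cores-versus-clocks point, here's a back-of-the-envelope sketch in Python. The spec figures are the commonly listed ones (the 4060 Ti line uses the leaked retailer numbers from the article) and the formulas are only first-order estimates, so treat the output as illustrative, not measured performance:

```python
# Back-of-the-envelope throughput and bandwidth estimates (illustrative only).
# FP32 TFLOPS  ~= 2 * shader_count * boost_clock_GHz / 1000   (FMA counts as 2 ops)
# Bandwidth GB/s = (bus_width_bits / 8) * memory_data_rate_Gbps
cards = {
    #               shaders, boost GHz, bus bits, data rate (Gbps)
    "RTX 3090 Ti": (10752, 1.86, 384, 21.0),
    "RTX 4080":    (9728,  2.51, 256, 22.4),
    "RTX 3060 Ti": (4864,  1.67, 256, 14.0),
    "RTX 4060 Ti": (4352,  2.54, 128, 18.0),  # leaked/retailer-listed figures
}

for name, (shaders, ghz, bus_bits, gbps) in cards.items():
    tflops = 2 * shaders * ghz / 1000
    bandwidth = bus_bits / 8 * gbps            # raw DRAM GB/s, ignores L2 cache
    print(f"{name:12s} ~{tflops:5.1f} FP32 TFLOPS, ~{bandwidth:6.1f} GB/s raw bandwidth")
```

The raw bandwidth line is exactly where the 128-bit cards look bad on paper, which is why the bigger L2 cache and compression have to make up the difference in practice.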
As someone with a 4080, I can see how much VRAM I'm using in games. The card has 16GB, compared to the 24GB of the 3090 Ti. In most games, even on max settings with ray tracing, I'm using maybe 9GB of VRAM at the very most. It has never gone above 54% VRAM utilization in any game, and GPU utilization sits consistently around 72% with spikes up to 80%, and that's in one single game; it drops about 15-20% if I turn off ray tracing. Yet one of my older graphics cards would use about 7GB in Call of Duty: MW on low-to-mid settings without ray tracing. The game using 9GB on my 4080 is Gotham Knights, which has far higher requirements than a four-year-old Call of Duty. It's all about optimization, and that constantly seems to get overlooked. The 4060 looks worse on paper because it doesn't need to look better on paper: its newer architecture and better-optimized components simply work better. It doesn't need more VRAM or a wider bus; it's optimized so it doesn't need those giant numbers.
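For anyone who wants to log this instead of eyeballing an overlay, here's a minimal sketch using the NVML Python bindings; it assumes an NVIDIA driver, GPU index 0, and the nvidia-ml-py package installed. Keep in mind that both overlays and NVML report *allocated* VRAM, which can be more than a game actually needs:

```python
# Minimal VRAM / GPU-load logger via NVML (assumes: NVIDIA driver present,
# GPU index 0, and `pip install nvidia-ml-py` for the pynvml module).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(10):                                      # ~10 one-second samples
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes: .total/.used/.free
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percent: .gpu/.memory
        print(f"VRAM {mem.used / 2**30:5.2f}/{mem.total / 2**30:.0f} GiB "
              f"({100 * mem.used / mem.total:4.1f}%)  GPU load {util.gpu:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```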
 
8GB of VRAM is still plenty for today's games. The vast majority of games don't need that much. Are there even any games with 12GB system requirements? Most are still 4-6GB, with 8GB recommended. No, you won't play Hogwarts Legacy at 8K, 480 FPS, max settings with RT on a 4060 Ti. But a lot of these high-VRAM games are that way because of poor optimization. Hogwarts Legacy runs on a PS4 with 8GB of UNIFIED memory. Yes, it's not the prettiest, but they made it work. The responsibility is on game developers to design their games properly. We shouldn't all have to be running 4090s because of poor game development.

Any game where the dev put in the time and effort on optimization should have no problem with 8GB now and into the near future. If 16GB were the standard like a lot of people are claiming, and NVidia actually went with that, the next XX50 card would be $899 lol. It would be insane. A 60-tier card does not need 12GB. What they did on the 3060 was stupid. They even made a 2060 12GB. Stupid, and totally unnecessary. 12GB shouldn't start until XX70, and that's where we are at. There is nothing wrong with 8GB at 60 Ti, 60, and 50 Ti, and 6GB at 50. VRAM isn't cheap and the cost is passed on to the consumer. You don't want more expensive GPUs, do you? If games are properly optimized, even the latest games would run fine on 8GB of VRAM at 1080p max settings with RT at 120 FPS, with the help of DLSS. Your 60 Ti would work just fine and we could all have semi-affordable GPUs. It's optimization that's the problem.
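To give a feel for why optimization swings VRAM numbers so much, here's a rough texture-budget sketch. The texture count, resolution and formats below are made-up illustrative assumptions, not figures from Hogwarts Legacy or any other game:

```python
# Rough texture VRAM budget: uncompressed RGBA8 vs. block-compressed (BC7-class).
# All counts/sizes here are made-up assumptions purely for illustration.

def texture_mib(width, height, bytes_per_texel, mip_overhead=4 / 3):
    """Approximate size of one texture with a full mip chain, in MiB."""
    return width * height * bytes_per_texel * mip_overhead / 2**20

resident = 500                               # hypothetical resident 2K textures
rgba8 = texture_mib(2048, 2048, 4.0)         # uncompressed: 4 bytes per texel
bc7   = texture_mib(2048, 2048, 1.0)         # BC7: 1 byte per texel

print(f"one 2K texture: {rgba8:5.1f} MiB raw vs {bc7:4.1f} MiB block-compressed")
print(f"{resident} textures : {resident * rgba8 / 1024:5.2f} GiB vs {resident * bc7 / 1024:5.2f} GiB")
```

Whether the engine also streams those textures in and out aggressively is the other half of the story, and that's exactly the optimization work being argued about here.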
FWIW the minimum VRAM requirement on Jedi: Survivor is 8GB, but the minimum GPUs are a 580/1070, which means the 1660s and 2060s are both superior cards yet technically don't meet the minimums.

I certainly agree with you that 8GB should be plenty for everything, especially at 1080p. Not enough people are holding developers accountable; instead they're pointing at the cards themselves. Realistically speaking, adding VRAM isn't particularly expensive now that 2GB chips are available, but that only applies to new cards, and the majority of the market doesn't upgrade often.
 

artk2219

Distinguished
It works the other way around as well... and why not both? Upgrade the GPU for this piece of... something... or get a console, buy an HDMI switch and play whatever you want.
I've seen a few Xbox Series S consoles selling for sub-$150 second hand recently. That's a really solid proposition at that price point on its own, and it definitely throws it into the "why not have both" category, since it's less than half the MSRP of even the 3060 Ti.
 
FWIW the minimum VRAM requirement on Jedi: Survivor is 8GB, but the minimum GPUs are a 580/1070, which means the 1660s and 2060s are both superior cards yet technically don't meet the minimums.

I certainly agree with you that 8GB should be plenty for everything, especially at 1080p. Not enough people are holding developers accountable; instead they're pointing at the cards themselves. Realistically speaking, adding VRAM isn't particularly expensive now that 2GB chips are available, but that only applies to new cards, and the majority of the market doesn't upgrade often.
This isn't entirely correct. The minimum VRAM amount is in reference to the minimum-requirement GPU, in this case a 1080. The 1080 is an old architecture by today's standards, and 8GB on it is, contrary to very popular belief, not equivalent to 8GB on a newer card, especially considering it's GDDR5. This is a complete misconception. Newer cards are better at handling their VRAM and optimizing their processes to require less VRAM. Plus, the recommended spec, just like the minimum, is 8GB; the difference is that the recommended card is a 20-series card, which uses GDDR6. On any card newer than a 10-series, chances are it isn't going to hit 8GB of VRAM at minimum-requirement settings; newer cards are too optimized for that. It will probably only hit that at settings where older cards can no longer run the game smoothly anyway, or, in the case of the new 40-series cards, while using ray tracing.

The recommended spec for Gotham Knights, which targets high settings at 1080p/60FPS, is also 8GB of VRAM. On my 4080, playing it in 4K with the render scale set to 200%, ray tracing on and everything maxed, it uses 9GB of VRAM at most, usually a bit less. VRAM usage between GPU generations is very different, especially considering GDDR5 would be the kind used by the minimum-requirement card, while the 20-series and half of the 30-series use GDDR6 and everything from the upper 30-series to the newest cards uses GDDR6X. Not only is the newer VRAM more optimized, the card itself is too, allowing it to use less VRAM for the same job the 1080 might need 8GB of VRAM for.
The 1080 also is just better than a 2060, not just in VRAM but in overall specs and performance. The only downsides are that the 1080 runs hotter and uses more VRAM in tasks that don't push either card to its max, but the 1080 always had more performance potential than the 2060. The 2070 was about even with the 1080 in performance; side by side, the 2070's advantages were using less VRAM for the same workloads and drawing less power, the 1080 being older architecture. The gap between the 20- and 30-series in actual VRAM usage is there as well, and the gap between the 30- and 40-series is probably the biggest to date. 8GB on almost any newer card that only has 8GB is going to be just fine and will meet recommended-spec requirements and probably more. A 40-series card, as with Gotham Knights, probably won't even go over 9GB of VRAM usage with everything maxed and ray tracing on. Optimization just makes VRAM requirements less and less of something to actually worry about for those with newer cards. The 4060 will run it just fine, likely using around 6GB at recommended settings.
 
This isn't entirely correct. The minimum VRAM amount is in reference to the minimum-requirement GPU, in this case a 1080.
It's entirely correct; you clearly haven't actually looked at the requirements and instead made an assumption. The game's requirements list the GPU separately from the VRAM requirement and specify 8GB of VRAM (it's the same for minimum and recommended).

Then you write a couple of paragraphs about the 1080 when the minimum is a 580/1070, so I'm unsure why you're talking about something that doesn't apply. The 1070 is about the same as the 1660 Ti/Super, with the 2060 being a fair bit faster, and the 580 is slower than any of them. There are several Nvidia cards closer to the 580 in performance, but they all have less than 8GB of VRAM.
 

ottonis

Reputable
If the numbers presented in table 1 are at all accurate, the 4060 Ti is going to have a higher TDP than its predecessor, the 3060 Ti: 220W vs. 200W.

Considering that the new Ada architecture and the much denser process node (TSMC 4nm vs. Samsung 8nm) provide much higher efficiency, I wonder where this efficiency has suddenly gone in this card.

For me (and I am not a typical gamer), the main reason to go for a 4060 Ti instead of a 3060 Ti would be the promise of lower power consumption at similar or slightly higher performance.
I suspect that undervolting/underclocking this card might disproportionately decrease performance, since nVidia designed in a bottleneck with the 128-bit memory bus that seems to be compensated for by high clock speeds and a larger L2 cache.

Well, let's wait for independent reviews before jumping to conclusions, but frankly, the first impressions from these leaks are rather disappointing.
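On the "compensated by clock speeds and L2 cache" point, here's a crude way to see what a big L2 can buy back. The hit rates below are made-up assumptions purely for illustration; real hit rates vary by game and resolution:

```python
# Crude "effective bandwidth" estimate for a narrow bus plus a big L2 cache.
# If a fraction `hit` of memory requests is served from L2, DRAM only sees
# (1 - hit) of the traffic, so the sustainable request rate is roughly
#     effective ~= raw_dram_bandwidth / (1 - hit)
# The hit rates are made-up assumptions; real values depend on the workload.

raw_3060ti = 256 / 8 * 14.0    # 256-bit @ 14 Gbps -> 448 GB/s
raw_4060ti = 128 / 8 * 18.0    # 128-bit @ 18 Gbps -> 288 GB/s (leaked figures)

for hit in (0.30, 0.45, 0.60):
    effective = raw_4060ti / (1 - hit)
    print(f"assumed L2 hit rate {hit:.0%}: ~{effective:5.0f} GB/s effective "
          f"(vs {raw_3060ti:.0f} GB/s raw on the 256-bit 3060 Ti)")
```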
 
This isn't entirely correct. The minimum VRAM amount is in reference to the minimum-requirement GPU, in this case a 1080. The 1080 is an old architecture by today's standards, and 8GB on it is, contrary to very popular belief, not equivalent to 8GB on a newer card, especially considering it's GDDR5. This is a complete misconception. Newer cards are better at handling their VRAM and optimizing their processes to require less VRAM. Plus, the recommended spec, just like the minimum, is 8GB; the difference is that the recommended card is a 20-series card, which uses GDDR6. On any card newer than a 10-series, chances are it isn't going to hit 8GB of VRAM at minimum-requirement settings; newer cards are too optimized for that. It will probably only hit that at settings where older cards can no longer run the game smoothly anyway, or, in the case of the new 40-series cards, while using ray tracing.

The recommended spec for Gotham Knights, which targets high settings at 1080p/60FPS, is also 8GB of VRAM. On my 4080, playing it in 4K with the render scale set to 200%, ray tracing on and everything maxed, it uses 9GB of VRAM at most, usually a bit less. VRAM usage between GPU generations is very different, especially considering GDDR5 would be the kind used by the minimum-requirement card, while the 20-series and half of the 30-series use GDDR6 and everything from the upper 30-series to the newest cards uses GDDR6X. Not only is the newer VRAM more optimized, the card itself is too, allowing it to use less VRAM for the same job the 1080 might need 8GB of VRAM for.
The 1080 also is just better than a 2060, not just in VRAM but in overall specs and performance. The only downsides are that the 1080 runs hotter and uses more VRAM in tasks that don't push either card to its max, but the 1080 always had more performance potential than the 2060. The 2070 was about even with the 1080 in performance; side by side, the 2070's advantages were using less VRAM for the same workloads and drawing less power, the 1080 being older architecture. The gap between the 20- and 30-series in actual VRAM usage is there as well, and the gap between the 30- and 40-series is probably the biggest to date. 8GB on almost any newer card that only has 8GB is going to be just fine and will meet recommended-spec requirements and probably more. A 40-series card, as with Gotham Knights, probably won't even go over 9GB of VRAM usage with everything maxed and ray tracing on. Optimization just makes VRAM requirements less and less of something to actually worry about for those with newer cards. The 4060 will run it just fine, likely using around 6GB at recommended settings.
Just FYI, delta compression has nothing to do with GDDR5 vs GDDR6. If we could magically back-port the latest Nvidia delta compression algorithm (and the associated hardware changes) onto the GDDR5 card, both cards would use the same amount of memory.
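As a toy illustration of the general idea (this is not Nvidia's actual scheme, which is a lossless hardware algorithm with fixed block ratios; it's just a sketch of why delta encoding helps):

```python
# Toy delta encoding: store a base value plus differences instead of full values.
# Smooth data (gradients) produces tiny deltas that fit in a few bits; noisy data
# produces large deltas and compresses poorly. Content, not GDDR generation,
# decides how well a block compresses.

def delta_encode(values):
    base = values[0]
    deltas = [b - a for a, b in zip(values, values[1:])]
    return base, deltas

gradient_row = [100, 101, 101, 102, 103, 103, 104, 105]   # e.g. a sky gradient
base, deltas = delta_encode(gradient_row)
print("base:", base, "deltas:", deltas)   # deltas are all 0 or 1 -> highly compressible
```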
 
If the numbers presented in table 1 are at all accurate, the 4060 Ti is going to have a higher TDP than its predecessor, the 3060 Ti: 220W vs. 200W.

Considering that the new Ada architecture and the much denser process node (TSMC 4nm vs. Samsung 8nm) provide much higher efficiency, I wonder where this efficiency has suddenly gone in this card.

For me (and I am not a typical gamer), the main reason to go for a 4060 Ti instead of a 3060 Ti would be the promise of lower power consumption at similar or slightly higher performance.
I suspect that undervolting/underclocking this card might disproportionately decrease performance, since nVidia designed in a bottleneck with the 128-bit memory bus that seems to be compensated for by high clock speeds and a larger L2 cache.

Well, let's wait for independent reviews before jumping to conclusions, but frankly, the first impressions from these leaks are rather disappointing.
Yeah, 220 watts seems like too much for a 60-class card.
 

Eximo

Titan
Ambassador
We need Intel's next-gen GPUs sooner rather than later. At least their Arc GPU pricing was sensible.

2024 at the earliest, it seems, with a Q2 date. So possibly Computex 2024, which is a typical launch window for new GPUs.

I have high hopes as well. It seems they are doubling the core count, so that would maybe put the flagship in RTX 4070 territory, assuming everything else remains the same.

If Alchemist has some internal hardware flaw, as is suspected, and they fix it, who knows.
 

sherhi

Distinguished
I've seen a few Xbox Series S consoles selling for sub-$150 second hand recently. That's a really solid proposition at that price point on its own, and it definitely throws it into the "why not have both" category, since it's less than half the MSRP of even the 3060 Ti.
220€ was the lowest deal I saw in my country for a brand-new Series S; currently it's 290€ plus the new Jedi game, which is around 70€ if bought separately. Even the 500€ Series X is a great choice. With good timing on the market, one can skip several generations of GPUs by getting a console instead of upgrading an old GPU.
This is spot on. The sad part is when your old GPU dies and you have a CPU with no integrated graphics; then you have to shell out some 500 USD or more for something that's barely an upgrade over your 3- or 6-year-old GPU. Because yes, 10-series Nvidia cards still do the job, and quite well, considering the asking prices of the past 3 years.
I see used 1060s for 50-60€ in my area, some 90% uplift from my GTX 760. Games will always be bound by consoles; their market is too big to be ignored. Studios have started to utilize 8 cores in games as well, not just the GPU. Currently, if someone has 6 cores or fewer (the majority does) combined with a GPU with up to 6-8GB of VRAM, then spending 500 on a new GPU makes no sense when you can get everything you need for those games as a full package for the same price.
 
8GB of VRAM is still plenty for today's games. The vast majority of games don't need that much. Are there even any games with 12GB system requirements? Most are still 4-6GB, with 8GB recommended. No, you won't play Hogwarts Legacy at 8K, 480 FPS, max settings with RT on a 4060 Ti. But a lot of these high-VRAM games are that way because of poor optimization.
Yes, that's very true but that doesn't change the fact that it's a real trend. Remember that this has been going on for decades. I still remember my old Albatron GeForce FX-5200 with 128MB of DDR, my old XFX GeForce 6200 with 256MB of DDR, my old Palit GeForce 8500 GT with 1GB of DDR3, my old XFX Radeon HD 4870s with 1GB of GDDR5, my old Gigabyte HD 7970s with 3GB of GDDR5 each, my two Sapphire R9 Furies with 4GB of HBM each, my XFX RX 5700 XT with 8GB of GDDR6 and my RX 6800 XT with 16GB of GDDR6.

A lack of optimisation isn't the root problem; time is. Sure, bad optimisation makes it worse, but so does people not paying attention to what they're buying. You could have the best optimisation possible and this would still keep happening. My first PC was the father of all PCs, the IBM PC Model 5150, and it had 640kB of base RAM. To use more RAM with a 32-bit CPU like a 386, we had to use memory managers like EMM386 because MS-DOS couldn't natively address more than 640kB.

Look at us now with our 100GB+ games and 16-32GB of RAM. All that was required was the passage of time.
Hogwarts Legacy runs on a PS4 with 8GB of UNIFIED memory. Yes, it's not the prettiest, but they made it work.
Yeah, but consoles and PCs aren't the same. PCs are optimised for multi-purpose use while consoles are designed to do one thing and one thing only (unless you count being a Blu-ray player), and that's gaming. Besides, if you use the same settings that are on the PS4 (the ones that aren't the prettiest), you can play it on PC with an 8GB card without issue. The issue people are flipping out about is how much they paid for their 8GB cards, not the fact that 8GB cards exist.
The responsibility is on game developers to design their games properly.
Sure, but when the vast majority of your customer base are console gamers, that's all you really have to worry about. Unless a game is some marvellous spectacle that can't be played on console (like a new Crysis-type game), PC will always be an afterthought.
We shouldn't all have to be running 4090s because of poor game development.
Who said that we have to? All you need is a mid-range card with 10-12GB of VRAM on it and you'll have no problem. Who said anything about an RTX 4090? Nobody with even a 10GB card like an RX 6700 or RTX 3080 is complaining and nobody with a 12GB card like an RTX 3060 or RX 6700 XT is complaining either. We're also not hearing complaints from people who own an RX 6600/6600 XT/6650 XT because their 8GB cards didn't cost them much (comparatively speaking). For what they paid, they're perfectly happy to turn settings down to accommodate the 8GB.

The people whining are the nVidia fanboys who paid way too much for an 8GB card, specifically the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti. Those people are having some serious buyer's remorse because, they feel (and quite rightly) that, for the prices they paid, they shouldn't have to lower anything.

I completely agree with them and nVidia should be panned and chided for releasing 8GB cards in that range. Having said that, the people who bought those cards mostly did so out of their own ignorance and hubris. They did no product research, they just assumed that they knew everything and bought whatever GeForce card they could afford without a second thought (like the good little sheep that they are). If they had done their research, they'd have asked some pretty serious questions before making such stupid purchases.

Nobody held a gun to their head and told them that they had to pay those exorbitant prices for cards with only 8GB. They chose to do this, even though many tech reviewers pointed out that 8GB would sooner or later be a limiting factor on these cards. Of course, they couldn't be sure just how much sooner or later but I knew that it would be sooner because of experiences that I had. My R9 Furies became limited by VRAM because they only have 4GB each. Sure, 4GB of HBM can do more than 4GB of GDDR5 but it's still only 4GB. The 4GB wasn't enough for a GPU as potent as the ATi Fiji but that's all it had so it became prematurely crippled.

I got my RX 5700 XT with 8GB and suddenly everything was perfect, for about six months. Then I got the harbinger of things to come known as Far Cry 6. Its HD texture pack required 11GB of VRAM to use and I remember being a bit annoyed by the fact that my relatively-new video card couldn't use it. Don't get me wrong, the game ran great and I had a blast with it but I stored that bit of information about 8GB not being enough in the back of my mind. I knew that I was going to need a card with more VRAM and soon.

When I saw a chance to get a reference RX 6800 XT for $500 less than the going rate at the time, I pulled the trigger after deliberating with myself for about an hour. I knew it was as good as I was going to get at the time and that I'd be able to mine back a good chunk of the cost anyway when paired with my RX 5700 XT.

If it had been GeForce cards that weren't short on VRAM and Radeon cards that were, I would've broken my Radeon streak at nine cards and bought one. However, the RTX 3080 only had 10GB and cost WAY more than the RX 6800 XT. I couldn't understand why, but I wasn't going to look a gift horse in the mouth. Sure, I wasn't going to get things that are (probably) pretty cool like DLSS and NVENC, but I wasn't going to get any headaches either (and I don't care about RT, at least, not yet).

I saw what was coming though. I did my due diligence and read everything I could about what was to come. I tried warning people about it but they dismissed me as a fanboy and did what they did anyway. I feel bad for them but you can only lead a horse to water, you can't make him drink (or swim for that matter). One thing was certain though, I made damn sure that I got a card that could handle the increased VRAM needs. This course of action is the reason why I'm not one of the people whining about game optimisation. I knew it was coming and I was prepared. I wish more people had done the same but you can't fix stupid.
Any game where the dev put in the time and effort on optimization should have no problem with 8GB now and into the near future. If 16GB were the standard like a lot of people are claiming, and NVidia actually went with that, the next XX50 card would be $899 lol.
Stop that! Jensen might be reading this and we don't need you giving him any big ideas! ;)
It would be insane. A 60-tier card does not need 12GB. What they did on the 3060 was stupid. They even made a 2060 12GB. Stupid, and totally unnecessary. 12GB shouldn't start until XX70, and that's where we are at.
I too thought this to be completely nonsensical until someone pointed out to me that the RTX 3060 was made for miners, not gamers. Then it suddenly made all the sense in the world.
There is nothing wrong with 8GB at 60 Ti, 60, and 50 Ti, and 6GB at 50. VRAM isn't cheap and the cost is passed on to the consumer. You don't want more expensive GPUs, do you?
Do you work for nVidia or something? I'm sorry to say this but that's one of the most ridiculous things that I've ever read. That sounds just dumb enough to be nVidia's company line. The amount of VRAM on a card has little to no influence on how much the card actually costs. There is no excuse for what nVidia has done so why are you trying to make one?

This whole issue is an nVidia thing, not a "VRAM thing" or a "developer thing" and here's the proof:

Radeon RX 6600 8GB - $200
Arc A750 8GB - $250
Radeon RX 6600 XT 8GB - $254
Radeon RX 6650 XT 8GB - $260
Radeon RX 6700 10GB - $260 <- *
GeForce RTX 3050 8GB - $280
Arc A770 8GB - $330
GeForce RTX 3060 12GB - $335 <- *
Radeon RX 6700 XT 12GB - $340 <- *
Arc A770 16GB - $350 <- *Didn't you say VRAM isn't cheap? :ROFLMAO:
GeForce RTX 3060 8GB - $350
Radeon RX 6750 XT 12GB - $380
GeForce RTX 3060 Ti 8GB - $385
GeForce RTX 3070 8GB - $461
Radeon RX 6800 16GB - $480
Radeon RX 6800 XT 16GB - $510
GeForce RTX 3070 Ti 8GB - $560
GeForce RTX 4070 12GB - $600
Radeon RX 6950 XT 16GB - $620
GeForce RTX 4070 Ti 12GB - $800
Radeon RX 7900 XT 20GB - $800
Radeon RX 7900 XTX 24GB - $950
GeForce RTX 4080 16GB - $1,110
GeForce RTX 4090 24GB - $1,600

Maybe you can explain just what you're talking about because as we watch these prices rise, we can see that there's absolutely no correlation between price and the amount of VRAM on the card.
  • The 10GB RX 6700 costs less than the 8GB RTX 3050
  • The 12GB RTX 3060 costs less than the 8GB RTX 3060
  • The 12GB RX 6700 XT costs less than the 8GB RTX 3060
  • The 16GB A770 costs no more than the 8GB RTX 3060
  • The 16GB A770 costs only $20 more than the 8GB A770
  • The 3060 Ti and 3070 cost more than 5 cards with >8GB
  • The 16GB RX 6800 costs less than the 8GB RTX 3070 Ti
  • The 16GB RX 6800 XT costs less than the 8GB RTX 3070 Ti
  • The 16GB RX 6950 XT costs less than the 12GB RTX 4070 Ti
  • The 20GB RX 7900 XT costs less than the 16GB RTX 4080
  • The 24GB RX 7900 XTX costs less than the 16GB RTX 4080
What we can see is that Intel only charges $20 more for a 16GB A770 than for an 8GB A770, so it looks like 8GB of VRAM costs about $20. I don't know about you, but that certainly fits my definition of "cheap". We also see nVidia cards costing more than Arc and Radeon cards that completely outperform them. We also see that nVidia charges roughly the same amount for an RTX 3060 regardless of whether it has 8GB or 12GB. In fact, the cheapest 8GB model that I found was more expensive than the cheapest 12GB model!

I don't know how you would get the idea that nVidia would need to raise prices if they put a few more gigabytes of VRAM on their cards, because this list from top to bottom says the exact opposite. Note that I didn't cherry-pick these; they are all the least-expensive models that I could find on PCPartPicker. Anyone can vet my list by just clicking on the link that I left ^^^ there.
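If anyone wants to put a number on the "VRAM isn't what drives the price" point, the two pairs in the list above that differ mainly in memory size make it easy. Retail prices obviously bundle in board and cooler differences too, so this is only a rough read:

```python
# Implied retail cost of extra VRAM, using the two pairs from the list above
# that differ mainly in memory size (prices in USD as listed).
pairs = {
    # name: ((vram_gb, price) smaller variant, (vram_gb, price) larger variant)
    "Arc A770": ((8, 330), (16, 350)),
    "RTX 3060": ((8, 350), (12, 335)),
}

for name, ((gb_s, price_s), (gb_l, price_l)) in pairs.items():
    extra_gb = gb_l - gb_s
    extra_usd = price_l - price_s
    print(f"{name}: +{extra_gb} GB of VRAM costs {extra_usd:+d} USD at retail "
          f"({extra_usd / extra_gb:+.2f} USD per extra GB)")
```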
If games are properly optimized, even the latest games would run fine on 8GB of VRAM at 1080p max settings with RT at 120 FPS, with the help of DLSS. Your 60 Ti would work just fine and we could all have semi-affordable GPUs. It's optimization that's the problem.
Yeah, but then the games themselves would cost far more than they do, because the devs would have to dedicate a crap-tonne of extra man-hours to said optimisation for a segment of the gaming market that is quite small compared to the console crowd.

Added labour would cause a significant increase in the price of games, far more than if nVidia just put enough VRAM on their mid-range cards so that badly optimised games still worked. You know, like AMD and Intel did. I don't understand how you can blame the game developers when your idea would end up costing us even more because labour hours cost A CRAP-TONNE MORE than a few extra gigabytes of GDDR6 or GDDR6X.
No, you don't get it. If you have something in your PC case and can already play some types of games with it (like grand strategies that don't need much GPU power, most online shooters, esports titles, MMOs, ...) and want to play some new AAA games, then getting a 500€ GPU or a whole new console for that is a no-brainer; sadly most PC users are not able to think outside their box. Upgrading to a 500€ new GPU with 8 gigs of VRAM just does not make much sense; no matter how PCMR fanboys try to spin it, consoles and GPUs do affect each other.
I couldn't have said it better myself.
I would be amazed if it was cheaper than $499 given Nvidia's current pricing strategy. Looks like the core clock boost will allow it to outperform the 3060 Ti despite the drop in CUDA cores etc. and the massive bandwidth drop. I suspect it will be a narrow win of <20% at 1440p though.
Maybe, but that's hardly worth talking about and definitely not worth the increase in price.
Not really; the xx50 is $349 at least. You might get one below $300 on offer or after it's been out >9 months, but I would be amazed if the 4050 was below $350 (Nvidia might even go $449 for the 4060 and $399 for the 4050 and blame inflation).
Yeah, the same inflation that affects video cards but not motherboards or CPUs even though they're all made from the same parts. Funny how that works, eh? ;)
I agree, but the fact is nVidia doesn't need gamers; they have loads of AI business at massively larger margins, and they consider they're doing gamers a favour selling them GPUs at less than $2k. They will continue like this and maybe get even worse because, as much as we hate it, their job is to maximise profit, and the best way to do that is to sell an AI card at $50k, not a GPU at $500.
Yeah, but that's true about all three corporations.
It could be said, with how cut down the 4060 is, that it's an xx50-class card.
Yeah, this is definitely a generation to skip.
This is spot on. The sad part is when your old GPU dies and you have a CPU with no integrated graphics; then you have to shell out some 500 USD or more for something that's barely an upgrade over your 3- or 6-year-old GPU. Because yes, 10-series Nvidia cards still do the job, and quite well, considering the asking prices of the past 3 years.
Well, this is only the case if you're unwilling to get a Radeon or Arc card. People who are stubborn and inflexible are their own worst enemies. When I hear someone crying about nVidia's pricing, I just say "So get a Radeon or Arc card! It's not that hard a concept to grasp!". :giggle:
 

Ogotai

Reputable
It works the other way around as well... and why not both? Upgrade the GPU for this piece of... something... or get a console, buy an HDMI switch and play whatever you want.
Why would you need an HDMI switch? Play PC games on the monitor and console games on the TV. Also, most monitors and TVs have more than one HDMI port, plus a DisplayPort.
and want to play some new AAA games, then getting a 500€ GPU or a whole new console for that is a no-brainer; sadly most PC users are not able to think outside their box. Upgrading to a 500€ new GPU with 8 gigs of VRAM just does not make much sense; no matter how PCMR fanboys try to spin it, consoles and GPUs do affect each other.
Maybe for you, but I sure can't justify a $600+ console when there are only 1-3 games on it that I would want to play, when I can spend that $600 plus the cost of the games to upgrade my computer (if needed), which right now only needs a video card. Sadly, most console users are not able to think outside their box either; that works both ways. For me, spending money on a console is a complete waste; it would just sit and collect dust. That money would be better spent on a new GPU. Even if the GPU cost more, it would get more usage than the console ever would, for the simple fact that there are no games on consoles that I want to play. So I will make you a deal: you give me the money for a console, tell me which one, and I will go buy it, then send you a pic each month so you can see the dust collecting on it, and I will use my own money to upgrade my video card and use it daily.

A PS5 plus Star Wars Jedi: Survivor alone is $764 + tax = $856, sold separately at Best Buy here; the same prices at GameStop. There are no bundles for this at Best Buy or GameStop. Too bad it's a game I don't want to play.

For $764, I could get an Asus KO GeForce RTX 3060 Ti V2 OC Edition (yes, it's an 8-gig card, which should be fine for the games I play) and still have $20 in my pocket before tax. Or I could get any RTX 3060 with 12 gigs.

But that is why I am still running a 1060; video card prices are still too high here. The games I play still run just fine on this card, even at 1440. If and when prices drop further, or I catch a sale, I'll upgrade. I can wait; I've waited 2 years so far because of COVID, so what's a little longer?
 
It's entirely correct; you clearly haven't actually looked at the requirements and instead made an assumption. The game's requirements list the GPU separately from the VRAM requirement and specify 8GB of VRAM (it's the same for minimum and recommended).

Then you write a couple of paragraphs about the 1080 when the minimum is a 580/1070, so I'm unsure why you're talking about something that doesn't apply. The 1070 is about the same as the 1660 Ti/Super, with the 2060 being a fair bit faster, and the 580 is slower than any of them. There are several Nvidia cards closer to the 580 in performance, but they all have less than 8GB of VRAM.
1070 then; I've read about 4 different sources saying 1080, but I see EA says 1070, so sure. But no: 8GB of VRAM on a 10-series card is entirely different from 8GB on a 20-, 30- or 40-series card. Jedi: Survivor is only going to use the full 8GB of VRAM on a 10-series card at minimum settings, not on any more powerful card. When they reference VRAM in spec recommendations, it always has been and always will be in reference to the generation of the listed card. In fact, if you go and look at the recommendations for numerous games, the VRAM figure is always equal to the VRAM that the suggested card possesses, except in cases where the card comes in different VRAM versions and they might list the higher of the two. You never see them say "the 1070 is the minimum" and then put 16GB minimum VRAM; it doesn't happen. It is always the VRAM amount of the specified card. They split it out so people with less computer knowledge can see the VRAM recommendation. The 2060 is the only card beyond the 10-series that has just 6GB; every other card has at least 8GB. But it is a fact that newer cards have far better compression and won't use the 8GB the way the 1070 will, and in the past few days Nvidia unveiled new compression tech that is slated to drastically reduce VRAM usage, which shows that yes, newer products have better compression and require less VRAM.

The whole point is that the 8GB VRAM figure only applies to the minimum spec because the minimum card is a 10-series. Every series since has had better and better compression, allowing VRAM to be utilized more efficiently. The 4060 won't have any issues: it uses the AD106 die and GDDR6X VRAM, and both are better at minimizing VRAM usage than a 1070. It's literally the same argument as if it were a 1080. Newer-generation VRAM is NOT equivalent to older VRAM; it has far better compression. The 4060 is going to be completely fine with 8GB of VRAM.
 
1070 then; I've read about 4 different sources saying 1080, but I see EA says 1070, so sure. But no: 8GB of VRAM on a 10-series card is entirely different from 8GB on a 20-, 30- or 40-series card. Jedi: Survivor is only going to use the full 8GB of VRAM on a 10-series card at minimum settings, not on any more powerful card. When they reference VRAM in spec recommendations, it always has been and always will be in reference to the generation of the listed card. In fact, if you go and look at the recommendations for numerous games, the VRAM figure is always equal to the VRAM that the suggested card possesses, except in cases where the card comes in different VRAM versions and they might list the higher of the two. You never see them say "the 1070 is the minimum" and then put 16GB minimum VRAM; it doesn't happen. It is always the VRAM amount of the specified card. They split it out so people with less computer knowledge can see the VRAM recommendation. The 2060 is the only card beyond the 10-series that has just 6GB; every other card has at least 8GB. But it is a fact that newer cards have far better compression and won't use the 8GB the way the 1070 will, and in the past few days Nvidia unveiled new compression tech that is slated to drastically reduce VRAM usage, which shows that yes, newer products have better compression and require less VRAM.

The whole point is that the 8GB VRAM figure only applies to the minimum spec because the minimum card is a 10-series. Every series since has had better and better compression, allowing VRAM to be utilized more efficiently. The 4060 won't have any issues: it uses the AD106 die and GDDR6X VRAM, and both are better at minimizing VRAM usage than a 1070. It's literally the same argument as if it were a 1080. Newer-generation VRAM is NOT equivalent to older VRAM; it has far better compression. The 4060 is going to be completely fine with 8GB of VRAM.
First, compression hasn't come that far: single-digit, maybe low double-digit percentage gains at most from the 10-series to now, so 8GB is still, near as makes no difference, 8GB. The biggest jump in compression came with the 10-series, when Nvidia switched to delta compression; needless to say, there aren't a lot of improvements left that are worth the high R&D cost necessary to extract them. The more cost-effective route would be to adopt a new compression algorithm, but for now we are still stuck with delta compression.

Second, VRAM memory type has nothing to do with compression or "minimizing VRAM usage." All RAM types are dumb, i.e. it takes a controller to make them work (RAM is like a hotel, an inanimate object: it requires a manager conducting business for the hotel to be of any use).
 
1070 then; I've read about 4 different sources saying 1080, but I see EA says 1070, so sure. But no: 8GB of VRAM on a 10-series card is entirely different from 8GB on a 20-, 30- or 40-series card. Jedi: Survivor is only going to use the full 8GB of VRAM on a 10-series card at minimum settings, not on any more powerful card. When they reference VRAM in spec recommendations, it always has been and always will be in reference to the generation of the listed card. In fact, if you go and look at the recommendations for numerous games, the VRAM figure is always equal to the VRAM that the suggested card possesses, except in cases where the card comes in different VRAM versions and they might list the higher of the two. You never see them say "the 1070 is the minimum" and then put 16GB minimum VRAM; it doesn't happen. It is always the VRAM amount of the specified card. They split it out so people with less computer knowledge can see the VRAM recommendation. The 2060 is the only card beyond the 10-series that has just 6GB; every other card has at least 8GB. But it is a fact that newer cards have far better compression and won't use the 8GB the way the 1070 will, and in the past few days Nvidia unveiled new compression tech that is slated to drastically reduce VRAM usage, which shows that yes, newer products have better compression and require less VRAM.

The whole point is that the 8GB VRAM figure only applies to the minimum spec because the minimum card is a 10-series. Every series since has had better and better compression, allowing VRAM to be utilized more efficiently. The 4060 won't have any issues: it uses the AD106 die and GDDR6X VRAM, and both are better at minimizing VRAM usage than a 1070. It's literally the same argument as if it were a 1080. Newer-generation VRAM is NOT equivalent to older VRAM; it has far better compression. The 4060 is going to be completely fine with 8GB of VRAM.
As has already been said, compression does not work like that. There haven't been any revolutions in compression for an extremely long time, and if I'm remembering right it was AMD's Fury X that had the biggest gen-over-gen bump, but that didn't help it against the 980 Ti.

You also keep ignoring the RX 580 side of the requirements; it and the GTX 1060 6GB trade blows depending on the title, and the GTX 980 Ti is superior pretty much across the board, yet the GTX 1070 is the minimum despite being 15-25% faster. The only logical conclusion here is that they're saying, as stated, that the minimum VRAM requirement is 8GB.

Now, maybe in the future there will be some sort of compression breakthrough that doesn't require software to be coded for it, but that isn't now. Game VRAM requirements are as bad as they are largely because there's no impetus to do better, and this is a problem that is just going to get worse going forward.