News: Alleged Launch Dates for Nvidia RTX 4060 Ti and RTX 4050 Leak

Deleted member 2731765
By contrast, the GeForce RTX 4060 is rumored to feature a cut-down version of the AD106 GPU (presumably with 3072 CUDA cores) and 8GB of memory on a 128-bit bus.

But chances are high that this card will sport the AD107-400-A1 instead, the fully enabled AD107 Ada die.

Do note that the same configuration is used by the GeForce RTX 4060 Laptop GPU, so it makes some sense to use the same die for the desktop variant as well, but we can't completely rule out the AD106 die just yet.

The RTX 4060 Laptop GPU also offers 3072 shaders and uses 8GB of dedicated GDDR6 graphics memory at 16 Gbps (effective) on a 128-bit memory bus. So the desktop variant should sport similar specs, unless Nvidia has other plans for this mainstream desktop SKU.
 

PlaneInTheSky
The 3050 has 8GB VRAM, so one assumes the 4050 will have at least 8GB too, but you never know with Nvidia.

The other question is whether the 4050 will stay under 75 watts, so it can be powered by the PCIe slot alone, or will require an extra 8-pin connector. The 4060 is apparently 115 watts, so a sub-75-watt 4050 is possible.
 

Giroro
I have my doubts that there will be an RTX 4050 this year, or possibly ever, given Nvidia's clear "upscale" of card names and prices. That would mean their xx30-class design has been rebranded as the RTX 4050, a tier which tends to arrive years late and has been missing entirely for the last two generations. Maybe they port their mobile design to desktop and call it an RTX 4050, but Nvidia seems to have little motivation to undercut its high-end cards with a low-end product. Then again, maybe the margins are too tempting not to try slapping an oversized heatsink on a $100-class GPU and selling it for ~$400.

More likely, I could see Nvidia's RTX 4060 Ti design being released as an RTX 4070 variant in May and their RTX 4050 design being sold as an RTX 4060 in June.
 
Deleted member 2731765
This sucks if true. WTF!

The same leaker has also confirmed the memory specs of the RTX 4050. He claims the card will come in a 6GB VRAM flavor, which implies a 96-bit memory bus, most likely paired with 2560 CUDA cores.

A huge downgrade from the RTX 3050, which had at least a 128-bit memory bus and 8GB of VRAM. But I'm just assuming the 4050 sports a 96-bit bus; maybe Nvidia has other plans for this SKU.

But jeez, can a GPU with a 96-bit memory interface be worth buying in 2023? The RTX 4050 might still end up faster than the RTX 3050 despite the narrower bus, since it is based on the new Ada architecture, but that is just speculation for now.

But a 96-bit bus will still be a huge limiting factor in most modern games, if not all. The main question is how much Nvidia will charge for this RTX 4050-class GPU. My guess is even more than the RTX 3050, given the pricing trend of the Ada lineup.
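For a sense of what the narrower bus costs, peak memory bandwidth is simply the bus width (in bytes) times the effective per-pin data rate. A quick sketch; the 16 Gbps figure for the rumored 4050 is an assumption borrowed from the laptop 4060's memory speed, not a confirmed spec:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * effective per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 3050: 128-bit bus with 14 Gbps GDDR6
print(peak_bandwidth_gbps(128, 14.0))  # 224.0 GB/s
# Rumored RTX 4050: 96-bit bus, 16 Gbps assumed
print(peak_bandwidth_gbps(96, 16.0))   # 192.0 GB/s
```

So even with faster memory chips, a 96-bit card would have roughly 14% less raw bandwidth than the RTX 3050, although Ada's much larger L2 cache would offset some of that in practice.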


View: https://twitter.com/Zed__Wang/status/1641057444548976643
 
Deleted member 2731765
BTW, this past week both The Last of Us Part 1 and Resident Evil 4 (Remake) arrived on PC, and both games can use a lot of graphics memory, even at 1080p.

With 8GB quickly becoming the minimum amount of memory that is advisable for 1080p gaming, Nvidia's RTX 4050 will be poorly positioned within today's PC market. This is especially true knowing that Nvidia's RTX 3050 has 8GB of graphics memory.

Even as far back as Nvidia's RTX 20 series, 8GB of graphics memory was already being stretched by modern games, and memory requirements for PC games have only gotten higher since. Like the RTX 2060 6GB, we can expect a 6GB RTX 4050 to age poorly, as GPU memory requirements aren't going to get any lower.

If Nvidia releases an RTX 4050 with 6GB of memory, Nvidia will be releasing a compromised product. But I digress.
 

InvalidError
All of the 4000 series cards have looked horrible, so far. Nvidia doesn't seem to care.
Jensen spelled it out in Nvidia's conference: Nvidia is a "we design hardware to run software to improve the world" company, not a gaming technology company. Nvidia is likely perfectly fine with pricing itself out of the gaming market if that translates to more spare silicon for AI, DC, HPC, etc. which it can sell at 10X the gross income per wafer.

Now we have to cross our fingers that AMD's lacklustre sales at the RX 7xxx series' inflated prices, combined with a far less impressive share of HPC, AI, etc. sales, are enough to convince it to price its lower-end GPUs more reasonably and stay competitive with whatever Intel comes up with next. Recent rumors say BMG (Battlemage) may actually launch with something more than twice as fast as the A770. Hopefully Intel will have learned its lessons from the A7xx and aim for market share and brand recognition in the GPU space instead of being greedy and getting ignored by most of the market.

The main question is how much Nvidia will charge for this RTX 4050-class GPU. My guess is even more than the RTX 3050, given the pricing trend of the Ada lineup.
If there is a 6GB/96-bit 4050, then there will almost certainly be an 8GB/128-bit 4050 Ti later.

I certainly hope the 6GB/96-bit version, if that is really Nvidia's plan, won't cost more than $250. I wouldn't buy one for more than $180.

As for a 6GB GPU being "compromised", nobody buys this tier of graphics without expecting significant compromises in modern-day titles and almost certainly even more compromises in the near future. I'm still using a GTX1050 and getting most of the newer stuff to run remotely acceptably means turning everything down as low as it goes.
 

JamesJones44
Jensen spelled it out in Nvidia's conference: Nvidia is a "we design hardware to run software to improve the world" company, not a gaming technology company. Nvidia is likely perfectly fine with pricing itself out of the gaming market if that translates to more spare silicon for AI, DC, HPC, etc. which it can sell at 10X the gross income per wafer.

Exactly this. Nvidia is transitioning from a "gaming" GPU company to an AI/ML general-compute company. Gaming's share of revenue and profit will continue to shrink, so Nvidia won't be as interested in keeping that market happy as it will be in the ML training market.
 
Deleted member 2731765
As for a 6GB GPU being "compromised", nobody buys this tier of graphics without expecting significant compromises in modern-day titles and almost certainly even more compromises in the near future.

Yeah, I can definitely agree on this point. Anyone buying a 50-class GPU won't have very high expectations in the first place, and they probably won't be rocking a 2K/4K monitor to begin with.

FWIW, I'm also using an AMD RX 480 4GB card, and it still gets the job done at 1080p. 4GB of VRAM is still sufficient for older, less graphically demanding PC games: indie titles, 2D platformers, side-scrollers, pixel-art games, racing/fighting games, etc.

I mostly play older FPS titles. The 480 can even handle modern titles at medium/high settings, depending on the game's engine, but the VRAM limitation shows up pretty fast.
 

PlaneInTheSky
BTW, this past week both The Last of Us Part 1 and Resident Evil 4 (Remake) arrived on PC, and both games can use a lot of graphics memory, even at 1080p.

Yep, it literally throws an error if you lack VRAM and won't even boot.

Only 6GB would be crazy.

 
Maybe they think they are getting to the point where GPUs are good enough. It's also a good way to get people onto something like a subscription: make cards just good enough to do the job unless you pay big money for the higher-end cards. Either way, they get your money every 2-5 years. GPUs used to improve by leaps and bounds; now they seem to refine them more. A good way to keep making money, though.
 
Deleted member 2731765
Yep, it literally throws an error if you lack VRAM and won't even boot.

Only 6GB would be crazy.

There are several reports claiming that this game requires a LOT of system RAM as well. Some users have reported 15GB+ of RAM usage while running the game, and even 24GB if you skip the shader compilation process and launch the game before it finishes.

I guess the game is badly optimized, or there is a memory leak somewhere.
 
There are several reports claiming that this game requires a LOT of system RAM as well. Some users have reported 15GB+ of RAM usage while running the game, and even 24GB if you skip the shader compilation process and launch the game before it finishes.

I guess the game is badly optimized, or there is a memory leak somewhere.

Too much brute force with this port. Shader compilation that can take this long is already a red flag.
 

Pirx73
What about AMD? Nvidia at least is close to launching lower-end cards in the 40 series, while AMD is doing what, exactly?
We don't have anything solid on the 7800, 7700, or 7600 series. Please correct me if I'm wrong.
 
Deleted member 2731765
Too much brute force with this port. Shader compilation that can take this long is already a red flag.

The weirdest and funniest part of the "shader compilation" process is that it can take an hour or more to complete on PCs with lower-end hardware. It's indeed a huge red flag.
 
What about AMD? Nvidia at least is close to launching lower-end cards in the 40 series, while AMD is doing what, exactly?
We don't have anything solid on the 7800, 7700, or 7600 series. Please correct me if I'm wrong.

Lol, late last year they said AMD would be gunning for market share and would release its low end much earlier than Nvidia. The reasoning: Nvidia still had tons of 30-series stock to sell, while AMD had no such issue. Combine that with MCM (which they believed would make things significantly cheaper on AMD's side), and AMD would kick Nvidia in the nuts with nothing Nvidia could do about it.

Initially, Nvidia was supposed to launch the 4070/4060 Ti/4060 in April, but now we hear the x60 parts might not be coming until May or even June (together with the new x50). Nvidia is probably pushing the launch dates back because AMD's 7800 and lower probably aren't coming out until late Q2 this year.
 
Deleted member 2731765
The weirdest and funniest part of the "shader compilation" process is that it can take an hour or more to complete on PCs with lower-end hardware. It's indeed a huge red flag.

I would like to add this as well: it appears that The Last of Us Part 1's optimization issues stem from the port being outsourced and developed with help from the Iron Galaxy studio.

Iron Galaxy has dropped the ball numerous times with its PC ports. I mean, who can forget Batman: Arkham Knight, right? Moreover, UNCHARTED: Legacy of Thieves Collection had major mouse-control issues when it came out. Hilariously, a post-launch patch also introduced some awful camera stutters.
 
Deleted member 2731765
I hope Intel sorts out its drivers soon so I can get an A750 without worrying too much about weird frame time jitters.

I think by the time Intel sorts out all its driver issues, the next-gen Battlemage Xe2-HPG GPU architecture might already be out.
 

InvalidError

Titan
Moderator
Lol, late last year they said AMD would be gunning for market share and would release its low end much earlier than Nvidia. The reasoning: Nvidia still had tons of 30-series stock to sell, while AMD had no such issue. Combine that with MCM (which they believed would make things significantly cheaper on AMD's side), and AMD would kick Nvidia in the nuts with nothing Nvidia could do about it.
Except there is no "kicking Nvidia in the nuts" on the manufacturing cost of lower-end parts, since the RX 7700 and below are still monolithic chips. Around 200 mm² appears to be the cut-off point below which MCM adds more cost and complexity than the improved yields are worth.
 
The same leaker has also confirmed the memory specs of the RTX 4050. He claims the card will come in a 6GB VRAM flavor.
Perhaps they are just taking a page from the car industry: the card ships with 8GB installed, but the last 2GB is only unlocked with a subscription or a one-time fee once you discover you need it.
What about AMD? nVIDIA at least is close to launching lower cards in 4xxx series while AMD is doing what exactly?
Given the recent trend over there, they are undoubtedly working on finishing up the new RX 7600 design with its PCIe 5.0 x2 interface.
 
