News: Nvidia GeForce RTX 4060 Alleged Launch Date Revealed

The GeForce RTX 4060 is expected to use the AD106 GPU, which comes with 3072 CUDA cores…

No, the card is expected to use the AD107-400-A1 GPU, the fully enabled AD107 Ada die. The same configuration is also used by the GeForce RTX 4060 Laptop GPU. You need to correct your table as well.
 
BTW, regarding bandwidth: based on the specs, the card has 272 GB/s of raw memory bandwidth, which Nvidia rates as 453 GB/s "effective" thanks to the new 24 MB of L2 cache; that's about 26% more than the RTX 3060's 360 GB/s.

Curious to see how this card performs compared to previous-gen SKUs in 1080p gaming benchmarks. The memory bus is still limited to 128-bit, though, despite the higher effective bandwidth.
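
As a rough sanity check on those numbers, here's a minimal back-of-the-envelope sketch. It assumes the publicly listed 17 Gbps GDDR6 on a 128-bit bus for the 4060 and 15 Gbps on a 192-bit bus for the 3060; the 453 GB/s figure is Nvidia's own cache-adjusted claim, not something derivable from the bus alone:

```python
# Raw bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.

def raw_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_4060_raw = raw_bandwidth_gb_s(128, 17.0)  # 272 GB/s on a 128-bit bus
rtx_3060_raw = raw_bandwidth_gb_s(192, 15.0)  # 360 GB/s on a 192-bit bus
rtx_4060_effective = 453.0                    # Nvidia's L2-cache-adjusted figure

print(f"RTX 4060 raw:     {rtx_4060_raw:.0f} GB/s")
print(f"RTX 3060 raw:     {rtx_3060_raw:.0f} GB/s")
print(f"Effective uplift: {rtx_4060_effective / rtx_3060_raw - 1:.0%}")  # ~26%
```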
 
I'm waiting for a low-power card... something like this GPU, but I'll wait for the RX 7600's price to come down.
My LG projector only does 60 Hz, and my LG monitor has FreeSync at 75 Hz...
I've got a 12700T to make a gaming build that draws less than 130 W for the entire system, or 200 W max with no power limit.

I don't want 400 fps; I only want 1 W per FPS or less.
 
Barely any time to prepare the pitchforks!

I have the torches ready though.

Regards XD
Given what I've read about the new GPUs and their consumer base, I don't think Nvidia cares if anyone buys them.
Their money is being made on AI.
If memory serves me correctly, Nvidia plans to spin off the consumer GPU business to Intel.
I read this somewhere, but don't recall where.
 
I doubt nVidia would ever do that, only because they're the ultimate "I want the whole cake and I'll eat it too".

Still, it's an interesting thought how that would work in practicality.

Regards.
 
Forgive my naive question: since 16GB looks like a sweet spot for video memory in 4K games, why are only the top cards and the 4060 available with 16GB? Could someone please elaborate, or explain if that premise is wrong? Thanks :)
 
Bus width basically determines how many memory chips you're going to have: essentially one chip per 32 bits.

The 4060 and 4060 Ti have a 128-bit bus, so it's either 8 GB or 16 GB; nothing in between without sacrificing a memory channel and making the card too slow to need that much VRAM.

Doubling the memory capacity would turn 12 GB cards into 24 GB cards, which rivals the high-end cards. Not every workload needs a powerful GPU, just lots of VRAM, so they would basically be removing the need for people to buy the higher-end and professional-grade cards.

That they are making a 16 GB 4060 Ti is a bit shocking, but something similar happened with the 3060 12GB, while the 3060 Ti, 3070, and 3070 Ti all had 8 GB.
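
To make the chip math concrete, here's a minimal sketch. It assumes 2 GB GDDR6 modules with 32-bit interfaces (which is what these cards ship with); the function name is just for illustration:

```python
# Each GDDR6 chip has a 32-bit interface, so bus width fixes the channel count.
# Capacity can only be doubled by running two chips per channel ("clamshell").

def vram_options_gb(bus_width_bits: int, chip_capacity_gb: int = 2) -> tuple[int, int]:
    channels = bus_width_bits // 32        # one 32-bit channel per chip
    normal = channels * chip_capacity_gb   # one chip per channel
    clamshell = 2 * normal                 # two chips per channel
    return normal, clamshell

print(vram_options_gb(128))  # (8, 16)  -> the 4060 / 4060 Ti's only realistic options
print(vram_options_gb(192))  # (12, 24) -> why a 192-bit card jumps from 12 GB to 24 GB
```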
 
Who cares? The 6700 XT is cheaper, way faster, and has 12 GB of VRAM.
And it's more expensive, has no DLSS (no, FSR is not the same), no tensor cores and less mature RT cores (compare generations), no proper ray tracing, no VSR, AMD drivers, and an inferior software stack.

No thanks, skip.
 
Bus width basically determines how many memory chips you're going to have: essentially one chip per 32 bits.

The 4060 and 4060 Ti have a 128-bit bus, so it's either 8 GB or 16 GB; nothing in between without sacrificing a memory channel and making the card too slow to need that much VRAM.

Doubling the memory capacity would turn 12 GB cards into 24 GB cards, which rivals the high-end cards. Not every workload needs a powerful GPU, just lots of VRAM, so they would basically be removing the need for people to buy the higher-end and professional-grade cards.

That they are making a 16 GB 4060 Ti is a bit shocking, but something similar happened with the 3060 12GB, while the 3060 Ti, 3070, and 3070 Ti all had 8 GB.
Thank you for that! I missed the bus width in the equation. I was shocked to see the 16 GB 4060 Ti but didn't know the history. I would buy a 16 GB card if it performs well enough at 4K to last a few years, like my 1070 has for FHD gaming.
 
Nvidia BSed us with excuses, such as claiming that the much bigger L2 cache means cutting the memory bus width would still achieve the same performance.

No, they just didn't want to lose any profit margin.

————————

When a company squeezes every last penny out of its customers while increasing profits, it means they don't even see said customers as recipients of innovation, but rather as piñatas of profit.
 
I doubt nVidia would ever do that, only because they're the ultimate "I want the whole cake and I'll eat it too".

Still, it's an interesting thought how that would work in practicality.

Regards.
Here is the article:

Intel-manufactured Nvidia GPUs could be coming soon

The tie-up with Nvidia comes at a critical time for Intel's beleaguered foundry business
By Kishalaya Kundu, May 31, 2023 at 8:05 AM
 

Using Intel's foundries and spinning off the consumer GPU business are two *very* different things. It's just outsourcing the fabrication to a party other than TSMC and Samsung. Nvidia is one of the world's largest fabless semiconductor companies.
 