Nvidia plans to radically lower the power consumption of its new-generation performance-mainstream part, a leaker says.
Leaker Suggests RTX 4060 Ti Has 160W TGP : Read more
Nvidia makes stuff like this because they know there are suckers out there that will buy it.

That's like... business 101, no?
How is a TDP of 160W on a card that probably draws over 250W of power "power efficient"?

What universe is this? I remember the Nvidia 970 and 1060 both drew little enough power that you could almost get away with passively cooling them, as long as you weren't overclocking. But here we are talking about a power draw in line with an 80-series GPU from just two generations ago as "power efficient".
How is a TDP of 160W on a card that probably draws over 250W of power "power efficient"? ...
Compared to higher-tier GPUs, it is quite low.
160W TDP for an xx60-series card is not low at all. That's around what the GTX 1080 used, which had a TDP of 180W.

The amount of power GPUs have been drawing over the last two generations is out of control. And Nvidia and AMD dare to call their new GPUs efficient.
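Worth noting: "efficient" in these marketing claims usually means performance per watt, not absolute draw. A quick sketch of the math, with made-up FPS numbers purely for illustration:

```python
# Performance-per-watt comparison with HYPOTHETICAL numbers --
# the FPS values are placeholders, not benchmark results.
cards = {
    "older xx60": {"avg_fps": 60.0,  "board_power_w": 120.0},
    "newer xx60": {"avg_fps": 100.0, "board_power_w": 160.0},
}

for name, c in cards.items():
    eff = c["avg_fps"] / c["board_power_w"]  # frames per second per watt
    print(f"{name}: {eff:.2f} FPS/W")

# A card can draw more total watts than its predecessor and still be
# "more efficient" if performance rises faster than power draw does.
```

That's the argument the marketing leans on, anyway; it doesn't change the absolute wattage on your power bill.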
As an arch-to-arch comparison: the 4070 Ti has effectively half the VRAM bus width, capacity, and bandwidth compared to the 3080. It performs basically the same or better. Ada is clearly a dramatically less memory-bandwidth-constrained architecture than Ampere for client workloads.

Meanwhile, the new product will also feature 36% lower memory bandwidth than its direct predecessor, the GeForce RTX 3060 Ti with GDDR6 memory. Of course, we would expect the 32MB of L2 cache to somewhat reduce bandwidth limitations, but only tests will reveal whether this is the case.
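For what it's worth, the bandwidth math is easy to check. A minimal sketch assuming the rumored 128-bit / 18 Gbps GDDR6 config for the 4060 Ti against the 3060 Ti's 256-bit / 14 Gbps (the 4060 Ti figures come from circulating rumors, not confirmed specs):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060_ti = bandwidth_gb_s(256, 14.0)  # 448 GB/s
rtx_4060_ti = bandwidth_gb_s(128, 18.0)  # 288 GB/s (rumored)

print(f"RTX 3060 Ti:           {rtx_3060_ti:.0f} GB/s")
print(f"RTX 4060 Ti (rumored): {rtx_4060_ti:.0f} GB/s")
print(f"Reduction:             {1 - rtx_4060_ti / rtx_3060_ti:.0%}")  # ~36%
```

That lines up with the 36% figure in the article, which is presumably where it comes from.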
Wait. What? A 1070 at that core clock?

All run in the range of 1984-2025 MHz.
Wait. What? A 1070 at that core clock?
Nvidia makes stuff like this because they know there are suckers out there that will buy it.
lol yep
I just bought a lightbulb the other day. It uses 7 watts, like any other lightbulb nowadays. My sister reminded me to turn it off when I'm not in my room to save energy.

Meanwhile this GPU will use 200+ watts and is called "efficient". Just lol. It would literally be the item using the most electricity in our household. Yes, our washing machine can use more, but it is only turned on for about 40 minutes twice a week. These companies live in an alternate universe where electricity is free and they haven't been paying attention to anything going on in the world.
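If you actually run the numbers (the usage hours below are rough assumptions of mine, not measurements), the GPU does come out on top for the week:

```python
# Rough weekly energy use; hours per week are ASSUMED for illustration.
def weekly_kwh(power_w: float, hours_per_week: float) -> float:
    return power_w * hours_per_week / 1000.0

bulb   = weekly_kwh(7,    40)            # 7 W bulb, ~40 h/week
gpu    = weekly_kwh(200,  20)            # 200 W GPU under load, ~20 h/week of gaming
washer = weekly_kwh(2000, 2 * 40 / 60)   # ~2 kW washing machine, two 40-minute cycles

print(f"Lightbulb:       {bulb:.2f} kWh/week")    # 0.28
print(f"GPU (gaming):    {gpu:.2f} kWh/week")     # 4.00
print(f"Washing machine: {washer:.2f} kWh/week")  # 2.67
```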
My EVGA GTX 1080 SC boosted to 2012 MHz by default (reference board). I managed to get that to about 2100 MHz.
Pretty common to see boost clocks in the mid-1900 MHz range on lots of SKUs.
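If anyone wants to check what their own card actually boosts to, here's a minimal sketch using the pynvml NVML bindings (assumes the nvidia-ml-py package is installed and an NVIDIA driver is present; GPU index 0 is an assumption):

```python
# Read the current and max graphics clocks of the first GPU via NVML.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust index as needed
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes instead of str
    name = name.decode()
current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
maximum = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
print(f"{name}: {current} MHz now, {maximum} MHz max")
pynvml.nvmlShutdown()
```

Reading it while a game is running shows the sustained boost clock rather than the idle one.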
No way! Wow. That is impressive.