News Leaker Suggests RTX 4060 Ti Has 160W TGP

How is a TDP of 160W on a card that probably draws over 250W of power "power efficient"?

What universe is this? I remember the Nvidia 970 and 1060 both drew little enough power that you could almost get away with passively cooling them, as long as you weren't overclocking. But here we are talking about power draw on the level of an 80-series GPU from just two generations ago as "power efficient".
 
Way too high. A 1060 consumes far less, and that is still one of the most used cards on Steam.

With electricity prices being so high in many places around the world, this is a power-hungry GPU and CPU generation I will just skip.

Hopefully ARM becomes more mainstream for PC like it did for Apple; at least they have power consumption properly under control.
 
How is a TDP of 160W on a card that probably draws over 250W of power "power efficient"?

What universe is this? I remember the Nvidia 970 and 1060 both drew little enough power that you could almost get away with passively cooling them, as long as you weren't overclocking. But here we are talking about power draw on the level of an 80-series GPU from just two generations ago as "power efficient".

lol yep

Just bought a lightbulb the other day. It uses 7 watts like any other lightbulb nowadays. My sister reminded me to turn them off when I'm not in my room to save energy.

Meanwhile this GPU will use 200+ watts and is called "efficient". Just lol, it would literally be the item using the most electricity in our household. Yes, our washing machine can draw more, but it is only turned on for about 40 minutes twice a week. These companies live in an alternate universe where electricity is free and they haven't been paying attention to anything going on in the world.
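For what it's worth, peak draw only turns into energy use with runtime, and a rough sketch bears the comparison out (the washing machine draw and weekly gaming hours below are assumptions for illustration, not figures from the post):

```python
# Rough weekly energy comparison: a high-draw GPU used for many hours vs. a
# higher-draw appliance used briefly. The inputs are illustrative assumptions.
gpu_watts = 200                        # gaming draw mentioned in the post
gaming_hours_per_week = 15             # assumed

washer_watts = 2000                    # assumed average draw while running
washer_hours_per_week = 2 * (40 / 60)  # 40 minutes, twice a week (from the post)

gpu_kwh = gpu_watts * gaming_hours_per_week / 1000
washer_kwh = washer_watts * washer_hours_per_week / 1000

print(f"GPU:    ~{gpu_kwh:.1f} kWh/week")
print(f"Washer: ~{washer_kwh:.1f} kWh/week")
```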
 
If it turns out to be a solid fit for 1440p gaming performance-wise, then with that power consumption and size it may become quite popular, if reasonably priced. But yeah, I would prefer 12 GB, even if so far that is more of a "just in case" thing than a necessity.

How is a TDP of 160W on a card that probably draws over 250W of power "power efficient"? ...

Compared to higher-tier GPUs, it is quite low. And sure, it may be too much for some. Personally, a GPU drawing around 200W at peak is some 130W more than my 1050 Ti, which didn't really carry a lot of newer games, let alone at 1440p. And with an average of perhaps 10 hours of gaming a week, that's less than 2 kWh extra per week, or less than 10 kWh a month.
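A quick back-of-the-envelope check of that estimate (the 1050 Ti's ~75W TDP is filled in here as an assumption):

```python
# Sanity check of the "extra" energy estimate above.
new_gpu_watts = 200    # peak draw assumed in the post
old_gpu_watts = 75     # GTX 1050 Ti TDP, assumed for the comparison
hours_per_week = 10    # average gaming time from the post

extra_watts = new_gpu_watts - old_gpu_watts           # ~125 W
extra_kwh_week = extra_watts * hours_per_week / 1000  # ~1.25 kWh per week
extra_kwh_month = extra_kwh_week * 52 / 12            # ~5.4 kWh per month

print(f"~{extra_kwh_week:.2f} kWh extra per week, ~{extra_kwh_month:.1f} kWh per month")
```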
 
Compared to higher-tier GPUs, it is quite low.

160W TDP for a xx60 series card is not low at all.

That's around what the GTX 1080 used, which had a TDP of 180W.

The amount of power GPUs have been using over the last two generations is out of control.

GTX 1070 = TDP 150W.
RTX 4070 Ti = TDP 285W

GTX 1080 = TDP 180W.
RTX 4080 = TDP 320W.

And Nvidia and AMD dare to call their new GPUs efficient.
 
160W TDP for a xx60 series card is not low at all.

That's around what the GTX 1080 used, which had a TDP of 180W.

The amount of power GPUs have been using over the last two generations is out of control.

And Nvidia and AMD dare to call their new GPUs efficient.

GTX 1070 = TDP 150W @ 6.42 TFLOPS ≈ 43 GFLOPS/Watt
RTX 4070 Ti = TDP 285W @ 40.1 TFLOPS ≈ 141 GFLOPS/Watt

GTX 1080 = TDP 180W @ 8.87 TFLOPS ≈ 49 GFLOPS/Watt
RTX 4080 = TDP 320W @ 48.8 TFLOPS ≈ 153 GFLOPS/Watt

So roughly three times the FP32 throughput per watt overall. You could always set a power limit, too.
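The arithmetic behind those ratios, as a quick sketch using the TDP and TFLOPS figures quoted above:

```python
# FP32 throughput per watt from the TDP and TFLOPS figures quoted above.
cards = {
    "GTX 1070":    (150, 6.42),  # (TDP in watts, FP32 TFLOPS)
    "RTX 4070 Ti": (285, 40.1),
    "GTX 1080":    (180, 8.87),
    "RTX 4080":    (320, 48.8),
}

for name, (tdp_w, tflops) in cards.items():
    gflops_per_watt = tflops * 1000 / tdp_w
    print(f"{name:12s} ~{gflops_per_watt:4.0f} GFLOPS/W")

# Generation-over-generation ratio at the same tier:
ratio = (48.8 * 1000 / 320) / (8.87 * 1000 / 180)
print(f"4080 vs 1080: ~{ratio:.1f}x perf/W")
```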
 
That is a little off.
Both of my GTX 1070s (Asus Dual White, EVGA SC) pull between 200 and 220 watts with the power limit set to 110%.
My RTX 3060 Ti also pulls 200-220 watts with a 110% power limit.
All run in the range of 1,984-2,025 MHz at 95-100% load, depending on the folding work units.
So the wattages quoted above seem very low.
I will have to check the 1060 6GB Gamer (around 2,000 MHz +/-) and the 960 4GB FTW (14?? MHz).
 
Aftermarket cards have different power profiles, and you are raising the limit even beyond that.

FE 1070 was 150W with a single 8-pin connector. GTX 1080 was 180W with a single 8-pin connector.

Partner boards with dual 8-pin were a thing.
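If you want to see what limit a given board actually ships with and what range it allows, something like this works (a sketch using the NVML Python bindings; treat the exact calls as illustrative rather than a recommendation):

```python
# Query a card's default power limit, the range users/partners can set,
# and the current draw via NVIDIA's NVML bindings (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

name = pynvml.nvmlDeviceGetName(handle)
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
draw_mw = pynvml.nvmlDeviceGetPowerUsage(handle)

print(f"{name}: default {default_mw / 1000:.0f} W, "
      f"allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W, "
      f"currently drawing {draw_mw / 1000:.1f} W")

pynvml.nvmlShutdown()
```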
 
Meanwhile, the new product will also feature a 36% lower memory bandwidth than its direct predecessor, the GeForce RTX 3060 Ti with GDDR6 memory. Of course, we would expect 32MB of L2 cache to somewhat reduce bandwidth limitations, but only tests will reveal whether this is the case.
As an arch-to-arch comparison: the 4070 Ti runs a 192-bit bus against the 3080's 320-bit and has roughly two-thirds of its memory bandwidth (about 504 GB/s vs. 760 GB/s), yet it performs basically the same or better. Ada is clearly a dramatically less memory-bandwidth-constrained architecture than Ampere for client workloads.
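The article's 36% figure also checks out if you plug in the rumored 128-bit, 18 Gbps GDDR6 configuration (those 4060 Ti numbers are leaked, not confirmed):

```python
# Memory bandwidth in GB/s = bus width in bits / 8 * data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

rtx_3060_ti = bandwidth_gbs(256, 14)  # 448 GB/s, GDDR6 model
rtx_4060_ti = bandwidth_gbs(128, 18)  # 288 GB/s, if the rumor holds

drop = 1 - rtx_4060_ti / rtx_3060_ti
print(f"3060 Ti: {rtx_3060_ti:.0f} GB/s, 4060 Ti (rumored): {rtx_4060_ti:.0f} GB/s, "
      f"~{drop:.0%} lower")
```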
 
128 bit?? Amazing. Maybe they forgot the specs of the 3060 Ti.

And "Nvidia is now looking to make its mainstream GeForce RTX 4060 Ti part a lot more power efficient. "

"a lot more "power efficient".... Seriously??

I am beginning to think some tech writers are really working for Nvidia and not TH!!
 
lol yep

Just bought a lightbulb the other day. It uses 7 watts like any other lightbulb nowadays. My sister reminded me to turn them off when I'm not in my room to save energy.

Meanwhile this GPU will use 200+ watts and is called "efficient". Just lol, it would literally be the item using the most electricity in our household. Yes, our washing machine can draw more, but it is only turned on for about 40 minutes twice a week. These companies live in an alternate universe where electricity is free and they haven't been paying attention to anything going on in the world.

This is completely a matter of perspective. If you downclocked a 4080 to match the performance of a 970 or a 1060, I would bet you would be far under that 160 watts. Remember, that 7-watt LED isn't putting out 10,000 more lumens than the bulbs it replaced; it's putting out the same 600 or so.
 
TGP implies just the power to the GPU, so the total board power will be higher. Based on the article, 220W seems plausible, as that was apparently an earlier rumor. But the actual numbers are still up in the air.

Since there isn't likely to be an FE card, it will really be up to the partners to decide how much power to give these things.