News Leaker Suggests RTX 4060 Ti Has 160W TGP

I think they also tend to forget the increase in transistor density. It uses more power because there is more to power. And those increased memory speeds don't come free either.

If you want low TDP and maximum efficiency, then you look at the binned mobile chips. The desktop GPUs are really the leaky cast-offs.

All GA104, 6144 CUDA cores:
RTX 3070 Ti = 290W @ 21.75 TFLOPS
RTX 3080 Mobile = 150W @ 21.01 TFLOPS
RTX 3080 Mobile = 80W @ 16.59 TFLOPS

So you can cut the power in half and get roughly the same performance; you just need the right chip.
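If anyone wants to sanity-check those numbers, here's a rough back-of-envelope calculation. The boost clocks are my approximations, back-derived from the quoted TFLOPS figures, and FP32 throughput is assumed to be cores × 2 FLOPs per clock:

```python
# Back-of-envelope FP32 throughput and efficiency for the GA104 parts above.
# FP32 TFLOPS ~= CUDA cores * 2 FLOPs/clock * boost clock (GHz).
# Clocks here are approximations back-derived from the quoted TFLOPS figures.

parts = [
    # (name, CUDA cores, boost clock GHz, rated power W)
    ("RTX 3070 Ti (desktop)", 6144, 1.770, 290),
    ("RTX 3080 Mobile 150W",  6144, 1.710, 150),
    ("RTX 3080 Mobile 80W",   6144, 1.350, 80),
]

for name, cores, clock_ghz, watts in parts:
    tflops = cores * 2 * clock_ghz / 1000          # GFLOPS -> TFLOPS
    print(f"{name}: {tflops:.2f} TFLOPS, {tflops / watts * 1000:.1f} GFLOPS/W")
```

By that rough math, the 80W mobile bin works out to roughly 2.8x the GFLOPS per watt of the desktop 3070 Ti.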

Those RTX 3060s coming out of China that are actually re-tooled laptop GPUs are really interesting; it would be cool to have some super-efficient options like that. That's something I might actually pay extra for.

Either that, or I want motherboards to start accepting MXM cards, or whatever the future equivalent is. I still think it would be really cool to have a GPU mount on the back of the motherboard or something. No need for riser cables in SFF chassis: just slap a heatsink on both sides of the motherboard, or sandwich it between two water blocks.
 
TGP implies just the power to the GPU, so the total board power will be higher. Going by the article, I could see 220W, since that was apparently an earlier rumor. But the actual numbers are up in the air.

Since there isn't likely to be an FE, it will really be up to the partners to decide how much power to give these things.
TGP is total graphics card power. All the other RTX 40 cards so far have had their max power consumption come in under the rated TGP (except for power viruses like FurMark, where they went up to ~5% over), so I see no reason to expect the 4060 Ti to be different.

Of course, as you say, AIB partners could put out factory-OC'd models with higher rated TGPs. But I'd still expect power draw to be <= rated TGP during regular use.
 
A lot more power efficient than the already-overclocked (at stock) 4090 with its massive power requirement? (When a card can have its power cut by 50% and only lose 8% performance, something is up.)
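Just to put numbers on that claim (the 50% and 8% figures are from the post above, taken at face value, not measurements of mine):

```python
# If halving the power limit only costs 8% performance, efficiency
# nearly doubles. Figures are the post's claim, taken at face value.
rel_perf = 0.92   # 92% of stock performance remaining
rel_power = 0.50  # 50% of stock power
print(f"perf/W improves by {rel_perf / rel_power:.2f}x")  # -> 1.84x
```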
 
Wait. What? A 1070 at that core clock?

My 3060 Ti draws 200W with a mild OC. Never more, even at 99% utilization.

The 4060 Ti's TGP seems low enough, but with the narrower bus width and potentially fewer CUDA cores, this seems like a step back. I wonder what kind of performance it will yield over its predecessor.
Maybe I get lucky when buying cards. Most have been good overclockers, except for the EVGA 960 FTW.
All are cooled well and never go above 75°C when 100% loaded.
 
lol yep

Just bought a lightbulb the other day. It uses 7 watts, like any other lightbulb nowadays. My sister reminded me to turn them off when I'm not in my room to save energy.

Meanwhile this GPU will use 200+ watts and is called "efficient". Just lol, it would literally be the item using the most electricity in our household. Yes, our washing machine can use more, but it is only turned on for about 40 minutes twice a week. These companies live in an alternate universe where electricity is free and they haven't been paying attention to anything going on in the world.
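For what it's worth, a rough weekly tally under assumed usage (the gaming hours, bulb hours, and the washer's ~2kW draw are all my assumptions, not numbers from this thread):

```python
# Rough weekly energy comparison; all durations and the washer's draw
# are assumptions for illustration only.
def weekly_kwh(watts, hours_per_week):
    return watts * hours_per_week / 1000

gpu    = weekly_kwh(200,  2 * 7)           # 200W GPU, ~2h of gaming per day
bulb   = weekly_kwh(7,    5 * 7)           # 7W bulb, ~5h per day
washer = weekly_kwh(2000, (40 / 60) * 2)   # ~2kW washer, 40 min twice a week

print(f"GPU:    {gpu:.2f} kWh/week")    # 2.80
print(f"Bulb:   {bulb:.2f} kWh/week")   # 0.25
print(f"Washer: {washer:.2f} kWh/week") # 2.67
```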
You're the one living in an alternate universe where you're forced to buy a 200+ watt GPU and forced to play games.

Gaming is a privilege, not a right nor a necessity!

You can always get hired at Nvidia or AMD and show them how to create GPUs that deliver amazing performance per watt. I mean, it has to be easy and they're just stringing us along, right?
 
Where did this 250W number come from?
Pulled out of his butt. The number is not stated anywhere, and Ada seems to match or reduce actual power consumption compared to the RTX 3000 series, so there is absolutely no backing for that guy's claim at all.

Also, seriously people, at the very least read what is written in the bloody article before you all go completely gung-ho about this!!!
It states very clearly that this figure is in relation to previous reports of the card drawing 200-220W. Nowhere does the leak state that 160W is low power or power efficient; it is all relative to the previous report.

Also, energy efficiency is a question of performance as well, so a card that draws more power can still be more efficient than a card with lower consumption simply by performing better. And even the 3060 Ti clocks in at a higher TGP, so even there the claim would be correct. So why do you all act like this? Just for the sake of getting up in arms? Because some comments here honestly read like it. Seriously. Calm. Down.

Regarding GTX 1070 clock speeds: mine runs in the 1950 MHz range too, with overclocks over 2000 MHz possible. I don't think that's strange for those cards, tbh.

(Edited because GTX1070 is apparently a user name, lmao...)
 
500 fewer CUDA cores and half the memory bandwidth of the 3060 Ti. Once again Nvidia have excelled themselves by doing the absolute minimum they can. I'm sure they will ensure it is marginally faster than the 3060 Ti, but imagine what it could have been if they had just kept those specs the same. Daylight robbery, as I suspect we will see it well north of a $500 MSRP (and even higher in the real world).