Question: What GPU should I get, the 4070 or the 7900 XT?

Aug 16, 2023
Hello guys,

So I've been looking around for a brand new PC for streaming, but I cannot choose between the 4070 and the RX 7900 XT. Price is not really the main issue; the thing I'm stuck on is the AMD encoder for Twitch streaming, so I don't really know what to buy!

I will be pairing the GPU with a Ryzen 7 7800X3D, which is a great fit for both graphics cards. Please let me know!
 
Thanks for the answer! I am just worried about the video encoder for streaming, because NVENC, to my knowledge, is waaay better than the current AMD encoder. Twitch also does not support AV1, and I don't really know when it will!
 
I don't know much about Nvidia cards and video encoding, but from a quick Google search I found that the 4070 has only one NVENC encoder, whereas the 4070 Ti has two. What that means in terms of video encoding, I have no clue.
Hope that helps.

If it's for gaming and longer life, I'd pick the 7900 XT with its 20GB of VRAM all day long over the 4070 and even the 4070 Ti. But obviously, if Nvidia has the better toolset for your needs, go with the Nvidia card. Just be aware that two sites have said the 4070 only has one NVENC encoder, not two!
 
It really depends on what's more important to you. Sure, NVENC is the better option for Twitch right now, but it won't be long before Twitch supports AV1 like everyone else.

If you care more about Twitch then get the GeForce card. If you care more about gaming performance and longevity, get the Radeon card.
 
The 7900 XT has more memory bandwidth and more VRAM.

Since you said price doesn't matter that much, it's the logical choice if it's between those two cards.

As for the encoder, sure, Nvidia might be better overall, but even if you are streaming, who are you buying the card for: you, or the viewers who will be watching on comparatively small phone screens where it won't matter anyway?

I own both. I'm partial to Nvidia, but with what you've provided, I think AMD is the better choice regardless of the encoder.
 
For Twitch streaming, Nvidia is the way to go.

What resolution monitor are you using?

And I'm guessing that since you stream, you'll want the best ray tracing performance as well, if you use it at all.

If you are using 1080p, the 4070 is enough. If you are using a 2K (1440p) monitor, I would suggest looking at the 4080 or at least the 4070 Ti.
 
Thanks for all the answers! Does the RX also use way more power than the 4070, or is that just a myth?
It's no myth. The RX 7900 XT uses an extra 120W over the RTX 4070. To be fair though, the RTX 4070 isn't in the same tier as the RX 7900 XT, which is a whopping 38% faster than the RTX 4070.

The RTX 4070 Ti is the real competitor to the RX 7900 XT, and it uses an extra 80W over the RTX 4070. So comparing the RX 7900 XT to a card in a lower performance tier isn't exactly fair, because the RX 7900 XT is also 11% faster than the RTX 4070 Ti, the card that's one tier higher than the RTX 4070.

Having said that, the RX 7900 XT still uses 50W more than the RTX 4070 Ti but that's not a big difference for two cards in the high-end performance tier.

Even with the 120W gaming difference, if you lived in the UK, where electricity is VERY expensive, you're looking at only about 4p extra per hour, which is roughly 5¢. In the USA, you'd be looking at a difference of a "whopping" 2¢/hr, and here in Canada it's half of that at 1¢/hr. Not really enough to be concerned about, eh?

It's far worse to have a power-hungry CPU than a power-hungry GPU, because the CPU is ALWAYS trying to run as fast as possible, while the GPU is often doing tasks that don't draw a lot of power, i.e. anything that isn't high-end gaming, encoding, or running benchmarks. Playing a video, for instance, is something an integrated GPU handles just fine; it's child's play for even the weakest GPUs in existence. Most applications leave your GPU almost at idle, but not your CPU.
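If you want to sanity-check those per-hour figures yourself, here's a quick back-of-the-envelope calculation in Python (the per-kWh electricity rates are just rough assumptions I plugged in, not anyone's actual tariff):

# Back-of-the-envelope check on the "extra cost per hour" of 120W more GPU draw.
# The per-kWh rates below are assumed example figures, not exact tariffs.
extra_watts = 120
extra_kwh_per_hour = extra_watts / 1000  # 0.12 kWh for every hour of gaming

assumed_rate_per_kwh = {"UK (GBP)": 0.34, "USA (USD)": 0.16, "Canada (CAD)": 0.08}
for region, rate in assumed_rate_per_kwh.items():
    extra_cost = extra_kwh_per_hour * rate * 100  # in pence or cents
    print(f"{region}: roughly {extra_cost:.1f} extra per hour")

With those assumed rates it comes out to about 4p, 2¢, and 1¢ per hour respectively, which is where the numbers above come from.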

It's a simple fact that the further up the stack you go, the more juice you're going to pull, like a Mustang with a V6 versus a V8. In this case, you're comparing an L4 (inline-four) to a V8.

The Navi 31 GPU on which the 7900 XT is based apparently undervolts really well, and while you can do the same with the 4070, you'd get more of a drop from the 7900 XT simply because it has more power to shave off. It still won't be as low as the GeForce card, but it won't be as bad as it is at stock.

I just bought an RX 7900 XTX but I'm on vacation in Montreal so I won't get to tinker with it until Saturday night.
 
I mean, is the difference between AMD's encoder and NVENC that big, or is it pretty negligible?
It's definitely there, I won't lie. It's not huge but not negligible either. Having said that, I believe that once Twitch adopts AV1 (and they WILL adopt AV1), it will no longer matter.

As for when that will happen, I really don't know but I can't imagine that it would be longer than a few months.
 
Do you think it's better for me to buy the card right now and just wait for AV1, using the AMD encoder in the meantime?
 
Damn, thanks for the answer! I think I'll be picking up the 7900 XT, since the 4070 Ti is more expensive where I live!