Question RX6700, 3060, or 4060

j_m_h_jm

Honorable
I recently upgraded to a Ryzen 5 7600 while still rocking a 1060 6GB. My monitor caps at 1080p/60 and I don’t plan on changing that. I do have an old 350 watt Seasonic PSU I’d love to keep using, but I realize that is unlikely. I try to be aware of power draw and would love to think whatever I buy will still meet minimum specs for as long as my 1060 has (2016, so seven-ish years). Technically I don’t own anything that won’t run, but I happen to have an opportunity to upgrade now.

I had asked about this a month or so ago and it sounded like the 6700 and 6700 XT were better overall buys than the 3060. I now feel more informed so I have different questions. We also have confirmed 4060 specs and it sounds like pretty solid 7600 details. In trying to stay under 200 watts for a GPU and under $350 US, it seems like the above are the best options. I know the 4060 is due in the summer, but it has half the VRAM of the 3060 and less than the 6700. I do like the idea of the much larger cache pool offered by the 6700 and 4060, but it seems like VRAM might bring some headaches with all of the “extra” video settings, even at 1080p. The leaks for the 7600 make it sound like the 7700 will both be more expensive than a 3060 is currently and increase power consumption. In trying to evaluate future functionality, it seems like we are clearly seeing smaller upgrades, but different kinds of changes. For example, compared to my last two GPU purchases (Radeon 7750 and the 1060) I don’t recall cache being a selling point, VRAM growth is slowing, ray tracing is a thing, AI and such is aiding in visuals, bus speeds seem less important as a result of cache/architecture changes, etc.

Pros/Cons

3060 - most VRAM and DLSS compatible BUT slowest in non-RT games and smallest cache
4060 - likely the fastest overall, DLSS 3.0, lowest power demands, BUT the least VRAM, and who knows what it will actually retail at
6700 - Middle of the pack in VRAM, cache, and speed, and least expensive BUT worst RT, no DLSS, highest power consumption (XFX has one with a 175w TDP). Isn’t there an added benefit with an AMD CPU, too?

Given I usually play “high demand” games a year or two behind, I think the VRAM might be fine - my understanding is there are only a couple of games at 1080p that cap out VRAM, but who knows in another five years (if I am finally playing them in 6-7). Flip side, the 6700 would give more of a VRAM buffer and would likely be within a couple of frames of the 4060 in non-RT games. While I would love to experience RT, I haven’t experienced it yet and hear it might be another one of my upgrade cycles before it really matters. My current list of games I would place over the rest of my backlog includes Miles Morales, Hogwarts Legacy, Bug Fables, Two Point Campus, and maybe Gotham Knights. I do usually get Madden or FIFA every few years for new rosters/teams. I would love to run things at max video settings for my monitor, but for a fun game I would honestly play on low at 720p - not a visuals snob, but I fully appreciate all of the eye candy!

It seems like this is probably the 6700’s to lose as it seems like the “all around choice”, but I upgrade so infrequently that it seems worth asking more educated people. I know enough to understand the lingo, but feel like I still lack the knowledge to be confident in my decision. It also sounds like the 7600 and 7700 will cost more, so there’s no point in waiting except for a deal on a 6700 (or a 3060 closer to July).
 
In their 4060 Ti review, Tom's Hardware has the 4060 Ti around the performance of the 3070 FE. From a pure resource perspective the 4060 only has 70% of the CUDA cores of the 4060 Ti. My guess is that it will only have about 70% of the performance, which means it will only have the performance of the RTX 3060 - probably a little more than that, but not much. Since you are looking at the RX 6700, also look at the RX 6700 XT. It is only about 5% slower than the 4060 Ti and you can get one starting at $320. I have the ASRock Challenger D OC and it is great at $340.
 
While I appreciate this idea and it is within budget, my understanding is that it is not particularly energy efficient, tends to run hot, etc. I guess I'm scarred from my previous build - I had an AMD Phenom 9950 Black Edition with something like an Nvidia 9600 GT. It turned any room into the hottest room in the house just browsing the internet, never mind gaming. If memory serves, the CPU was like 140 watts and the GPU was only 95 watts or something? I realize a direct comparison of wattage isn't valid from 15 years ago due to improvements in efficiency, but hearing something can 'run hot' is not a positive memory for me. Thankfully, my computer is now in an unfinished basement, so it isn't like it's in an upstairs room that already gets hot where this would make it unbearable.
 
In terms of energy efficiency the Radeon 6000 series is really good compared to the RTX 3000. https://www.tomshardware.com/reviews/amd-radeon-rx-6700-xt-review/4

From my own experience with the ASRock Challenger I don't notice it being noisy or making the room hotter. I went from a Power Color R9 285 > ASRock 6700XT and the old R9 285 was much louder.
 
I recently upgraded to a Ryzen 5 7600 while still rocking a 1060 6GB. My monitor caps at 1080p/60 and I don’t plan on changing that. I do have an old 350 watt Seasonic PSU I’d love to keep using, but I realize that is unlikely. I try to be aware of power draw and would love to think whatever I buy will still meet minimum specs for as long as my 1060 has (2016, so seven-ish years). Technically I don’t own anything that won’t run, but I happen to have an opportunity to upgrade now.
Yeah, that Seasonic 350W won't be enough. A 350W PSU is of limited use, even when it's a fantastic brand like Seasonic. Keep it as a backup with your GTX 1060 just in case you ever need them. Then if something goes wrong with your PSU or video card, you won't be without a PC and they can be invaluable diagnostic tools.
I had asked about this a month or so ago and it sounded like the 6700 and 6700 XT were better overall buys than the 3060. I now feel more informed so I have different questions.
Being informed is the best thing when spending a decent sum of money like this. (y)
We also have confirmed 4060 specs and it sounds like pretty solid 7600 details. In trying to stay under 200 watts for a GPU and under $350 US, it seems like the above are the best options.
I would agree. At that price point, they are definitely the best options.
I know the 4060 is due in the summer, but it has half the VRAM of the 3060 and less than the 6700. I do like the idea of the much larger cache pool offered by the 6700 and 4060, but it seems like VRAM might bring some headaches with all of the “extra” video settings, even at 1080p.
Where did you get that idea? Having more VRAM never causes headaches, it only stops them from happening. I should know, my card has 16GB of VRAM. All that having more VRAM means is that you can do more with the card and for far longer. Where did you hear/read that having more VRAM causes headaches?
The leaks for the 7600 make it sound like the 7700 will both be more expensive than a 3060 is currently and increase power consumption.
There's no reason to get a 7000-series card for your purposes.
In trying to evaluate future functionality, it seems like we are clearly seeing smaller upgrades, but different kinds of changes. For example, compared to my last two GPU purchases (Radeon 7750 and the 1060) I don’t recall cache being a selling point, VRAM growth is slowing, ray tracing is a thing, AI and such is aiding in visuals, bus speeds seem less important as a result of cache/architecture changes, etc.

Pros/Cons

3060 - most VRAM and DLSS compatible BUT slowest in non-RT games and smallest cache
You need to be more specific, because just saying "slowest in non-RT games" doesn't really say anything. The fact is that, at this pricing level, RT shouldn't even be a thought. Certainly the RTX 3060 will take less of a performance hit with RT turned on, but it will still be so bad that you'll turn it off anyway, which makes it pointless. For RT to be relevant, you need at least an RTX 3070 (which can be hobbled by its 8GB frame buffer).

At your price point, the performance hit caused by RT makes it almost unusable (unless you want to have to use DLSS from day one). Meanwhile, the RX 6700 is still a whopping 19% faster than the RTX 3060. At your price point, a card's performance matters far more than its RT resilience.

Sure, the RTX 3060 has an extra 2GB of VRAM but that's only because nVidia made the card specifically for miners, not gamers. The VRAM in the RTX 3060 will out-live its relatively weak GPU while the 10GB in the RX 6700 is much better matched with the potency of its GPU.

You want balance because if the VRAM and the GPU become obsolete at about the same time, it means that you didn't waste any money on either. It's like trying to match a CPU and GPU in a gaming rig. The more matched that they are, the better spent your money was because you'll get maximum performance from both. Of course, a perfect match is impossible, but we still try to get as close as possible.
4060 - likely the fastest overall,
I really doubt that. You see, nVidia claims that the RTX 4060 will be 20% faster than the RTX 3060. Meanwhile, the RX 6700 has been proven to be 19% faster than the RTX 3060. No manufacturer's claims are ever as good as actual reality so that 20% is probably only in certain games. The RTX 4060 might actually only be 10-15% faster than the RTX 3060 across all games which would actually make it slower than the RX 6700.

Even if nVidia's being 100% honest (I couldn't type that with a straight face...lol), then we're talking about it being only 1% faster than the RX 6700 and anything within 3% is considered a tie.

Therefore, the best-case scenario for the RTX 4060 is that it's tied in performance with the RX 6700. No matter how you look at it, the performance of the RTX 4060 will not be an advantage over the RX 6700.
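To make that arithmetic explicit, here is a quick back-of-the-envelope sketch in Python. The 19% and 20% figures are the ones quoted above, and the "within 3% is a tie" threshold is the rule of thumb just mentioned; nothing else is assumed:

```python
# Rough sanity check of the relative performance claims above.
# Baseline: RTX 3060 performance = 1.00 (arbitrary units).
rtx_3060 = 1.00
rx_6700 = rtx_3060 * 1.19    # ~19% faster than the 3060 (measured)
rtx_4060 = rtx_3060 * 1.20   # nVidia's claimed ~20% uplift (best case)

gap_pct = (rtx_4060 - rx_6700) / rx_6700 * 100
print(f"RTX 4060 vs RX 6700: {gap_pct:+.1f}%")  # roughly +0.8%, inside a 3% "tie" band
```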
DLSS 3.0,
From what I hear, DLSS 3.0 is a bit of a mixed bag. Apparently the frame generation works really well but only if you already have around 60FPS native (which I think defeats the purpose of it).
lowest power demands,
This is true and is definitely a plus but the question is, just how much of a plus? I mean, sure, the RTX 4060 has a TDP that's 60W lower than that of the RX 6700 but we won't know what that actually means until it's released because TDP is actually a terrible way of judging power consumption. We only use it because, as bad as it is, it's all we really have until reviews come out. It's a little more accurate than trying to judge gaming performance by counting TeraFlOps but not by much.

Consider that the TDP of the Intel i9-13900K is only 125W but Tom's Hardware managed to get it to consume 199W without overclocking it. Those two numbers aren't even remotely close.
BUT the least VRAM, and who knows what it will actually retail at
The RTX 4060's MSRP will be $400 for the 8GB model and, a month later, $500 for the 16GB model. This makes it a bad matchup overall against the RX 6700.

The RX 6700 XT beats the RTX 4060 8GB because:
  • The performance will be the same or better
  • The RX 6700 XT has 25% more VRAM
  • The RX 6700 XT is $120 less expensive
The only thing that the RTX 4060 has in its favour is lower power draw, and just how long do you think it would take for that to save you $120 on your hydro bill and break even?

I would guess about five years.
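For what it's worth, here is that break-even arithmetic written out. The $120 gap and the 60 W TDP difference are the figures above; the gaming hours and electricity rate are made-up example numbers, so plug in your own:

```python
# Back-of-the-envelope break-even time for a $120 price gap vs. a 60 W power gap.
price_gap_usd  = 120     # price difference discussed above
power_gap_w    = 60      # TDP difference; real-world draw will differ
hours_per_week = 20      # assumed gaming hours per week -- plug in your own
rate_per_kwh   = 0.15    # assumed electricity price in $/kWh -- plug in your own

kwh_per_year     = power_gap_w / 1000 * hours_per_week * 52
savings_per_year = kwh_per_year * rate_per_kwh
print(f"~${savings_per_year:.0f}/year saved, "
      f"~{price_gap_usd / savings_per_year:.0f} years to break even")
```

With those example numbers the savings come to roughly $9 a year, so the payback stretches well past five years; heavier daily gaming or pricier electricity shortens it considerably.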
6700 - Middle of the pack in VRAM
For 1080p60Hz, 10GB is all that you'll need for the life of these cards. Hell, for the most part, 8GB is enough for 1080p60Hz in all but one or two games but who wants to pay an extra $120 for 2GB less VRAM?
6700 - Middle of the pack in cache,
Nope. It has far more cache than either the RTX 3060 or RTX 4060. One of the big features of RDNA2 was the use of what AMD calls the Infinity Cache.

You know, I'm honestly starting to wonder where you're getting your "information" from because so much of it is completely wrong.
RTX 3060:
L1 Cache: 128 KB (per SM)
L2 Cache: 3 MB

RTX 4060:
L1 Cache: 128 KB (per SM)
L2 Cache: 24 MB

RX 6700:
L0 Cache: 32 KB per WGP
L1 Cache: 128 KB per Array
L2 Cache: 3 MB
L3 Cache: 80 MB
6700 - Middle of the pack in speed,
No, it will be at least tied with the RTX 4060 in speed.
and least expensive BUT worst RT,
Which only matters once you hit the performance level of the RTX 3070. We're still below that.
no DLSS,
No, but by the time you need it, FSR will have progressed to the point that they'd be indistinguishable. This is especially true with the two faster cards.
highest power consumption (XFX has one with a 175w TDP). Isn’t there an added benefit with an AMD CPU, too?
There's Smart Access Memory, but it doesn't really do much.
Given I usually play “high demand” games a year or two behind, I think the VRAM might be fine - my understanding is there are only a couple of games at 1080p that cap out VRAM, but who knows in another five years (if I am finally playing them in 6-7). Flip side, the 6700 would give more of a VRAM buffer and would likely be within a couple of frames of the 4060 in non-RT games. While I would love to experience RT, I haven’t experienced it yet and hear it might be another one of my upgrade cycles before it really matters.
I have to agree with you there. My RX 6800 XT is very capable of RT (especially at 1080p) but I find that 1440p with RT off looks better than 1080p with RT on. When hardware has advanced enough that it can support really good implementations of RT, then yeah, that's when I'll be most interested in it. The way I see it now, people are paying extra for a tantalising little peek at what is to come.
My current list of games I would place over the rest of my backlog includes Miles Morales, Hogwarts Legacy, Bug Fables, Two Point Campus, and maybe Gotham Knights. I do usually get Madden or FIFA every few years for new rosters/teams. I would love to run things at max video settings for my monitor, but for a fun game I would honestly play on low at 720p - not a visuals snob, but I fully appreciate all of the eye candy!
Well, from what you're telling us, I think that you'd even be happy with an RX 6600. I think that you'd be absolutely ecstatic with an RX 6700 and it would last you for at least 5 years. I would actually be willing to bet more on 7-8 years if the card doesn't fail and you still game at the same settings that you do now.
It seems like this is probably the 6700’s to lose as it seems like the “all around choice”, but I upgrade so infrequently that it seems worth asking more educated people. I know enough to understand the lingo, but feel like I still lack the knowledge to be confident in my decision. It also sounds like the 7600 and 7700 will cost more, so there’s no point in waiting except for a deal on a 6700 (or a 3060 closer to July).
I can tell you that if you get the RX 6700, many years from now, you'll look back on this conversation and think to yourself "Man, they really DID know what they were talking about!".
 

Karadjgne

Titan
Ambassador
The 6700 XT 12GB on average is about 8-10% slower than the 4060 Ti 8GB. You can get a 6700 XT for about $300.
10% might sound like a lot of performance, but it's 10 fps in games around that 100 fps mark - still way above your monitor's refresh rate, so in effect it's the same thing.

And still better than the 4060 for roughly the same price, maybe less.

The only really clear win for Nvidia with the 40-series cards is efficiency, but considering you'll need to replace the PSU anyway, the difference is moot.
 

j_m_h_jm

Honorable
Thanks. Sorry, I have been comparing cards so much I forgot the cache on the 6700. I was referring to the 4060 having less VRAM than the 6700.

Thank you everyone. Much like when I got my 1060, I was fully set on a 1050, but had a little extra budget and saw a deal I couldn’t pass up. Pricing is so weird right now - if I filter PPP to the 6700 and 6700 XT, the cheapest is a 6700, then a 6700 XT, and then several more 6700s. With reviews showing the 6750 XT being very comparable to a 4060 Ti, I suppose why not? The only reason for ignoring the XT was power consumption, but I suppose there are plenty of ways to mitigate that as well, and it won’t be maxing out capacity in a lot of the current games I have to play.
 
TDP on the 6700XT is 220W. During their gaming loop testing it topped out at 215W. That isn't bad at all for a card that gives good 1440p framerates.
 
I have to apologise to you because it turns out that some of the things I said were inaccurate.

When I typed "RTX 4060 MSRP" in Google, what popped up was the price of the RTX 4060 Ti. The RTX 4060's MSRP is $300, not $400. That's still not good but it's not nearly as bad as it appeared.

Also, this:
"The RX 6700 XT beats the RTX 4060 8GB because:
  • The performance will be the same or better
  • The RX 6700 XT has 25% more VRAM
  • The RX 6700 XT is $120 less expensive"
I hadn't meant to have the XT there. You'll see that I say the RX 6700 XT has 25% more VRAM - well, no, the XT has 50% more VRAM. I do this when I'm at work and stuck at my desk with nothing to do, so on occasion I make a mistake that I don't notice. I was describing the RX 6700 here and, again, I was working with the wrong MSRP. The RX 6700 isn't $120 less expensive, it's only $20 less expensive.

I say this because I really do want you to have the best information possible to help you make the right choice. Sometimes, that means eating a bit of crow but that's not a big deal to me, telling you the truth is. However, there is another truth that I came across concerning the RTX 4060 Ti last night when I was watching Steve Burke on Gamers Nexus. Gamers Nexus is the best hardware testing channel on YouTube because they have no fear of anyone.

Anyway, Steve ripped right into the RTX 4060 Ti, showing how it had almost no performance advantage over its predecessor, the RTX 3060 Ti:
View: https://www.youtube.com/watch?v=Y2b0MWGwK_U
Steve calls the RTX 4060 Ti "the absolute worst GPU launch that we've seen in years" and "if not for the AV1 encoder, we'd be calling it a refresh of the RTX 3060 Ti". If the RTX 4060 Ti is that bad, there's a very good chance that the RTX 4060 will be the same or worse.

Let's be honest here, it's not like nVidia has released anything overly-impressive since the RTX 4090. Everything that they've released since has been, well, not necessarily bad, but horribly over-priced and not competitive with RDNA2. Steve points out that buying an 8GB card right now is a bad idea and then also points out that the 16GB version of the RTX 4060 Ti is at the same price point as the Radeon RX 6800 XT, a card that just crushes it.

To be fair, I haven't been the least bit impressed with RDNA3 either, but, as usual, nVidia has been far more egregious with their conduct than AMD because they know that the sheep will always buy nVidia. I can't say that they're wrong.
 

Karadjgne

Titan
Ambassador
View: https://youtu.be/SIugY8lDJhY

Where there's one review, there's always another. This one shows that, if you really care about benchmarks, the 4060 Ti is well ahead of the competition, including the 3060 Ti. But DB also adds the DLSS 3 numbers for his games versus the AMD equivalents, and it still comes out ahead, mostly - generally beating the 2080 Ti, 6700 XT, etc.

But that'd still put the 4060 in almost direct competition with a 2070 Super, only getting better scores in games that take good advantage of DLSS 3 over DLSS.
 
The power consumption looks to be about 25% higher on the RX 7600 than on the RX 6600... that's a bummer.
At high clocks it draws more; same for the 6000 series.
I'm on an RX 6800 and if I overclock it, it gets 10% more fps at 30% more power draw; on the other hand, if I undervolt it, it doesn't drop any fps and the power draw is much lower.
Anyway, I play at 1440p/60, epic/ultra and whatnot, and the GPU draws around 100 watts. A few next-gen games draw around 120-140 watts, but those are just a few games (Jedi Survivor, for example); tons of last year's games are under 100 watts.
 

j_m_h_jm

Honorable
Do you have a good guide to undervolting for first-timers? The 6700 XT I ordered will be in my computer tonight or tomorrow depending on my available time 😂 I don’t know how much time I will have tonight to install it and a new PSU, let alone for gaming.
 
You can use MSI Afterburner; it's fairly simple there. You just reduce the voltage and run a benchmark like 3DMark to see if it's stable; if you start to get a black screen/driver reset, increase it a little.

Mine is 920 mV at 2.4 GHz.
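If it helps to see that procedure written out, here is a toy sketch of the loop. It is not an Afterburner script (there is no public API to automate this); you set each voltage by hand, and the numbers below are made-up examples, not recommendations:

```python
# Sketch of the manual trial-and-error undervolt described above: step the voltage
# down, run a stress test at each step, and back off after the first failure.
start_mv, floor_mv, step_mv = 1025, 850, 15   # example values only

stable_mv = start_mv
while stable_mv - step_mv >= floor_mv:
    candidate = stable_mv - step_mv
    answer = input(f"Set {candidate} mV in Afterburner, run 3DMark/FurMark. Still stable? [y/n] ")
    if answer.strip().lower() != "y":
        break                      # black screen / driver reset: stop here
    stable_mv = candidate          # passed, so try the next step down

print(f"Last stable voltage: {stable_mv} mV (keep a small margin for long sessions)")
```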
 
If I were you, I would look into whether or not undervolting would void the warranty (because it probably does). In that case, I would just run the card at stock settings for the warranty period and then tinker with it once the warranty is up.

Undervolting a card sometimes involves flashing the card's firmware and that is not something for the faint of heart. I've known how to undervolt and overclock CPUs for years but I have only once had the courage to mess with a video card's firmware (RX 5700 XT) and that's only because I already had the RX 6800 XT to replace it.

It worked... well, sort of. It did reduce power draw but sometimes the GPU became unstable. That kinda freaked me out and I flashed the BIOS back to stock settings. It wasn't worth it on a triple-fan card because they run cool and quiet already. I don't know how many fans your card has.

If you do it with Afterburner, you might get away with it.
 
What does that mean for power consumption? Is it much less, or a rounding error?
Just ran FurMark:
Stock settings: 1.025 V at 2359 MHz boost.
Due to the 215 watt power limit, the GPU reached only 2100 MHz at 0.9 V.
FurMark fps: 195

Undervolted to 0.920 V at 2359 MHz boost.
Due to the 215 watt power limit, the GPU reached only 2200 MHz at 0.89 V.
FurMark fps: 210

Limited the GPU to 2100 MHz to hit the same frequency under FurMark that was previously TDP-limited, plus the undervolt:
wattage dropped to 185 watts
FurMark fps: 195
power vs fps = 0.95 (watts per fps)

The power limit can be raised to 245 watts on this card... but there's not much benefit in doing so: 215 fps at 245 watts, with the clock hovering slightly above 2200 MHz.
power vs fps = 1.14

For FurMark that's about 20% less efficient when overclocking.

3DMark can go to 2.4 GHz on this card, and there's a 30% power difference.

Edit, almost forgot - FurMark at 1440p frame-limited to 60 fps:
undervolted: 65 watts
stock: 75 watts
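Spelling those ratios out, here is the same arithmetic as a tiny script using the FurMark figures reported above (FurMark only - games will land elsewhere):

```python
# Watts per FurMark fps for the three configurations reported above.
runs = {
    "stock (215 W power limit)": (215, 195),   # (watts, fps)
    "undervolt + 2100 MHz cap":  (185, 195),
    "245 W power limit":         (245, 215),
}
for name, (watts, fps) in runs.items():
    print(f"{name}: {watts / fps:.2f} W per fps")
```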
 

Karadjgne

Titan
Ambassador
Gotta read between the lines. Take @kerberos_20's results, for instance. The 1st is stock; the 2nd is an undervolt for an increase in fps at the same heat output, 215 W. Stock vs the 3rd is a major reduction in heat output, 215 W vs 185 W, but exactly the same fps. Stock vs the 4th is pure added heat, 215 W vs 245 W, but also an fps gain.

So depending on your needs or requirements, you can adjust your card to fit. Maybe you have a case with lousy airflow, so the GPU runs super hot constantly - you'd benefit most from the 3rd: a drop in power use, which lowers heat output and raises boosts. Or maybe you have a super chill case with no thermal worries and you want the highest fps - the 4th.

Fps is not the only metric used for performance, just the most common.