AMD Ryzen 9 3900X and Ryzen 7 3700X Review: Zen 2 and 7nm Unleashed

Rumors, rumors, rumors.

Rumors said the Ryzen 5 3600 would have 8 cores, but it doesn't.

The rumors might have some validity. Maybe it's a BIOS issue or something that AMD just needed some extra time to iron out, but so far I've seen multiple reviewers noting that their review sample 3900X and 3700X never actually reach the max boost in any realistic setup.
 
That's because you don't have a 2080 Ti. If you did, the GPU would be demanding 400 fps, and then your CPU would start to bottleneck (not only is a CPU bottleneck unrealistic, but unless you paid well over $1000 for a GPU, it isn't even that easy to do).

But because you have a reasonable CPU/GPU combo, you may never even be able to see a CPU bottleneck even if you tried your hardest. Most people would never see a CPU bottleneck in any gaming situation - that is why I have an issue with reviewers who don't give us at least a few real world gaming numbers alongside their "no real scenario" bottleneck ones.
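To make that concrete, here is a minimal sketch of the argument (the fps numbers are made-up placeholders, not benchmark results): the delivered frame rate is capped by whichever component runs out of headroom first, so the CPU only shows up as the limiter when the GPU ceiling has been pushed above it.

```python
# Rough model: delivered fps is capped by whichever component runs out of
# headroom first. The numbers below are hypothetical placeholders.
def delivered_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    """Return the effective frame rate and which component limits it."""
    fps = min(cpu_fps_ceiling, gpu_fps_ceiling)
    limiter = "CPU" if cpu_fps_ceiling < gpu_fps_ceiling else "GPU"
    return fps, limiter

# A 2080 Ti at 1080p/low can push far more frames than the CPU can feed it,
# so the CPU becomes the limiter -- the "review" scenario.
print(delivered_fps(cpu_fps_ceiling=220, gpu_fps_ceiling=400))  # (220, 'CPU')

# A midrange card at 1440p/ultra runs out of headroom long before the CPU does,
# so every reasonable CPU posts the same number -- the "real world" scenario.
print(delivered_fps(cpu_fps_ceiling=220, gpu_fps_ceiling=90))   # (90, 'GPU')
```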

Let us see both, so a certain type of overly susceptible person doesn't go around quoting "CPU" gaming numbers as though they actually get those numbers on their PC at home, when they are describing nothing other than a scenario that doesn't really exist in the real world. It's become an epidemic.

I saw a review yesterday that ran 4K resolution at ultra quality with the 2080 Ti -- every CPU got the exact same FPS, save for the R5 1600, which was down 2 FPS from all the others, including a 9900K at a 5.2 GHz OC. All the same. Real world. Intel would be so proud of me spreading "real world gaming" words of wisdom. :)


How many times are you going to post the same garbage about a 2080 Ti and 4K in this thread? Just because you keep spamming it doesn't make it any more true. 4K gaming is not a real world thing right now. You are severely underestimating the power of marketing and stupidity. Look at Amazon's best sellers and the Steam survey. Almost no one is buying or using 4K monitors. Read any 4K monitor review comment section, and all you will see is people bitching about the 60Hz refresh rate and how they could never use such a piece of junk since it would give them headaches and ruin their fighter pilot vision. Lower resolution "gaming" (the marketing works) monitors with insane refresh rates are all the rage now because so many people think they have Ted Williams level vision and REQUIRE no less than a 144Hz refresh rate to consider buying one. I would be willing to bet there are more people using their 2080 Tis on 1080p high refresh gaming monitors trying to max out their benchmark framerates than using them on 4K screens. Again, looking at the Steam survey, there are far more people who own a 1080 Ti, RTX 2080, or 2080 Ti than the 1.6% of users running a 4K monitor.
 
The rumors might have some validity. Maybe it's a BIOS issue or something that AMD just needed some extra time to iron out, but so far I've seen multiple reviewers noting that their review sample 3900X and 3700X never actually reach the max boost in any realistic setup.

Either the BIOS or a limitation of the process node. The 14nm process AMD used before was an LPP/LPE design for lower power devices. It could be that this 7nm node is similar and not designed for very high clock rates.

This does coincide with the other rumors of AMD having problems getting Navi to clock as high as they wanted on this process tech. I wouldn't be surprised if TSMC's 7nm, combined with AMD's uArch design, is limiting clock speeds.
 
Too bad there wasn't a 3800X included; it would be interesting to see if the price tag for the 3800X over the 3700X is indeed worth it.
Second this. No sites whatsoever are reporting 3800X benchmarks, but it seems to be in stock and available to buy (at least it is here in New Zealand). When can we expect a review of this chip, which seems essentially identical to the 3700X save for slightly higher clocks and a significantly higher price?
 
How many times are you going to post the same garbage about a 2080 Ti and 4K in this thread? Just because you keep spamming it doesn't make it any more true. 4K gaming is not a real world thing right now. You are severely underestimating the power of marketing and stupidity. Look at Amazon's best sellers and the Steam survey. Almost no one is buying or using 4K monitors. Read any 4K monitor review comment section, and all you will see is people bitching about the 60Hz refresh rate and how they could never use such a piece of junk since it would give them headaches and ruin their fighter pilot vision. Lower resolution "gaming" (the marketing works) monitors with insane refresh rates are all the rage now because so many people think they have Ted Williams level vision and REQUIRE no less than a 144Hz refresh rate to consider buying one. I would be willing to bet there are more people using their 2080 Tis on 1080p high refresh gaming monitors trying to max out their benchmark framerates than using them on 4K screens. Again, looking at the Steam survey, there are far more people who own a 1080 Ti, RTX 2080, or 2080 Ti than the 1.6% of users running a 4K monitor.

Bottleneck your CPU for gaming then if that makes sense to you ... I couldn't care less.

Edit: I'll be polite and clarify for you. This has nothing to do with 4K - you are reading things into my post that are not at all the point I am making. It has only to do with whether or not the CPU is bottlenecked. 99.999% of real life gaming use cases are not reflected anywhere in the graphs and numbers we see when reviewers post these results.

If you play at 720p on a 1050 Ti and a modern i5, you still won't be bottlenecking the CPU even at 720p - so that would represent a real world scenario, and it's not 4K.

4K or 1440p, and maybe even a few 1080p "ultra" quality settings, are suitable for nice gaming with good frame rates on a 2070/2070 Super/2080/2080 Ti - that would be typical and wouldn't bottleneck the CPU.

I cringe at the thought of someone buying a 2080 Ti for $1200 and running it at 1080p or 720p with all quality settings on "low". What a waste of a good and expensive video card.
 
NO, I think the engineers have been working for 4+ years trying to get 10nm straightened out. And you could very well see a 15% increase in IPC but also see a reduced clock speed. AMD already has the edge in IPC, btw. As for my lack of understanding, which is laughable coming from you, the point is we are hitting a clock speed wall, and as such, an architecture that relies on clock speed alone to ramp performance is at a dead end. There is a reason why Intel has a negligible lead in some game benchmarks: they utilize a single core 5.1 GHz boost clock. Single core. Let me repeat that again... single core. What does that tell you when a game performs better on a processor that has one core boosted significantly above the rest?

And in God's name, why would AMD choose to approach an upgrade to their architecture in this manner? Who is going to invest $1200 in a GPU only to play at 1080p? Why would anyone in their right mind spend an extra $300 on the water cooling setup that an Intel system needs to reach those levels on settings they will never play, when they could spend that money on better, faster storage or a better GPU?

You don't need a $300 water cooler; a beefy air cooler would do just fine...
 
Bottleneck your CPU for gaming then if that makes sense to you ... I couldn't care less.
Guessing people are using them for 1440p more than 1080 though.

Don't get the fuss with 144Hz, tbf. It gives you a slight advantage, sure, but if you're not making money off it and are just doing it for fun, I'd rather have a good IPS that can do more than just game - like look great when watching movies, for example.
 
Guessing people are using them for 1440p more than 1080 though.

Don't get the fuss with 144Hz, tbf. It gives you a slight advantage, sure, but if you're not making money off it and are just doing it for fun, I'd rather have a good IPS that can do more than just game - like look great when watching movies, for example.

The 2080 Ti, according to the rather "not properly representative" Steam survey, has less than half of one percent of share, with the 2080 just above half of one percent.

It's not about 4K at all - that other guy was reading into things I didn't say. It's about whether your CPU is bottlenecked or not. The fact that it's almost impossible to CPU bottleneck a midrange card at any resolution (as another poster above confirmed), and that 75%+ of all cards in the "Steam survey" (likely much more in real life) are midrange cards, means that no one is playing with a bottlenecked CPU. Also, 4K represents 1.5% of displays - 3X the number of 2080 Tis ... lol.
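Just as a rough sanity check on those shares (using the approximate figures quoted above, not freshly pulled survey data):

```python
# Rough ratio check using the approximate shares quoted above
# (not freshly pulled Steam survey data).
share_2080ti = 0.005   # "less than half of one percent"
share_4k     = 0.015   # "4K represents 1.5% of displays"

print(f"4K displays vs 2080 Ti owners: about {share_4k / share_2080ti:.0f}x")  # about 3x
```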

Some people just plain don't get this though, and it's precisely the problem I am talking about - KingGremlin proved my point that this perception problem is a real thing. People now somehow believe that since reviewers induce CPU bottlenecks, that's how it is when everyone plays games. The Steam survey, as flawed as the data is, shows the exact opposite. No one is playing with a bottlenecked CPU. Why would you waste GPU resources to do that?
 
The 2080 Ti, according to the rather "not properly representative" Steam survey, has less than half of one percent of share, with the 2080 just above half of one percent.

It's not about 4K at all - that other guy was reading into things I didn't say. It's about whether your CPU is bottlenecked or not. The fact that it's almost impossible to CPU bottleneck a midrange card at any resolution (as another poster above confirmed), and that 75%+ of all cards in the "Steam survey" (likely much more in real life) are midrange cards, means that no one is playing with a bottlenecked CPU. Also, 4K represents 1.5% of displays - 3X the number of 2080 Tis ... lol.

Some people just plain don't get this though, and it's precisely the problem I am talking about - KingGremlin proved my point that this perception problem is a real thing. People now somehow believe that since reviewers induce CPU bottlenecks, that's how it is when everyone plays games. The Steam survey, as flawed as the data is, shows the exact opposite. No one is playing with a bottlenecked CPU. Why would you waste GPU resources to do that?

Do get it. I didn't bother waiting for the Zen 3000 series because I could buy a 2600 for £120 and slot it into my system that's been down for the better part of a year (been away), with a Vega 56 that cost over £100 less than a 2060, and I can play everything I own at 1080p60 maxed out. The games I play a lot will probably run at 1440p well over 60 when I upgrade my monitor later in the year. And I'll probably need a GPU upgrade long before the 2600 becomes an issue. I'll probably need a board swap too at that point.

Unless you're gaming on a massive monitor or TV, I don't get the point of 4K over 1440p, tbh. The massive performance hit just isn't worth it.
 
Do get it. I didn't bother waiting for the Zen 3000 series because I could buy a 2600 for £120 and slot it into my system that's been down for the better part of a year (been away), with a Vega 56 that cost over £100 less than a 2060, and I can play everything I own at 1080p60 maxed out. The games I play a lot will probably run at 1440p well over 60 when I upgrade my monitor later in the year. And I'll probably need a GPU upgrade long before the 2600 becomes an issue. I'll probably need a board swap too at that point.

Unless you're gaming on a massive monitor or TV, I don't get the point of 4K over 1440p, tbh. The massive performance hit just isn't worth it.
I kind of regret it myself - I have ultra-wide 4K (3840x1600, so about 75% of 4K), and I imagine my GTX 1080 would struggle with it in a lot of modern games if maxed out.

That said, I believed I needed the width for work purposes, because at the office, I have dual 1920x1080 monitors, so I was trying to match it horizontally with my single monitor at home.

I probably should've gone with 3440x1440, though - it would still give me an ultrawide aspect ratio while only losing about 11% of the horizontal resolution, which probably would've been comfortable enough for my work needs.
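For what it's worth, the rough percentages above fall straight out of the pixel math (assuming 3840x2160 as the 4K reference):

```python
# Simple pixel math behind the figures above (3840x2160 taken as the 4K reference).
uw_4k   = 3840 * 1600   # ultrawide "4K" panel
full_4k = 3840 * 2160   # standard 4K
print(f"3840x1600 has {uw_4k / full_4k:.0%} of the pixels of full 4K")        # ~74%

h_loss = (3840 - 3440) / 3840
print(f"3440 wide gives up {h_loss:.0%} of the horizontal resolution")        # ~10%
```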
 
Intel's use of aftermarket coolers makes them look good while totally losing the price advantage. The 9700 ends up being the price counterpart to the 3900X after you tack on the $100 cooler. The 9900 will have to compete against the 16-core 3950X.
 
Intel's use of aftermarket coolers makes them look good while totally losing the price advantage. The 9700 ends up being the price counterpart to the 3900X after you tack on the $100 cooler. The 9900 will have to compete against the 16-core 3950X.

It's not Intel's use. They just determined that no one who buys a K series CPU will use the stock cooler. Even AMD's stock cooler is barely enough to manually overclock the 3000 series; anyone overclocking will buy an aftermarket cooler.
 
It's not Intel's use. They just determined that no one who buys a K series CPU will use the stock cooler. Even AMD's stock cooler is barely enough to manually overclock the 3000 series; anyone overclocking will buy an aftermarket cooler.
But who will be overclocking these CPUs? The gains are so small and (in my mind) not worth the increased power draw and heat output. For stock speeds the Wraith coolers are fine.
 
Rumor has it AMD didn't sample the 3800X because it can't realistically hit the advertised boost, and it gets essentially identical performance to the 3700X - both at stock and at max OC.
How can they hit 4.6GHz on the 12-core 3900X but not 4.5GHz on the 8-core 3800X, unless they are only using one chiplet for the 3800X? But even then, that would be less heat than the 3900X.
 
How can they hit 4.6GHz on the 12-core 3900X but not 4.5GHz on the 8-core 3800X, unless they are only using one chiplet for the 3800X? But even then, that would be less heat than the 3900X.
The thing is they aren't hitting 4.6 GHz on the 3900X. But even if they were, that's only on lightly threaded loads, so whether the CPU has 8 or 12 cores doesn't really matter in that situation. And also, as said above, binning.

Also, yes, the 3800X only has one compute chiplet.
 
But who will be overclocking these CPUs? The gains are so small and (in my mind) not worth the increased power draw and heat output. For stock speeds the Wraith coolers are fine.

I can't say, but a lot of people do overclock them. I think it's pointless for Ryzen 3000 especially, since it kills the boost clocks, which can push single threaded applications higher than a manual overclock.

But still, anyone who decides to overclock would not use a stock heatsink.
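A minimal sketch of that trade-off, using hypothetical clock figures rather than measured values for any particular chip: a fixed all-core overclock can sit below the stock single-core boost, so lightly threaded work loses clock speed even while heavily threaded work gains a little.

```python
# Hypothetical clocks to illustrate why a fixed all-core OC can hurt
# lightly threaded performance on Ryzen 3000 -- not measured values.
stock_single_core_boost = 4.4   # GHz, opportunistic boost on one or two cores
stock_all_core          = 4.0   # GHz, typical sustained all-core clock
manual_all_core_oc      = 4.2   # GHz, fixed clock across every core

print(f"Single-threaded delta: {manual_all_core_oc - stock_single_core_boost:+.1f} GHz")  # -0.2 (slower)
print(f"All-core delta:        {manual_all_core_oc - stock_all_core:+.1f} GHz")           # +0.2 (faster)
```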
 
The 2080 Ti, according to the rather "not properly representative" Steam survey, has less than half of one percent of share, with the 2080 just above half of one percent.

It's not about 4K at all - that other guy was reading into things I didn't say. It's about whether your CPU is bottlenecked or not. The fact that it's almost impossible to CPU bottleneck a midrange card at any resolution (as another poster above confirmed), and that 75%+ of all cards in the "Steam survey" (likely much more in real life) are midrange cards, means that no one is playing with a bottlenecked CPU. Also, 4K represents 1.5% of displays - 3X the number of 2080 Tis ... lol.

Some people just plain don't get this though, and it's precisely the problem I am talking about - KingGremlin proved my point that this perception problem is a real thing. People now somehow believe that since reviewers induce CPU bottlenecks, that's how it is when everyone plays games. The Steam survey, as flawed as the data is, shows the exact opposite. No one is playing with a bottlenecked CPU. Why would you waste GPU resources to do that?

Techspot's review of the Ryzen 3600 (non-X) shows even this CPU is a solid performer at 1440p. I am blown away by how much value this $200 CPU has, and a case could be made that the vast majority should not spend more and would be more than fine with this mighty 6-core CPU.

https://www.techspot.com/review/1871-amd-ryzen-3600/
 
Techspot's review of the Ryzen 3600 (non-X) shows even this CPU is a solid performer at 1440p. I am blown away by how much value this $200 CPU has, and a case could be made that the vast majority should not spend more and would be more than fine with this mighty 6-core CPU.

https://www.techspot.com/review/1871-amd-ryzen-3600/

Yeah, the 3600 looks to be in an amazing sweet spot. No need for an i5 under any circumstance now. The 9900K still has its specific use case for a few, but that's about it until/if Intel cuts pricing.

If I didn't need the cores of the 3900X, I'd opt for that 3600, for sure.
 
I would agree, but now the 9900K is no longer relevant for content/production (maybe if you only use Adobe and Quick Sync) -- $500 is a bit of a hard sell on a purely gaming chip for most ... but there is that niche of those who can afford not to care about cost.

Same as those who would spend stupid amounts on anything.

However, the 9900K was never geared towards content creation. Intel has always had a pretty big divide there, pushing the HEDT platform for creators and workstation-class systems, with mainstream, and especially the K series, being more gamer focused.

I'm still confused by AMD's decision on core count. I feel like the 3950X will just eat into HEDT sales, especially since 32 cores and up won't benefit most consumers, even workstation-class ones.