News AMD Unveils 5 Third-Gen Ryzen CPUs, Including 12-Core Flagship

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
One month and ten days until we find out how good Ryzen really is compared to Intel, or whether it was just false marketing from AMD. Either way, we'll have to wait a bit. Same goes for Radeon Navi, especially as AMD hasn't officially announced prices; we can only hope they turn out to be good-value cards no matter how they perform.
 
I see that, but IMO, it should be on a per-application basis. I get that it's cheaper, but how many people REALLY need 12 cores?
To get a 3900X and not use most of, or at least 70% of, its available resources is still wasting money, even if it's cheaper.
That's how I see it, anyway.
 
How hard is it to get the naming right, though?

The Ryzen 5 3600 is perfect: it's a 6-core part, the X is for high TDP, and the non-X is for low TDP, exactly like last gen.

But then there's the Ryzen 7 3700X, which is low TDP, and the 3800X, which is high TDP, both with 8 cores. Why not an X and non-X 3700 pair?

That leaves the 3900 to encompass both 12 and 16 cores, and now the naming has to be broken with a 50! Come on, it's such a basic mistake. 3800 for 12 cores and 3900 for 16 would make everything so much easier.
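To make the complaint concrete, here is the announced lineup next to the renumbering the post is suggesting. The `proposed` entries are the poster's hypothetical scheme, not real AMD SKUs, and a 16-core part is not yet confirmed.

```python
# Announced third-gen Ryzen lineup: model -> (cores, TDP in watts).
announced = {
    "3600":  (6, 65),
    "3600X": (6, 95),
    "3700X": (8, 65),
    "3800X": (8, 105),
    "3900X": (12, 105),
}

# The poster's proposed scheme: one model number per core count,
# with X / non-X marking the TDP tier. (Hypothetical, not real SKUs.)
proposed = {
    "3600 / 3600X": 6,
    "3700 / 3700X": 8,
    "3800 / 3800X": 12,
    "3900 / 3900X": 16,
}
```

Laid out that way, the complaint is that core count and model number don't line up past the 3600, while TDP tier (X vs non-X) is inconsistent at 8 cores.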
 
I see that, but IMO, it should be on a per-application basis. I get that it's cheaper, but how many people REALLY need 12 cores?
To get a 3900X and not use most of, or at least 70% of, its available resources is still wasting money, even if it's cheaper.
That's how I see it, anyway.
That's one way to see it, and it's not wrong. Comparing apples to apples makes sense, like 8 cores to 8 cores, to see who has the best CPU from an objective standpoint.

But when you go out to buy one, budget is an important factor. If you have $400 to spend on a CPU, would you compare a $400 Intel to a $400 AMD? Or a $400 Intel to a $300 AMD? Why wouldn't you step up to a stronger, better CPU?

Even if you don't need it today, 12 cores is more future-proof than 8. Or you could use the difference to get a better GPU, more RAM, a bigger SSD, or better cooling. From that point of view, price is what you use to compare, and that's not wrong either.
 
The 3600 is right about what I was hoping for and expecting. 12 threads should be enough for me, coming from a 4-core i5. $200 for the CPU, another $100 for a decent mobo, and ~$80 or so for a fast 16 GB DDR4 kit, and that's a pretty cheap system upgrade.
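For what it's worth, the tally on that upgrade is easy to check. The figures below are the poster's ballpark prices, not current market quotes:

```python
# Ballpark upgrade budget from the post above (poster's estimates,
# not current market prices).
parts = {
    "CPU": 200,            # the ~$200 six-core chip
    "Motherboard": 100,    # a decent AM4 board
    "16 GB DDR4 kit": 80,  # fast dual-channel RAM
}
total = sum(parts.values())
print(f"Total: ${total}")  # Total: $380
```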

 
Didn't Intel also release its 10nm parts? We'd need to compare 7nm to 10nm parts to be fair(er).

The problem with Intel's 10nm parts is that they are only good for low-power applications. Transistor speed on 10nm wasn't scaling at all; the best you could do was shave a clock cycle or two off circuits with long trace paths. Yields, for a long time, were in the single digits (unsustainable), so they made the dies smaller to get higher yields. That's one reason the first versions didn't have graphics.

Intel tried to solve the problem with cobalt, but there was an issue with voids that led to open circuits, so they couldn't use it on anything but the biggest traces, which were isolated to the bottom of the stack.

10nm is dead for anything performance related.

This is a good lesson in why you NEVER rest on your laurels, and always have something in the hopper for emergencies (even if you're on top).
 
I see that, but IMO, it should be on a per-application basis. I get that it's cheaper, but how many people REALLY need 12 cores?
To get a 3900X and not use most of, or at least 70% of, its available resources is still wasting money, even if it's cheaper.
That's how I see it, anyway.

"How many people really need 8 logical cores?" -Me 6 years ago when I purchased an i7.

Truth is, few games will use more than 8 threads today; a few (like Civ) will. But there are more purposes for extra cores: people like to multitask while gaming nowadays (streaming to Twitch, running a Plex server, downloads, etc.). More cores are also more future-proof. It goes along with the old line that "640K ought to be enough for anyone." With per-core speed increases being small these days, reliance on more heavily threaded workloads will only grow.
 
I see that, but IMO, it should be on a per-application basis. I get that it's cheaper, but how many people REALLY need 12 cores?
To get a 3900X and not use most of, or at least 70% of, its available resources is still wasting money, even if it's cheaper.
That's how I see it, anyway.

You may call those 4 extra cores a waste of money, but I call it free real estate. 😉
 
In the high-end segment, the prices are fine; they're really competitive with Intel. In the mid-to-high range, the prices are the same as Intel's, and I don't care about the number of cores when the clock speed is not on par with Intel's.
Keep in mind that AMD is claiming a 15% IPC increase over existing Ryzen CPUs in addition to the higher clocks. So a 3600X boosting to 4.4 GHz may perform more like how a 2600X would if it could boost to over 5 GHz, which 12nm Ryzen isn't capable of. So between those higher clocks and the increased IPC, we might be looking at around a 20% performance improvement over the 2000-series Ryzens. Of course, those clock rates are likely for single core boost, and we won't know exactly what multi-core boost clocks will be like until reviews are available.
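The clock-plus-IPC estimate above is simple arithmetic. Here, 4.4 GHz and 4.2 GHz are the announced single-core boost clocks of the 3600X and 2600X, and the 15% IPC uplift is AMD's claim, not a measured figure:

```python
ipc_uplift = 1.15    # AMD's claimed IPC gain over 12nm Zen+
boost_3600x = 4.4    # GHz, announced single-core boost
boost_2600x = 4.2    # GHz, previous-gen single-core boost

# Clock a 2600X would need to match a 3600X at 4.4 GHz:
equivalent_clock = boost_3600x * ipc_uplift
print(f"2600X-equivalent clock: {equivalent_clock:.2f} GHz")  # 5.06 GHz

# Combined uplift from higher clocks plus higher IPC:
uplift = (boost_3600x / boost_2600x) * ipc_uplift - 1
print(f"Estimated single-core gain: {uplift:.0%}")  # ~20%
```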

There's also the matter of overclocking though, as it's unknown how much binning may play a role with these new 7nm processors. If the $199 3600 were to manage an all-core overclock near 4.5 GHz, it could potentially outperform not only Intel's similarly priced offerings, but also their locked i7s priced around a hundred dollars more, since it will offer the same number of cores and threads as an i7-8700. Will the 3600 outperform an overclocked i7-8700K? Probably not, but those unlocked i7s are also around double the price. Even an overclocked 9600K might offer slightly better per-core performance, but you lose SMT, and the price is still going to be substantially higher when a capable cooler is figured in.

That leaves the 3900 to encompass both 12 and 16 cores, and now the naming has to be broken with a 50! Come on, it's such a basic mistake. 3800 for 12 cores and 3900 for 16 would make everything so much easier.
But first generation Ryzen already established the 1800X as an 8-core part. Meanwhile, Threadripper has been using numbers like 2920X, 2950X, 2970WX and 2990WX for higher core count processors. So using a similar naming scheme for 12 and 16 core AM4 processors might make sense. And we don't even know for sure that AMD will release a 16 core AM4 processor this generation. They could potentially leave that core count to Threadripper for the time being, and hold off until next year to bring 16 cores to AM4.
 
But first generation Ryzen already established the 1800X as an 8-core part. Meanwhile, Threadripper has been using numbers like 2920X, 2950X, 2970WX and 2990WX for higher core count processors. So using a similar naming scheme for 12 and 16 core AM4 processors might make sense. And we don't even know for sure that AMD will release a 16 core AM4 processor this generation. They could potentially leave that core count to Threadripper for the time being, and hold off until next year to bring 16 cores to AM4.

The tech community has already seen engineering samples and benchmarks of the 16c/32t part. I'm guessing AMD is holding back for now, perhaps to build up inventory of suitable chiplets.
 
And we don't even know for sure that AMD will release a 16 core AM4 processor this generation. They could potentially leave that core count to Threadripper for the time being, and hold off until next year to bring 16 cores to AM4.
The release of a 16C/32T processor is 100% confirmed; otherwise, board partners wouldn't have gone through that much pain to create boards capable of pushing 300W to the CPU.
 
In the high-end segment, the prices are fine; they're really competitive with Intel. In the mid-to-high range, the prices are the same as Intel's, and I don't care about the number of cores when the clock speed is not on par with Intel's.

A 15% increase in IPC means high clocks aren't needed to exceed Intel in single-threaded and multithreaded performance. Did you read the article?

Single-core performance better than the 9700K and 9900K (if we believe the slides, but we have no reason not to; AMD has been dead accurate with its Ryzen series announcements, or has under-promised if anything).

This will translate into equal or better gaming - the improvements over Ryzen 2xxx in CPU-bottlenecked benchmarks would likely put it past Intel's offerings in gaming as well. But who the hell bottlenecks their CPU anyway? lol ...

So yeah, maybe Intel still has 5.0 GHz, along with nuclear-furnace power consumption like AMD's old Bulldozer FX-9590, but it looks like that's the Intel fanboys' last "bragging right" kicked to the curb (if a lower-performing but higher-clocked nuclear furnace is even worth bragging about; I wouldn't). The joke of a CPU we all know as the mighty FX-9590 also ran at 5.0 GHz and was also outperformed by a competitor with much lower clocks, and you didn't hear anyone complaining about Intel's low clock speeds then, did you?

There's much more to a CPU's performance than clock speed, as AMD just showed.
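That last point is really just the identity that per-core performance scales with IPC times clock, not clock alone. A toy comparison makes it visible; the IPC values below are made-up placeholders to illustrate the relationship, not measurements:

```python
# Per-core throughput scales with IPC * clock, not clock alone.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

# Illustrative placeholder IPC values, not measured figures.
high_clock_low_ipc = relative_perf(ipc=1.0, clock_ghz=5.0)  # FX-9590-style
low_clock_high_ipc = relative_perf(ipc=1.8, clock_ghz=3.8)  # rival chip

# The lower-clocked, higher-IPC chip comes out ahead.
print(high_clock_low_ipc < low_clock_high_ipc)  # True
```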
 
I see that, but IMO, it should be on a by application basis. I get that it's cheaper, but how many people REALLY need 12 cores?
To get a 3900x and not use most of, or at least 70% of it's available resources is still wasting money, even if it's cheaper.
That's how I see it, anyways.


Looks like you really had to dig to try to find a negative ... "What's this you say, you want to upgrade me to a free turbocharger to this car I was going to buy? Sorry, no thanks, there is a speed limit so who REALLY needs free turbo upgrades anyway? Sorry I'll just buy that more expensive Intel car that doesn't have a turbo"

Also, I'm a power user who does 3D animation, some video effects/production, and hobby art. Since Ryzen launched, Intel, which makes you pay through the nose for more cores, hasn't even been an option for me (but I'm glad AMD forced their pricing back down to earthly levels).

I'm certainly not the only guy out there in this boat eagerly awaiting the 12-core desktop chip.

If all you need is maybe four cores, then buy a four core processor instead of complaining about the great price on the 12 core one. :)
 
Yeah, I think Linus's video this morning about X570 was pretty nice, showing off some MSI and ASUS boards.
Sounds like MSI only recommends its high-end X570 boards for power-hungry CPUs (like a 300W 16-core), while ASUS says all of their X570 boards are capable of supporting high-end CPUs.

If they do announce a CPU that consumes 300W when overclocked, it sounds like another FX-9590 disaster, where after a couple of years most boards can't supply the required power.
The 3700X seems nice, with impressive performance for only 65W.
 
ASRock has announced that its existing motherboard lineup will support Ryzen 3000, but at this time they do not specifically state which motherboards will or won't support which 3000-series CPUs.

New BIOS Updates To Support AMD Ryzen 3000 Series Processors For ASRock AM4 Series Motherboards

TAIPEI, Taiwan, May 6th, 2019 – ASRock has announced BIOS updates for AMD X470/B450/X370/B350 and A320 series motherboards to support the soon to be released AMD Ryzen 3000 Series processors. The latest BIOS update will be available for download from our website or simply update through ASRock APP Shop.


Source: asrock.com/news


News from ASRock about their ASRock AMD X570 Chipset Motherboards and COMPUTEX 2019.

More ASRock News
 
I saw the video from Computex; my first reaction was: what is Intel going to do now? Same with Nvidia and the $1,100 video cards.
An AMD CPU that matches or beats the $1,200 Intel CPU at half the price? Will Intel lie down and play dead? This competition is a win for the consumer (finally).
PCIe 4.0 is the first step, and according to the internet PCIe 5.0 is not too far behind. It's a day to celebrate.
 
I saw the video from Computex; my first reaction was: what is Intel going to do now? Same with Nvidia and the $1,100 video cards.
How is AMD beating an $1,100 Nvidia GPU, presumably the RTX 2080 Ti? The RX 5700 is comparable to the RTX 2070. The only place the RX 5700 beat the RTX 2080 Ti was memory bandwidth, which doesn't necessarily translate to better performance (I'm getting this from the Gamers Nexus video, starting at 8:47).