AMD's Future Chips & SoCs: News, Info & Rumours.

Page 109 - Tom's Hardware community forum.
If GlobalFoundries refuses to upgrade its node, it will become obsolete within a few years.

Nope. As others have alluded, CPUs are just a portion of the market. There are plenty of uses for low-power ICs: sensors, specialty chips, ASICs, low-cost ARM (routers), low-power Bluetooth, GPS processors (not transceivers), and more. 14 nm is still a very capable node. A 14 nm Raspberry Pi/Arduino? Yes please! I could run an Arduino in low-power mode for months at that node size on four AAs. They just need to deliver a cost that is competitive with rival foundries.
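For a sense of scale on the "months on four AAs" claim, here is a back-of-the-envelope sketch. The capacity and sleep-current figures are assumptions for illustration, not measured values for any particular board:

```python
# Back-of-the-envelope battery-life estimate for a microcontroller that spends
# almost all of its time in deep sleep. All numbers are illustrative assumptions.

AA_CAPACITY_MAH = 2000    # typical alkaline AA cell
SLEEP_CURRENT_MA = 0.03   # assumed average deep-sleep draw (30 µA)

# Four AAs in series raise the voltage, not the capacity,
# so a single cell's mAh rating applies.
hours = AA_CAPACITY_MAH / SLEEP_CURRENT_MA
months = hours / (24 * 30)
print(f"Estimated standby: ~{months:.0f} months")
```

With these assumed numbers the estimate lands well past a year of standby, which is why sleep current, not node size alone, dominates the result.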
 

jdwii

Splendid
I'm a bit annoyed at the boost-clock shenanigans from AMD... I know it's really hard to tell which cores will hit those speeds, but the disparity is annoyingly bad.

Oh welp, let's see how this improves.

Cheers!


You and me both. I sent several emails to AMD and ASRock with my results; ASRock told me the whole time it was AMD, while AMD stayed completely quiet until now. I personally talked to Asus and Gigabyte engineers, on top of ASRock engineers, and they all told me the boost issue was AMD's fault.


I'm happy AMD is fixing this issue. Both Gigabyte and Asus engineers told me AMD wasn't going to fix it, and I'm proud of the community for saying this isn't okay. If they print a frequency on the box, the chip needs to hit it, even on the provided stock heatsink; otherwise it's highly misleading.


I tried everything: turning off SMT, running it on an H150i, and so on.

So happy AMD is fixing this!
 

jdwii

Splendid
Nope. As others have alluded, CPUs are just a portion of the market. There are plenty of uses for low-power ICs: sensors, specialty chips, ASICs, low-cost ARM (routers), low-power Bluetooth, GPS processors (not transceivers), and more. 14 nm is still a very capable node. A 14 nm Raspberry Pi/Arduino? Yes please! I could run an Arduino in low-power mode for months at that node size on four AAs. They just need to deliver a cost that is competitive with rival foundries.


Either way, they will fade out of history. I will keep this message and remember it years from now.

GlobalFoundries basically ended its own career. If TSMC gave 5% of its manufacturing capacity to what GlobalFoundries was doing, it could take their business away, and do so with a promise for the future.
 
Probably won't happen any time soon; all the REALLY good chiplets are going to EPYC/TR3.

A few weeks ago, I came across a post from someone who claimed to have gone through a few 3900Xs core by core and found that one chiplet on every CPU he tried seemed to be 3600X-class, which implies that the 3900X = an okay-ish chiplet to deliver the core count, plus a better one to (hopefully) deliver the marketing clocks on fewer cores. If this is true, I smell another round of class actions in AMD's future for not clearly disclosing that the two CPU chiplets it is made of aren't equal.
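The core-by-core testing described there can be summarized in a few lines. The core-to-chiplet ordering and every reading below are invented for illustration, not actual 3900X measurements:

```python
# Hypothetical sketch: given per-core peak boost readings (MHz), average them
# per chiplet to spot a weaker CCD. All readings below are made up.

def chiplet_summary(core_mhz, cores_per_chiplet=6):
    """Average peak boost per chiplet, assuming cores are listed in chiplet order."""
    return [
        sum(core_mhz[i:i + cores_per_chiplet]) / cores_per_chiplet
        for i in range(0, len(core_mhz), cores_per_chiplet)
    ]

# Illustrative readings: the first chiplet clearly boosts higher.
readings = [4550, 4525, 4500, 4500, 4475, 4450,   # chiplet 0
            4300, 4300, 4275, 4250, 4250, 4225]   # chiplet 1
print(chiplet_summary(readings))  # [4500.0, 4266.666...]
```

A gap of a couple hundred MHz between the two averages is the kind of signature the posters above say they observed.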

That's a reasonable assumption and it is, most probably, the explanation for this round of annoyances. If they're just passing "mediocre" cores to the Ryzen line and leaving the "top crop" on servers, then the process is still immature (to call it something), so that makes me believe we'll get a revision B at some point... Anyone with good TSMC information about it? :D

You and me both. I sent several emails to AMD and ASRock with my results; ASRock told me the whole time it was AMD, while AMD stayed completely quiet until now. I personally talked to Asus and Gigabyte engineers, on top of ASRock engineers, and they all told me the boost issue was AMD's fault.


I'm happy AMD is fixing this issue. Both Gigabyte and Asus engineers told me AMD wasn't going to fix it, and I'm proud of the community for saying this isn't okay. If they print a frequency on the box, the chip needs to hit it, even on the provided stock heatsink; otherwise it's highly misleading.


I tried everything: turning off SMT, running it on an H150i, and so on.

So happy AMD is fixing this!
Well, that is true. They did come out and clear the air before the last straw broke the camel's back. It's still better than keeping quiet forever and saying "everything is fine".

Cheers!
 
A few weeks ago, I came across a post from someone who claimed to have gone through a few 3900Xs core by core and found that one chiplet on every CPU he tried seemed to be 3600X-class, which implies that the 3900X = an okay-ish chiplet to deliver the core count, plus a better one to (hopefully) deliver the marketing clocks on fewer cores. If this is true, I smell another round of class actions in AMD's future for not clearly disclosing that the two CPU chiplets it is made of aren't equal.
It's pretty crappy if that is indeed what AMD did, but is that enough legal ground for a class action? I'm just thinking that ARM has high-performance and low-performance cores in big.LITTLE configurations, and that might set a precedent for still being able to advertise 8 cores despite vastly different capabilities between performance cores and efficiency cores. Obviously, ARM processors versus AMD x86 is an apples-to-oranges comparison, but does it carry any legal weight against a class action?
 

InvalidError

Titan
Moderator
It's pretty crappy if that is indeed what AMD did, but is that enough legal ground for a class action? I'm just thinking that ARM has high-performance and low-performance cores in big.LITTLE configurations, and that might set a precedent for still being able to advertise 8 cores despite vastly different capabilities between performance cores and efficiency cores.
ARM SoC designers explicitly market their SoCs as big.LITTLE setups that use two completely different core designs between the two clusters. AMD, by contrast, made no mention that I am aware of that it pairs a higher-bin chiplet with a lower-bin one in the 3900X. That is why people have set out to determine whether this is the case by testing their chips core by core, and the few results I've seen so far seem to point that way.
 

InvalidError

Titan
Moderator
I don't see a lawsuit coming, as the rated boost is single-core, so it doesn't matter if some of the cores are lower quality.
At more than double the price for less than 50% more silicon under the IHS, I think it would be perfectly fair to expect two similarly high-quality dies. I imagine this is going to cause considerable friction when more OCers confirm that the reason they cannot achieve all-core overclocks much above 4.2 GHz is that one chiplet is clearly inferior to the other.
 

jdwii

Splendid
I don't see a lawsuit coming, as the rated boost is single-core, so it doesn't matter if some of the cores are lower quality.

Also, I haven't heard AMD promise high-binned chips.

Yeah, AMD will win this one as long as they actually patch their 3000 series so it boosts to the rated turbo.

On my chip, 3 of my 8 cores go to 4375 MHz (guessing 4.4 GHz once the patch comes). Core 0 is the worst core I have; it only reaches 4.3 GHz, but the rest reach 4325 MHz.

Not a big deal to me, as I already knew it wasn't going to reach 4.4 GHz under all-core loads, and for single-core loads Windows 10 1903 is supposed to use the fastest cores first anyway.

Legally, I think they will be fine.
 

DMAN999

Dignified
Ambassador
Yeah, AMD will win this one as long as they actually patch their 3000 series so it boosts to the rated turbo.

On my chip, 3 of my 8 cores go to 4375 MHz (guessing 4.4 GHz once the patch comes). Core 0 is the worst core I have; it only reaches 4.3 GHz, but the rest reach 4325 MHz.

Not a big deal to me, as I already knew it wasn't going to reach 4.4 GHz under all-core loads, and for single-core loads Windows 10 1903 is supposed to use the fastest cores first anyway.

Legally, I think they will be fine.
On my 3700X, I have seen 3 cores get to 4.375 GHz, 4 get to 4.350 GHz, and one get to 4.325 GHz.
I personally am fine with that.
 
4.375 GHz is basically what AMD rates that chip for.

The stock cooler may not allow these clocks as often as the Freezer 33 eSports does.

Still, the i9-9900 will never reach its rated boost speeds on the stock cooler, and nobody is talking about Intel not reaching boost speeds.
In all fairness to Intel there, that's why they don't ship them with cooling solutions anymore and just rate the TDP at base clocks.

As every single reviewer will use the best cooling they can get, they won't use cheap stuff (be it AIOs or 95 W-rated HSFs).

And they can hit the clocks; it's just a LOT of power they draw to do so. AMD just... doesn't hit them reliably under any circumstances, from what I read.

Cheers!
 

jdwii

Splendid
Just give it time. I was starting to worry when ASRock, Asus, and Gigabyte engineers told me and others that AMD wasn't going to fix the turbo issue; that's when I started to get angry, as some owners aren't even seeing 4.4 GHz on their 3900X!
 

rigg42

Respectable
At more than double the price for less than 50% more silicon under the IHS, I think it would be perfectly fair to expect two similarly high-quality dies. I imagine this is going to cause considerable friction when more OCers confirm that the reason they cannot achieve all-core overclocks much above 4.2 GHz is that one chiplet is clearly inferior to the other.
I'm not sure how much can be drawn from my small sample size, but here goes... I currently own three 3600s and a 3900X. I've also had a 3700X and a 3600X, which I returned to Micro Center. I overclocked and stability-tested all of the CPUs. There is truth to the one-good-chiplet, one-meh-chiplet story on my 3900X. However, among my samples, the "meh" 3900X chiplet is a better overclocker than every other 6-core chiplet I've had. All of my 6-core chiplets are also better (in terms of OC potential) than the 3700X chiplet I had. The good 3900X chiplet is stable at 4.5 GHz (CCX0) / 4.4 GHz (CCX1) at 1.325 V (SVI2 full load). The meh chiplet has both CCXs stable at 4.3 GHz at 1.325 V (SVI2 full load). The 3600X hit 4.3 GHz (CCX0) / 4.2 GHz (CCX1) stable at 1.325 V (SVI2 full load). All of the 3600s' CCXs are in the 4.1-4.2 GHz range at this voltage. So, at least with my small sample size of one 3900X and one 3600X, both 3900X chiplets are superior.

Anyone doing an all-core overclock on a 3900X doesn't know the best way to overclock one. You have four CCXs you can overclock individually, so why would you want to be held down by your worst CCX(s)? Sure, Ryzen Master isn't an ideal way to overclock, but it's currently the best way to maximize the performance of a 3900X, IMO. I'm praying they add this feature at the BIOS level sooner rather than later.
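The per-CCX approach can be sketched as a simple search loop. Here `is_stable` is a stub standing in for a real stress test, and every limit is invented; actual tuning happens in Ryzen Master or the BIOS, not in Python:

```python
# Hypothetical per-CCX overclock search: step each CCX down from an optimistic
# starting clock until a (stubbed) stability test passes. All numbers invented.

def is_stable(ccx_id, mhz):
    # Stub standing in for a real stress test; pretends CCX 0 tops out at
    # 4400 MHz and CCX 1 at 4300 MHz.
    limits = {0: 4400, 1: 4300}
    return mhz <= limits.get(ccx_id, 4200)

def find_stable_mhz(ccx_id, start_mhz=4500, step=25, floor=4000):
    """Walk the clock down in `step` MHz increments until stable (or at floor)."""
    mhz = start_mhz
    while mhz > floor and not is_stable(ccx_id, mhz):
        mhz -= step
    return mhz

print([find_stable_mhz(ccx) for ccx in (0, 1)])  # with the stub: [4400, 4300]
```

The point of the sketch is the structure: each CCX gets its own search, so a weak CCX no longer caps the clocks of the strong ones.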
 

rigg42

Respectable
I just did some video encoding, and my 3700X was hitting 4.25 GHz to 4.275 GHz on all cores.
I'm pretty happy with it.
My encoding time was about half of what it was with my OC'd 2600.
4275 MHz on all cores is actually faster than I'd expect.
Yeah, that's damn good on auto. You must have a pretty good cooling setup. If you've got the patience, per-CCX OC in Ryzen Master is worth playing with. I think setting a fixed voltage and adding some LLC in the BIOS is the best way to go.

In all honesty, if your all-core on auto is that good, your single-core is probably pretty dang good too. You might not get the massive performance uptick I did from per-CCX OC; I managed to improve both over stock. The 3900X is a different beast with four CCXs to play with. I didn't get as much performance gain on my single-chiplet CPUs.
 
Intel says only 0.21% of laptop and 2-in-1 users use Maxon Cinema 4D software, so Cinebench is not representative of real-world performance when comparing Intel and AMD desktop processors.

Umm, what Intel is really saying is: we have no way to counter AMD right now, so we're going to release misleading data, unrepresentative of reality, to try to prop up Intel's reputation.

So who's the one not representing reality?

AMD, for using benchmarking software Intel claims nobody uses, or Intel, for basically lying about how many people actually use that software?

View: https://youtu.be/KibihZ1DkhI
 

jdwii

Splendid
Intel says only 0.21% of laptop and 2-in-1 users use Maxon Cinema 4D software, so Cinebench is not representative of real-world performance when comparing Intel and AMD desktop processors.

Umm, what Intel is really saying is: we have no way to counter AMD right now, so we're going to release misleading data, unrepresentative of reality, to try to prop up Intel's reputation.

So who's the one not representing reality?

AMD, for using benchmarking software Intel claims nobody uses, or Intel, for basically lying about how many people actually use that software?

View: https://youtu.be/KibihZ1DkhI

I'll take that over them paying off devs to make AMD look bad. Not to mention, what about every other piece of software that shows the 3900X beating a 9900K? Why is Intel suddenly talking so much about gaming performance?

Seems a bit odd to me.
 

DMAN999

Dignified
Ambassador
Yeah, that's damn good on auto. You must have a pretty good cooling setup. If you've got the patience, per-CCX OC in Ryzen Master is worth playing with. I think setting a fixed voltage and adding some LLC in the BIOS is the best way to go.

In all honesty, if your all-core on auto is that good, your single-core is probably pretty dang good too. You might not get the massive performance uptick I did from per-CCX OC; I managed to improve both over stock. The 3900X is a different beast with four CCXs to play with. I didn't get as much performance gain on my single-chiplet CPUs.
Yes, my PC is pretty well cooled; I have very good airflow, and I keep the room temperature at about 22-23°C.
I will definitely look into using Ryzen Master to do a per-CCX OC eventually, but I just got this chip and it is performing really well, so I am going to enjoy it set on Auto for now.
And, like you said, hopefully the option to OC each CCX individually will be added to my BIOS at some point.
 
I'll take that over them paying off devs to make AMD look bad.
Both Intel and AMD are no strangers to inflating and misrepresenting CPU performance.

Just think of the Intel Pentium 4 lawsuit, back when AMD had the better product: Intel paid companies to make the Pentium 4 look superior to AMD in software and to design benchmarks that hid Intel's design flaws.

Or the more recent AMD FX lawsuit over the number of cores the CPU has.

Now for some speculation.

I suspect some, though not all, of the Windows issues with Ryzen may stem from a similar scenario.

It sounds far-fetched, but Intel and Microsoft were once tightly bound. There are a lot of issues with AMD CPUs in Windows causing crippled performance, and Microsoft doesn't seem to fix them even though some of the solutions seem simple.