News Puget says its Intel chip failures are lower than Ryzen failures — retailer releases failure rate data, cites conservative power settings

See? Now you're completely derailed from the Adobe benchmarks.

I'd suggest trying to stay focused on one question/issue without opening others (to the extent possible).
Why aren't you suggesting that to the guy saying that the 7950x at 65w beats the 13900k at 125w? The graph you yourself keep posting absolutely falsifies this. I'm not derailing, I'm responding to what's posted.

Did you even look at the link? Intel leads in Blender, AMD leads in x264, the rest were essentially a wash.
Ok man, pulling 65w it beats the 13900k at 125w, whatever. Facts ain't important, I get it.
 
Why aren't you suggesting that to the guy saying that the 7950x at 65w beats the 13900k at 125w?
From what I saw, it looked like you were introducing the efficiency tangent. If I'm wrong about that, sorry. Regardless, my advice is intended for all.

The graph you yourself keep posting absolutely falsifies this.
Yeah, I put more faith in the ComputerBase data, for a number of reasons. Still, it's a composite and I haven't checked to see if there are one or more cases where that 65W contention might hold, but I'd agree that it doesn't generally hold.

I'm not derailing, I'm responding to what's posted.
Okay, so without casting blame, let me just point out that sometimes there are options about how to answer a question that differ in their potential to derail the discussion. As I said, I think citing benchmarks of the i9-14900K at stock settings was one of those options.

Another observation I'd make is that not all questions are answerable by us, right now. Sometimes, we just have to accept that we don't know and hope that will change.
 
Why aren't you suggesting that to the guy saying that the 7950x at 65w beats the 13900k at 125w? The graph you yourself keep posting absolutely falsifies this. I'm not derailing, I'm responding to what's posted.


Ok man, pulling 65w it beats the 13900k at 125w, whatever. Facts ain't important, I get it.

I'm not the one defending a CPU that likely degrades over time while also being less efficient, I don't know what else to say.
 
Ah gosh. Does nobody ever actually read the review? The AMD part was pulling a lot more power than what is shown on the graph. Here is a quote from the reviewer himself, page 3:

"Starting with the peak power figures, it's worth noting that AMD's figures can be wide off the mark even when restricting the Package Power Tracking (PPT) in the firmware. For example, restricting the socket and 7950X to 125 W yielded a measured power consumption that was still a whopping 33% higher."
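
For concreteness, that overshoot is easy to quantify; here's a minimal Python sketch using only the two figures from the quote (the 125 W setting and the 33% overshoot), nothing else assumed:

```python
# Rough arithmetic for the quoted 33% overshoot at a 125 W PPT setting.
ppt_setting = 125                 # W, the firmware limit from the quote
measured = ppt_setting * 1.33     # "a whopping 33% higher"
print(f"~{measured:.0f} W measured at a {ppt_setting} W setting")  # ~166 W
```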
 
Ah gosh. Does nobody ever actually read the review? The AMD part was pulling a lot more power than what is shown on the graph. Here is a quote from the reviewer himself, page 3:

"Starting with the peak power figures, it's worth noting that AMD's figures can be wide off the mark even when restricting the Package Power Tracking (PPT) in the firmware. For example, restricting the socket and 7950X to 125 W yielded a measured power consumption that was still a whopping 33% higher."
Who didn't read the conclusion?

It's literally quoted on the conclusion page:

"The catch here, however, is that the AMD platform as a whole was far more lax in sticking to its programmed PPT values, as evidenced by yCruncher power consumption. Despite setting the 7950X to 65 W, we still measured 90.3 W under that workload. So on the assumption that translates to CineBench, the 7950X's power efficiency gains aren't as impressive; we're looking at 42% of the power consumption for 81.1% of the performance, or a power efficiency of 1.93x over stock.

Looking at the rest of the metrics at 65 W, we saw a peak load power of 90.3 W, and a peak core temperature of 52°C. "
Now, assuming the peak load power (which, being a peak, wasn't sustained throughout the test) is 91W for those 65W results shown on the page, it still beats the 13900K at 105W and wins/ties against the 125W ones, assuming no overshoot. But on the same conclusion page, for the 13900k:

"Overall, at the 65 W mark we saw a peak load power of 71.4 W, and a peak core temperature of 39°C. "

So while AMD is more lax, it's not a case of massive peak overshoot versus none, so why AMD wouldn't be more efficient is a mystery to me.
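
For what it's worth, the review's efficiency math checks out; here's a minimal sketch using only the two ratios given in the quoted conclusion, with no absolute stock figures assumed:

```python
# Sanity check of the quoted 1.93x efficiency claim for the 7950X at "65 W".
power_ratio = 0.42   # ~42% of stock power consumption (per the quote)
perf_ratio = 0.811   # ~81.1% of stock performance (per the quote)

efficiency_vs_stock = perf_ratio / power_ratio  # perf-per-watt vs. stock
print(f"Efficiency vs. stock: {efficiency_vs_stock:.2f}x")  # ~1.93x
```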
 
I'm not the one defending a CPU that likely degrades over time while also being less efficient, I don't know what else to say.
IMO, it'd be good if we could leave efficiency out of this. It's a well-trodden subject, as I mentioned before.

Besides, there will be new opportunities to revisit efficiency when the new microcode drops, as well as during the rollout of Ryzen 9000. IMO, that's going to be a much more interesting discussion. No reason to re-fight the same old (soon-to-be-irrelevant) battles, yet again.
 
You do always conveniently forget to quote the following sentence: "By comparison, the 13900K exceeded its set limits by around 14% under full load."
Your own freaking graph shows that the 7950x at 65w doesn't beat a 13900k at 125w, so why does it matter what I did or didn't include? Is your graph wrong? Does the 7950x at 65w beat the 13900k at 125w?
 
Who didn't read the conclusion?

It's literally quoted on the conclusion page:

"The catch here, however, is that the AMD platform as a whole was far more lax in sticking to its programmed PPT values, as evidenced by yCruncher power consumption. Despite setting the 7950X to 65 W, we still measured 90.3 W under that workload. So on the assumption that translates to CineBench, the 7950X's power efficiency gains aren't as impressive; we're looking at 42% of the power consumption for 81.1% of the performance, or a power efficiency of 1.93x over stock.

Looking at the rest of the metrics at 65 W, we saw a peak load power of 90.3 W, and a peak core temperature of 52°C. "
Now, assuming the peak load power (which, being a peak, wasn't sustained throughout the test) is 91W for those 65W results shown on the page, it still beats the 13900K at 105W and wins/ties against the 125W ones, assuming no overshoot. But on the same conclusion page, for the 13900k:

"Overall, at the 65 W mark we saw a peak load power of 71.4 W, and a peak core temperature of 39°C. "

So while AMD is more lax, it's not a case of massive peak overshoot versus none, so why AMD wouldn't be more efficient is a mystery to me.
If it needs 90w to tie the 13900k at 125w, then obviously at 65w it doesn't.
 
Your own freaking graph shows that the 7950x at 65w doesn't beat a 13900k at 125w, so why does it matter what I forgot or not forgot to include? Is your graph wrong? Does the 7950x at 65w beat the 13900k at 125w?
That graph shows the peak power at each power-limit setting, not performance, chief.

If it needs 90w to tie the 13900k at 125w, then obviously at 65w it doesn't.
Peak power means it overshoots for a fraction of a second; it's not sustained power. At the 125W setting, the 13900k also overshoots, with peak power at 143W, and at the 105W setting the 13900k also has peak power at 118.6W.

So using all these peak power figures, and giving you the benefit of the doubt, the 7950X @ 90W beats the 13900k @ 118.6W and ties/beats the 13900k @ 143W. It is still more efficient unless 90W is more than 118/143W, so where does the lie about efficiency come from?

Hell, I don't even own a freaking Zen CPU. I own a 14900k, which from day one I knew would not be as power efficient as the same-gen Ryzen, but I don't get the blind defensive stance.
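
To spell out that argument: if the 7950X at a measured ~90W really does tie or beat the 13900k at its measured peaks (the premise being debated here), perf-per-watt favors the 7950X regardless. A minimal sketch; the score of 100 is a hypothetical placeholder for a tied result, and only the wattages come from the review:

```python
# Hypothetical tied score -- only the measured peak wattages are from the review.
score = 100.0
measured_peak_w = {
    "7950X @ 65 W setting": 90.3,
    "13900K @ 105 W setting": 118.6,
    "13900K @ 125 W setting": 143.0,
}

for chip, watts in measured_peak_w.items():
    print(f"{chip}: {score / watts:.2f} points/W")  # higher is better
```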
 
That graph shows the peak power at each power-limit setting, not performance, chief.


Peak power means it overshoots for a fraction of a second; it's not sustained power. At the 125W setting, the 13900k also overshoots, with peak power at 143W, and at the 105W setting the 13900k also has peak power at 118.6W.

So using all these peak power figures, and giving you the benefit of the doubt, the 7950X @ 90W beats the 13900k @ 118.6W and ties/beats the 13900k @ 143W. It is still more efficient unless 90W is more than 118/143W, so where does the lie about efficiency come from?

Hell, I don't even own a freaking Zen CPU. I own a 14900k, which from day one I knew would not be as power efficient as the same-gen Ryzen, but I don't get the blind defensive stance.
Who said that the 7950x isn't more efficient? I said that at 65w it doesn't beat the 13900k at 125w.
 
Who said that the 7950x isn't more efficient? I said that at 65w it doesn't beat the 13900k at 125w.

FYI
Ok, I'm giving you the benefit of the doubt for the 2nd time. I checked this:

https://www.techpowerup.com/review/...ke-tested-at-power-limits-down-to-35-w/2.html


The 14900k wins or is on par with the 7950x while limited to 125w across 25 workloads. That's excluding the Adobe applications you mentioned.

I think you are lying to me, chief...
 
Did I ever say the 7950x isn't more efficient? I have no clue what you are saying, man, honestly.
Do you understand (both of you) that that's exactly what I said? That the 7950x is 10% more efficient but the 14900k is actually faster? What the actual heck, lads...

And no, read my post again. I said at ISO power Intel chips ARE more efficient. This is not ISO power. Think about it the other way. Trying to boost the 7950x to reach the gaming performance of the 14900k will make it consume a lot more than 106w. Got it?

We don't even need to guess. Configured at a 95w PL, the 14900k draws 80w while still being faster than the 7950x, lol.
That was from another thread, said by you, and it was comparing gaming load. Sure, it's the 14900k, but that was just a renamed 13900ks and more efficient than the 13900k.

But yeah, in this thread you haven't said that the 13900k is more efficient as a conclusion.

So we can agree now that the 7950X is more efficient than the 13900k, while not having the microcode bugs and extreme VID profiles that cook the chip, right?
 
That was from another thread, said by you, and it was comparing gaming load. Sure, it's the 14900k, but that was just a renamed 13900ks and more efficient than the 13900k.

But yeah, in this thread you haven't said that the 13900k is more efficient as a conclusion.

So we can agree now that the 7950X is more efficient than the 13900k, while not having the microcode bugs and extreme VID profiles that cook the chip, right?
Never in my life have I suggested that the 13900k is more efficient than the 7950x in MT workloads at iso power. The post you quoted is talking about gaming, in which case yes, at iso power the 13900k is more efficient.
 
Never in my life have I suggested that the 13900k is more efficient than the 7950x in MT workloads at iso power. The post you quoted is talking about gaming, in which case yes, at iso power the 13900k is more efficient.
It doesn't matter now. So:

In MT workloads, the 7950X is more efficient at reduced power.

In games, RPL is faster than the 7950X but slower than the X3D chips.

RPL has a ton more crashes in the latest and greatest UE5 engines nowadays, due to degradation over the years from bad microcode and elevated VIDs.

Puget shows a lower overall failure percentage so far for RPL, but the per-month failure rate isn't good compared to the 11th Gen peak and is rising, especially for 14th Gen, even at power settings much lower than Intel's Extreme profile and even lower than Intel's Performance profile (a quick sketch of that cumulative-vs-monthly distinction follows below).

I really don't see how the higher-end RPL parts can look remotely good in comparison to the competition, on any point.
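
Here's that sketch. All figures below are hypothetical, purely to illustrate how a generation with a lower total failure percentage can still have the worse per-month rate:

```python
# All figures hypothetical -- NOT Puget's data. Shows how a lower cumulative
# failure percentage can hide a higher per-month rate when a generation has
# simply been in the field for less time.
months_in_field = {"13th Gen": 20, "14th Gen": 9}            # hypothetical
cumulative_failure_pct = {"13th Gen": 2.0, "14th Gen": 1.5}  # hypothetical

for gen, pct in cumulative_failure_pct.items():
    rate = pct / months_in_field[gen]
    print(f"{gen}: {pct:.1f}% total -> {rate:.2f}% per month")
# 14th Gen ends up with the higher monthly rate despite the lower total.
```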
 
I agree about the fear-mongering. In the same release, Puget also goes on to say:
"The most concerning part of all of this to us here at Puget Systems is the rise in the number of failures in the field, which we haven't seen this high since 11th Gen. We're seeing ALL of these failures happen after 6 months, which means we do expect elevated failure rates to continue for the foreseeable future and possibly even after Intel issues the microcode patch.

Based on this information, we are definitely experiencing CPU failures higher than our historical average, especially with 14th Gen."

And it is not just a niche thing. The oxidation issue is all-encompassing. Even Intel admitted this. And yes, I know so far it seems to affect DIY consumers more than the general public running OEM systems, but that is because DIY consumers are more apt to push the CPU a bit harder.
BUT, and this is the main issue, these CPUs are degrading faster than normal. It will take longer for the general public to really start seeing problems, because they just plug and play. But don't get it wrong: those CPUs are also at high risk of oxidation and of degrading faster than normal.

I don't like people who are spreading misinformation on the issue. There are people who outright deny the facts, lie to others, and mislead people who really wouldn't know better. This "mountain" isn't a molehill. And Intel themselves have acknowledged this, which is why they have extended their warranty period.

I also agree with you that panic is not in order, and that some calm, level communication and fact sharing is the best course of action. I also think Intel should make all this information public in mainstream media, but just the facts and what's being done to fix the issues. Not spreading misinformation, and not fanboys from each side trying to make the issue so much bigger or smaller than it is.

I think if Intel went mainstream with the oxidation and degradation issues, and were able at the same time to give information about possible fixes and warranty information for RMAs, it wouldn't cause a huge backlash. Conversely, I see it helping to show the public that they admit an issue and are giving multiple ways to an easy resolution and fix. It would back up Intel's statement that they are continuing to support their customers and will work with them to make them feel like they can still trust and appreciate their Intel-based systems.

Fyi.... I take no sides in this issue. I don't geek out over AMD or Intel. I want them both to be as successful as possible. I geek out over tech in general.

Also, I know I'm not as smart as Intel's marketing and public relations teams, so I'm not going to pretend my ideas for public resolution are what they should do. It's just a quick idea I had.

Exactly. And not to mention, certain people who are commonly on Intel failure threads use Puget Systems' results as if they apply to the rest of the world, when in fact Puget covers a very small portion of the world's 13th/14th Gen users. Also, Puget released a good article describing how their Intel failure rates are actually on the rise and that they're prepared to thoroughly test the microcode before releasing it to customers. AND they are also quoted as saying they actually DO anticipate and expect the failures to keep rising quickly (they also said the rise has already begun).

But of course the Intel fanboys conveniently leave that stuff out when saying Intel's degradation and failures aren't all that much of an issue.
No, it's simply that this topic is about Puget's reporting and its interpretation. If another topic arises reporting further statistics, whatever the outcome, then we should interpret and discuss that too.
 
No, it's simply that this topic is about Puget's reporting and its interpretation. If another topic arises reporting further statistics, whatever the outcome, then we should interpret and discuss that too.
Ok, and Puget Systems has also released additional information about these results, and that information states that they are preparing for a much higher failure rate. That absolutely does affect what this article is about. That's all I was saying.
 
Link? Or please at least tell us where you saw that.
Here ya go... It's pretty far down the page.

https://www.pugetsystems.com/blog/2...-perspective-on-intel-cpu-instability-issues/

Here's a copy/paste of the relevant part I was quoting from the release:

"The most concerning part of all of this to us here at Puget Systems is the rise in the number of failures in the field, which we haven’t seen this high since 11th Gen. We’re seeing ALL of these failures happen after 6 months, which means we do expect elevated failure rates to continue for the foreseeable future and possibly even after Intel issues the microcode patch.

Based on this information, we are definitely experiencing CPU failures higher than our historical average, especially with 14th Gen. We have enough data to know that we don’t have an acute problem on the horizon with 13th Gen — it is more of a slow burn. We do expect an elevated failure rate on 14th Gen while Intel finishes finding a root cause and issuing a microcode update."
 
Here ya go... It's pretty far down the page.

https://www.pugetsystems.com/blog/2...-perspective-on-intel-cpu-instability-issues/

Here's a copy/paste of the relevant part I was quoting from the release:

"The most concerning part of all of this to us here at Puget Systems is the rise in the number of failures in the field, which we haven’t seen this high since 11th Gen. We’re seeing ALL of these failures happen after 6 months, which means we do expect elevated failure rates to continue for the foreseeable future and possibly even after Intel issues the microcode patch.

Based on this information, we are definitely experiencing CPU failures higher than our historical average, especially with 14th Gen. We have enough data to know that we don’t have an acute problem on the horizon with 13th Gen — it is more of a slow burn. We do expect an elevated failure rate on 14th Gen while Intel finishes finding a root cause and issuing a microcode update."
Pretty sure the point is that this isn't additional information and was always in the blog post.
 