News Intel's Core i3-12100 Demolishes The Ryzen 3 3300X In An Early Review

Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
It's been like this since launch, everywhere. Alder Lake is not selling great and that's a fact; numerous reports have said so already.
I'm just curious how long this will last... will it keep selling this badly all the way to Raptor Lake? Because that would be a big fail.

People who are not worried about HIGH, VERY HIGH and ABSURDLY HIGH power consumption are ignorant.

At the rate CPUs and GPUs are increasing power consumption each generation, we will not only need to change the PSU with every upgrade, but also the air conditioning in our rooms... not to mention that at some point electricity costs will double or triple (if that hasn't happened already).

Yes, IGNORANTS (on high horses, blinded by more money than sense).

A few things - Maybe Alder Lake isn't selling well, idk. It's mostly speculation and some anecdotes about local retailers. Whether or not Alder Lake is actually selling well will only be known at the next quarterly report.

Of course we should be concerned about "HIGH, VERY HIGH and ABSURDLY HIGH" power consumption, but I wouldn't consider a 12600K under maximum load pulling 10 more watts than a 5800X to be absurd at all. Anandtech observed the 5800X pulling 140W under full load vs the 150W draw of the 12600K. Nobody cares about that 10W spread.

The 12900K could have its power consumption cut in half and only lose 10% performance - it's obvious these chips are factory overclocked past the efficiency curve, and OEMs are free to set reasonable power limits. Besides, these are power draw measurements of a CPU at all-core, 100% utilization, and I can't remember the last time I've done that. It's like arguing MPG in a car when both are floored. In games, the 12600K delivers more frames per second per watt consumed than a 5900X.
 
Ignorant? It's called having an opinion. Just because someone doesn't agree with you doesn't make them ignorant. Maybe it's you who is ignorant?
Don't take it the wrong way, but advocating for using more power "just because it's faster" is never a good take. There are a lot of reasons why that's important to understand and accept. We all want faster things for our day-to-day use and needs, but we have to keep the side effects in check. I'm not going to go on a political or scientific tangent about climate change or power generation around the globe, but the "hurdurr moar power, no care consequences" mentality is neanderthalistic. Pressuring companies to make things fast should be as important as requiring them to be energy efficient.

We don't need to be "tree huggers" to realize this is not a good trend and we'd be coming full circle from the early Industrial Age if we move into that "get more stuff, consequences be damned" mentality.

So, even if you have reservations about the qualifier, or even dislike it, I don't think it's bad to call such people ignorant rather than just "bad people": I'd like to think ignorant people just require more information to understand things and change their opinion.

Regards.
 
I would say this is a "whet your appetite" loss-leader chip. What does this mean?

Have you ever seen Black Friday ads from, like, 10 years ago? They sold things at impossibly low prices and it made everyone drool. Yet when you got to the store, they were out of stock quickly (limited supply) or forced a warranty on you (expensive motherboard). But it worked. It got you in the door and looking at their products.

Clever marketing on Intel's part. But most of us won't see the benefits of it. That said, Renoir-X is a joke of an answer. Neither camp has the capacity to mass-sell low-margin chips when they're selling out all their high-margin chips.

Despite a speed boost, AMD's Renoir-X offering isn't even in the same ballpark. I would go so far as to say that even a low-end Zen 3 part with V-Cache wouldn't be able to beat Intel on all metrics (based on an estimated 20% boost, and comparing performance per cost plus Quick Sync for things like Premiere/Photoshop). That's just marketing 101.

Competition is a good thing.
 
Don't take it the wrong way, but advocating to use more power "just because it's faster" is never a good take. ...


Yes and no. Enthusiasts don't think about environmental impacts; total cost of ownership is a secondary metric. I mean, look at all the 3090s and 3080s being sold for ABSURD amounts of money. When you run a business, you only care about total cost of ownership, which indirectly ties into efficiency.

TCO = base cost (CPU + motherboard + memory + PSU + cooler) + energy cost (kWh consumed × cost per kWh) over the entire lifetime. (This doesn't factor in maintenance, which is expected to be minimal.)
Value per dollar = work done / TCO. (This is what businesses look at.)

If you look at the A100, its TCO is absurdly high, yet the amount of work it does in AI loads is astounding, raising its value per dollar above its closest competitors'. But if you don't use it for AI loads, you're wasting money. That's why cloud providers use varied architectures based on their strengths. For example, EPYC is winning not only because of its sheer core count, but because of its incredible PCIe lane count, which makes drive data access exceptional (once they get the kinks worked out... but it's getting better, and it's still better than Intel's best Xeon offerings). So for database lookups, EPYC is a beast.
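The TCO and value-per-dollar formulas above can be sketched in a few lines of Python. All the figures below (part cost, wattage, electricity rate, work units) are made-up placeholders, not real prices:

```python
def total_cost_of_ownership(base_cost, avg_watts, hours, cost_per_kwh):
    """Base hardware cost plus lifetime energy cost (maintenance ignored,
    as in the formula above)."""
    energy_kwh = avg_watts / 1000 * hours
    return base_cost + energy_kwh * cost_per_kwh

def value_per_dollar(work_done, tco):
    """Work units delivered per dollar of total ownership cost."""
    return work_done / tco

# Hypothetical example: a $1,200 build averaging 200 W, run 8 h/day for 3 years
hours = 8 * 365 * 3                                       # 8,760 hours
tco = total_cost_of_ownership(1200, 200, hours, 0.15)     # at $0.15/kWh
print(f"TCO: ${tco:.2f}")                                 # TCO: $1462.80
print(f"Value/$: {value_per_dollar(1_000_000, tco):.1f}")  # Value/$: 683.6
```

Run the same numbers for two competing parts and the one with the higher value per dollar wins, which is exactly the comparison a business makes.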
 
Ignorant? It's called having an opinion. Just because someone doesn't agree with you doesn't make them ignorant. Maybe it's you who is ignorant?
No, but thinking the cost of power is the same everywhere as it is in the US is ignorant.

Where I am, there is a two-tier system; once you hit that second tier, if I remember right, the price per kWh almost doubles. Because of that, most people here keep an eye on how much power they are using.
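A two-tier rate like that can be sketched as follows; the 500 kWh threshold and the $0.12/kWh base rate are made-up illustrative numbers, with the second tier priced at double the first as described:

```python
def tiered_bill(usage_kwh, threshold_kwh=500, rate=0.12):
    """Monthly bill under a two-tier rate: the base rate up to the
    threshold, double the rate for every kWh beyond it."""
    tier1 = min(usage_kwh, threshold_kwh) * rate
    tier2 = max(usage_kwh - threshold_kwh, 0) * rate * 2
    return tier1 + tier2

print(tiered_bill(400))  # 48.0  -- entirely within tier 1
print(tiered_bill(800))  # 132.0 -- 500 kWh at tier 1, 300 kWh at double rate
```

The jump is why people watch their usage: in the second example, the last 300 kWh ($72) cost more than the first 500 ($60) did.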
 
Yes and no. To enthusiast, they don't think about environmental impacts. Total cost of ownership is a secondary metric. ...
You're just justifying "more power is better" from an economic point of view, which doesn't take into account the long-term economic impacts of that power. To the TCO you can always add a cost based on the impact of using more power, the way car manufacturers deal with "regulatory credits" such as carbon bonds/credits. That would shift your equation toward more efficient designs every time, and companies wouldn't buy high-powered stuff without an economic offset for those credits. I hope this makes sense?

So, in short, you can't take the "enterprise" side of the justification, because enterprises will only care when there's an economic incentive to do so (capitalism!). One side of that incentive is client pressure (via purchases or requests), which includes us regular consumers; the other is government initiatives, which I won't delve into for obvious reasons, and because different regions deal with this differently.

All in all, just because it doesn't give you problems doesn't mean you should turn a blind eye to it. Or at least, that's my take. Every little bit helps at the end of the day.

Regards.
 
... Of course we should be concerned about "HIGH, VERY HIGH and ABSURDLY HIGH" power consumption, but I wouldn't consider a 12600K under maximum load pulling 10 more watts than a 5800X to be absurd at all. ...

I have a feeling that this part of your comment will just be ignored by most people here. People don't like myth-busting, especially if they believe the myth. :)
 
You're just justifying the "more power is better" from an economical point of view, which is not taking into account the long term economical impacts of said power. ...

TCO includes the energy consumed cost, which is in the equation I wrote.
 
TCO includes energy consumed cost which is in the equation I wrote.
Energy consumption cost is not the same as an additional cost for using a lot of power, though. EDIT: I don't know how to word it better, sorry. What the electrical company charges is not the same as an additional tax on top of it? Something along those lines.

As I said, it's similar to the car industry: the "MPG" metric is not the only thing driving electrification.

Regards.
 
Energy consumption is not the same as an additional cost for running using a lot of power though. ...

In the US, that would be demand charges or tariffs, depending on where you are; for commercial purposes they're weirdly known as "Rider" rates (they treat it as a noun). Most residential customers don't get access to scheduled metering, but in some regions it is common, with overnight hours being cheaper (thus encouraging solar panels and conservation during the day, and even storing grid power in a battery for use the next day).

Even buying electricity from charging stations is cheaper than gasoline on an MPGe basis in the US right now.
 
no, but thinking the cost of power is the same ever where as it is in the US, is ignorant. ...
What if I told you I wasn't from the US?