Intel Core i9-10980XE Review: Intel Loses its Grip on HEDT

Oct 27, 2019
Game benchmarks on a non-gaming CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.

And please stop using Windows to do that.
 

Pat Flynn

Distinguished
Aug 8, 2013
Game benchmarks on a non-gaming CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.

And please stop using Windows to do that.

While I agree that some Linux/Unix benchmarks should be present, the inclusion of gaming benchmarks helps not only prosumers, but game developers as well. It'll let them know how the CPU handles certain game engines, and whether or not they should waste tons of money on upgrading their dev teams' systems.
Re: I used to build systems for Bioware...
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
Game benchmarks on a non-gaming CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.

And please stop using Windows to do that.

Intel markets these chips at gamers, so we check the goods.

  • 9 game benchmarks
  • 28 workstation-focused benchmarks
  • 40 consumer-class application tests
  • boost testing
  • power/thermal testing, including efficiency metrics
  • overclocking testing/data
I'm happy with that mix.
 

Deleted member 2783327

Guest
Disclaimer: I badly want to dump Intel and go AMD. But are the conditions right?

The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.

Someone once told me "No one in the world needs more than 16 PCIe lanes; that's why mainstream CPUs have never gone over 16 lanes." If that were true, HEDT CPUs would not exist.

So we can say the 3950X destroys the Intel HEDT lineup, but only if you don't have anything other than ONE graphics card. As soon as you add other devices, you're blown.

The 3970X is $3199 where I am. That will drop by $100 by 2021.

The power consumption of 280 W will cost me an extra $217 per year per PC. There are 3 HEDT PCs, so an extra $651 per year.

AMD: 1 PC @ 280 W for 12 hours per day for 365 days at 43c per kilowatt hour = $527.35
Intel: 1 PC @ 165 W for 12 hours per day for 365 days at 43c per kilowatt hour = $310.76
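That arithmetic, as a minimal Python sketch (it assumes the chips actually draw their full TDP for all 12 hours, which is a worst case rather than a measurement):

    # Annual electricity cost: kW x hours/day x days x $/kWh
    def annual_cost(watts, hours_per_day=12, days=365, rate_per_kwh=0.43):
        kwh = watts / 1000 * hours_per_day * days
        return kwh * rate_per_kwh

    print(f"AMD   280 W: ${annual_cost(280):.2f}")  # $527.35
    print(f"Intel 165 W: ${annual_cost(165):.2f}")  # $310.76
    print(f"Extra: ${annual_cost(280) - annual_cost(165):.2f}")  # $216.59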

My 7900X is overclocked to 4.5 GHz on all cores. Can I do that with any AMD HEDT CPU?

In summer the ambient temperature here is 38-40 degrees Celsius. With a 280mm cooler and 11 case fans, my system runs 10 degrees over ambient at idle, so 50°C is not uncommon during the afternoons. Put the system under load and it easily sits at 80°C and is very loud.

With a 280 W CPU, how can I cool that? The article says that "Intel still can't deal with heat". Errr... isn't 280 W going to produce more heat than 165 W? And isn't 165 W much easier to cool? Am I missing something?

I'm going to have to replace the motherboard and RAM too. That's another $2,000-$3,000. With Intel my current memory will work, and a new motherboard will set me back $900.

Like I said, I really want to go AMD, but I think the heat, energy and changeover costs are going to be prohibitive. PCIe 4.0 is a big draw for AMD, as it means I won't have to replace everything again when Intel finally gets with the program, but I fear the other factors are just too overwhelming to make AMD viable at this stage.

Darn it, Intel is way cheaper when looked at from this perspective.
 
Game benchmarks on a non-gaming CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.

And please stop using Windows to do that.

It's over, pal... done. There isn't a single way to look at it on the bright side; the 3950X is making the whole Intel HEDT offering a joke.

I would have given this chip 2 stars, but we know Tom's and their double standards. The only time they can't do it is when the data is just plain impossible to contest... like AnandTech described, it is a bloodbath.

I don't believe Intel will come back from this anytime soon.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
<snip>

The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.

<snip>
The article says that "Intel still can't deal with heat".
<snip>

I agree with the first point here, which is why we point out that Intel has an advantage there for users that need the I/O.

On the second point, can you point me to where it says that in the article? I must've missed it. Taken in context, it says that Intel can't deal with the heat of adding more 14nm cores in the same physical package, which is accurate if it wants to maintain a decent clock rate.
 

ezst036

Honorable
Oct 5, 2018
I'm surprised nobody caught this from the second paragraph of the article.

Intel's price cuts come as a byproduct of AMD's third-gen Ryzen and Threadripper processors, with the former bringing HEDT-class levels of performance to mainstream 400- and 500-series motherboards, while the latter lineup is so powerful that Intel, for once, doesn't even have a response.

"For once"? This is at least the second time. It recalls the olden days of the first-gen Slot-A Athlon processors. Now, I'm not well-versed in Tom's Hardware articles circa 1999, but this was not hard to find at all:

Coppermine's architecture is still based on the architecture of Pentium Pro. This architecture won't be good enough to catch up with Athlon. It will be very hard for Intel to get Coppermine to clock frequencies of 700 and above and the P6-architecture may not benefit too much from even higher core clocks anymore. Athlon however is already faster than a Pentium III at the same clock speed, which will hardly change with Coppermine, and Athlon is designed to go way higher than 600 MHz. This design screams for higher clock speeds! AMD is probably for the first time in the very situation that Intel used to enjoy for such a long time. AMD might already be able to supply Athlons at even higher clock rates right now (650 MHz is currently the fastest Athlon), but there is no reason to do so.

https://www.tomshardware.com/reviews/athlon-processor,121-16.html

Intel didn't have a response back then either.
 
Disclaimer: I badly want to dump Intel and go AMD. But are the conditions right?

<snip>

The power consumption of 280 W will cost me an extra $217 per year per PC. There are 3 HEDT PCs, so an extra $651 per year.

AMD: 1 PC @ 280 W for 12 hours per day for 365 days at 43c per kilowatt hour = $527.35
Intel: 1 PC @ 165 W for 12 hours per day for 365 days at 43c per kilowatt hour = $310.76

My 7900X is overclocked to 4.5 GHz on all cores. Can I do that with any AMD HEDT CPU?

<snip>
TDP is the wrong way to directly compare an Intel CPU with an AMD CPU. The two vendors don't measure TDP the same way, so you should not compare the numbers directly. On the most recent platforms you get more work done per watt on the new AMD platform, and most users don't have their chips running at max power 24/7, so why would you calculate your power usage against TDP even if it were comparable across brands?

Also, your need to have all of your cores clocked to a particular, arbitrarily chosen speed is a less than ideal metric if speed is not directly correlated to completed work, which, after all, is essentially what we want from a CPU.

If you really need to get so much work done that your CPU runs at its highest power usage perpetually, the higher cost of the power consumption is hardly going to be your biggest concern.

How about idle and average power consumption, completed work per watt, or even overall completed work in a given time frame? Those make a better case about AMD's current level of competitiveness.
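To make that concrete, here is a minimal, hypothetical sketch of energy per job; the wattages and job times below are invented for illustration, not measurements of any real CPU:

    # A higher-power CPU that finishes a job sooner can still use
    # less energy per job than a lower-power one that takes longer.
    jobs = {
        "fast 280 W chip": {"watts": 280, "seconds_per_job": 60},   # assumed
        "slow 165 W chip": {"watts": 165, "seconds_per_job": 120},  # assumed
    }
    for name, j in jobs.items():
        wh = j["watts"] * j["seconds_per_job"] / 3600  # energy per job in Wh
        print(f"{name}: {wh:.1f} Wh per job")
    # fast 280 W chip: 4.7 Wh per job
    # slow 165 W chip: 5.5 Wh per job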
 

Crashman

Polypheme
Former Staff
I'm surprised nobody caught this from the second paragraph of the article.

<snip>

Intel didn't have a response back then either.
Fun times. The Tualatin was based on Coppermine and went to 1.4 GHz, outclassing Willamette at 1.8 GHz by a wide margin. Northwood came out and beat it, but at the same time Intel was developing the Pentium M based on... guess what? Tualatin.

And then Core came out of the Pentium M, etc., etc., and it wasn't long before AMD couldn't keep up.

Ten years we waited for AMD to settle the score, and now it's time to enjoy their day in the sun.
 

Deleted member 2783327

Guest
I agree with the first point here, which is why we point out that Intel has an advantage there for users that need the I/O.

On the second point, can you point me to where it says that in the article? I must've missed it. Taken in context, it says that Intel can't deal with the heat of adding more 14nm cores in the same physical package, which is accurate if it wants to maintain a decent clock rate.

Yes, sorry, my interpretation was not worded accurately.

Intel simply doesn't have room to add more cores, let alone deal with the increased heat, within the same package.

My point was that Intel is still going to be easier to cool producing only 165 W vs AMD's 280 W.

How do you calculate the watts, or heat, for an overclocked CPU? I'm assuming the Intel is still more overclockable than the AMD, so given the 10980XE's base clock of 3.0 GHz, I wonder if I could still overclock it over 4.0 GHz. How much heat would it produce then compared to the AMD?

Not that I can afford to spend $6000 to upgrade to the 3970X or $5000 to upgrade to the 3960X... And the 3950X is out because of PCIe lane limitations.

It looks like I'm stuck with Intel, unless I save my coins to go AMD. Makes me sick to the pit of my stomach :)
 

Deleted member 2783327

Guest
How about idle and average power consumption, completed work per watt, or even overall completed work in a given time frame? Those make a better case about AMD's current level of competitiveness.

All great points. I feel that it is going to come down to the tangibles for me: initial outlay in $ and $ per year to run them.

Anyway, I think I've probably taken the conversation away from the original topic, so I'll leave it there.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
I'm surprised nobody caught this from the second paragraph of the article.

<snip>

Intel didn't have a response back then either.

Good point, fixed.
 

InvalidError

Titan
Moderator
The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.
No, it has 24 PCIe lanes from the CPU (four going to the chipset and four more usually going to an M.2 NVMe slot), and depending on the motherboard you pick, you can have up to 20 more PCIe lanes from the X570 chipset, for a total of up to 40 usable PCIe 4.0 lanes. Of course, most boards will configure some of the HSIO lanes as extra SATA, USB3 or M.2 ports instead, and those won't be available for PCIe. Even if you are left with 10 spare PCIe lanes, that's enough for two PCIe 4.0 x4 slots and two 4.0 x1 slots. That may not sound like much, but keep in mind that 4.0 x1 is 16 Gbps, almost fast enough for dual 10G. Also, a growing number of higher-end boards have 10G on-board.
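As a rough tally of that lane budget (the port split below is an assumption for illustration; real boards divide their HSIO lanes differently):

    # Lane budget for a hypothetical Ryzen 3950X + X570 build,
    # following the counts described above.
    cpu_lanes = {"x16 GPU slot": 16, "M.2 NVMe": 4, "chipset link": 4}  # 24 total
    chipset_pcie_max = 20      # upper bound cited above; board-dependent
    used_for_sata_usb = 10     # assumed HSIO lanes spent on extra ports
    spare = chipset_pcie_max - used_for_sata_usb
    print(f"CPU lanes: {sum(cpu_lanes.values())}, spare chipset PCIe: {spare}")
    # 10 spare lanes -> e.g. two x4 slots plus two x1 slots,
    # and PCIe 4.0 x1 is ~16 Gbps, nearly enough for dual 10G.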

For USB3, just get a USB3 hub. USB3 is double-simplex, so copying files from port 0 to port 1 on the same hub is going to go at about the same speed as doing the same between different host ports directly on the PC. If you are concerned about losing even 1% performance, then use two hubs connected to two separate host ports and then copy from one hub to the other instead.

For most normal desktop users, X570 is overkill. If you need more than that, then desktop hardware clearly isn't for you.
 

TJ Hooker

Titan
Ambassador
All great points. I feel that it is going to come down to the tangibles for me: initial outlay in $ and $ per year to run them.

Anyway, I think I've probably taken the conversation away from the original topic, so I'll leave it there.
I don't really get your reasoning. You're comparing a 32-core CPU to an 18-core one. Of course it's going to cost more, and in return you get more performance. If you're actually running it full bore 12 hours a day, 365 days a year, as you did in your power cost calculation, I can only assume you'd be using it for work, in which case it'd likely end up paying for itself through increased productivity.

Regarding power consumption/heat dissipation, your current 7900X @ 4.5 GHz is probably already drawing 200+ W, or would if you were to run the same stress tests TH does to measure power consumption.
 

Xajel

Distinguished
Oct 22, 2006
Game benchmarks on a non-gaming CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.

And please stop using Windows to do that.

While I agree with the compiling and Linux benchmarks... I don't agree on the gaming side...

Game developers are interested, and even some people who actually use these for work most of the time "might" do some gaming sometimes...
 
I'm surprised nobody caught this from the second paragraph of the article.

<snip>

Intel didn't have a response back then either.
Intel consciously neglected traditional CPUs back then to try to make Itanium take off. Having even stronger x86 competition than was already present would have been an even worse idea.
Intel has likewise been consciously neglecting traditional cores in recent years to push out Optane RAM that it sells for ~$8,000 a pop right now, to put laptop CPUs on M.2 cards as co-processors, plus the new i/GPU and Nervana.

While the -10980XE is thoroughly unimpressive from a new technology standpoint,
Yes, if you don't look at the new technologies then you won't see them...
  • Intel® Deep Learning Boost (Intel® DL Boost): Yes
  • Intel® Optane™ Memory Supported ‡: Yes
  • # of AVX-512 FMA Units: 2
 

Xajel

Distinguished
Oct 22, 2006
Disclaimer: I badly want to dump Intel and go AMD. But are the conditions right?

The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.

<snip>

The power consumption of 280 W will cost me an extra $217 per year per PC. There are 3 HEDT PCs, so an extra $651 per year.

<snip>


Dude, chill out, you're mixing things up here...

The 3950X is a mainstream AM4 CPU, so yes, you have limited PCIe lanes; if you need more, you must go HEDT from either Intel or AMD.

Your power consumption is all mixed up: you mentioned the 3950X, the AM4 one, which has a 105 W TDP, and said it's 280 W. It's not; the 280 W TDP is for the HEDT 3960X/3970X Threadrippers, which do have many more PCIe lanes.

Second, check the actual power consumption again. The TDP definition differs between AMD and Intel: AMD will rarely exceed the official TDP even with PBO, while Intel exceeds it more often. Go to the second page and check the power yourself, especially when overclocked; an overclocked i9-9980XE will easily reach 320-340 W, but at stock clocks these chips stay within the TDP.

Also, power consumption will not sit at the TDP at all times, not on Intel, not on AMD; it all depends on the workload and what you're doing. So depending on your actual work you can see which CPU is more efficient, but you have to test them or look at an actual benchmark; you can't work it out from the TDP alone.
 

AlistairAB

Distinguished
May 21, 2014
Disclaimer: I badly want to dump Intel and go AMD. But are the conditions right?

<snip>

AMD: 1 PC @ 280 W for 12 hours per day for 365 days at 43c per kilowatt hour = $527.35
Intel: 1 PC @ 165 W for 12 hours per day for 365 days at 43c per kilowatt hour = $310.76

<snip>

Darn it, Intel is way cheaper when looked at from this perspective.

There is so much wrong in your post. Your power consumption calculations are completely pointless; you need the energy per job done (energy to render, for example). You don't sit there with your CPU maxed all day. And you're comparing 18 vs 24 cores... And don't use TDP for that; that is a mistake. You are completely ignoring performance per watt, idle power, etc.

You sound like an Intel apologist who is either confused or just trying to find arguments that don't hold up. The actual maximum extra wattage is about 50-60 watts. If you can't cool an additional 60 watts, you have no idea how to use or build a computer. My company just bought three Epyc servers that replaced dual 18-core Intel CPUs (400 W) with a single equivalent 32-core AMD CPU (250 W); guess which servers use a lot less power?

Who cares about the actual frequency the CPU runs at? More nonsense. Intel and AMD servers use the same RAM. Seriously, do you even use computers? Enough nonsense. I dealt with someone who was like, "I can just slot a new Intel CPU into my existing motherboard." I said: are you dumb? Just sell the Intel CPU and motherboard and replace them with AMD; you still come out ahead. You have to include the money you get for selling your old Intel system. You can't do the comparison the way you did it; that was laughable.
 

atomicWAR

Glorious
Ambassador
This launch is just embarrassing for Intel, and it is the tip of the iceberg as far as I can tell. We haven't seen this kind of AMD dominance since they launched their first x64 CPUs, leaving Intel's Pentium 4s as overpriced space heaters that doubled as slow computers. And again when AMD launched the first dual cores not much later, leaving Intel to figure out how to jam two of those hot P4 cores into a single package to compete. Oh, the Pentium D... that poor, poor CPU. But hey, at least they could do that much back then. As it stands now, Intel seems to be out of moves until 10nm drops, or they skip it altogether. Intel already had its Pentium D moment with the Xeon Platinum 9282, which fell very, very short, much like the Pentium D did. So where do they go from there? I mean, the rumored back port of Ice Lake to 14nm may help, but I worry that what started off as a stumble for Intel could end up as a full-fledged fall. It's probably unlikely this will slay the CPU giant, but that's the problem with the thing that kills you: you rarely see it coming.

In all likelihood Foveros 3D chip stacking and the other packaging/MCM/interconnect technologies toward which Intel is headed will end up saving their bacon. But we know AMD is heading in the same direction, and Intel can't wait too long to make its move or it risks losing even more mindshare than it already has. Between security SNAFUs, inferior core counts/IPC (not including gaming), and platform longevity and cost, Intel has steadily lost its luster as a "premium vendor". I am no fanboy, but Intel is leaving many folks with a bad taste in their mouths. I like cost efficiency and the best tech I can get my hands on; yes, I know how at odds those two needs are (lol).

Intel has not treated its customers well for some time, and we all know it; nor are many willing to forget it as long as performance is similar. At least not in the DIY space, and my guess is the corporate space won't be much kinder in the long run. I had a lot of AMD systems prior to the faildozer debacle, so Intel won me over with its performance and, at the time, fair prices for the day. I even added a Xeon 1680 v2 to my old X79 system to buy me time to see how this little CPU war plays out (after seeing how new games love 8c/16t CPUs, I am very happy I did that, FYI). Anyway, when PCIe 5.0/DDR5 drops in what I hope is 2021, if I have to choose between 5% or less IPC difference at similar core counts between Intel and AMD (assuming Intel catches up), I won't be buying Intel. Watching them jack up prices while not giving much improvement has been maddening.

Let's be honest: when Sandy Bridge launched, 6c/12t should have been the mainstream high-end CPU, and HEDT should have gone up to 10c/20t. Taking that thinking further, Haswell's mainstream CPUs would have been 8c/16t, Broadwell 10c/20t, Skylake 12c/24t, Coffee Lake 14c/28t, and then Comet Lake 16c/32t, with each new higher core count costing the same as the lesser-core-count flagship it replaced. But Intel didn't see the value in treating its consumer base well, because it could get away with it while AMD had nothing to challenge it. Now Intel has made its bed, but I am guessing they aren't too keen to lie in it, even though they don't have a choice in the matter. I only hope AMD doesn't make the same mistakes now that they have climbed on top again; it's not a place they have been often, or for long. I am interested to see how this all plays out.
 
Yes, sorry, my interpretation was not worded accurately.

<snip>

My point was that Intel is still going to be easier to cool producing only 165 W vs AMD's 280 W.

<snip>
There is more to the power usage than just the CPU. What you really need to know is total system power draw, and guess what: the AMD is better than Intel at that, even with 14 more cores. https://www.servethehome.com/amd-ryzen-threadripper-3970x-review-32-cores-of-madness/6/ The reason is that with more cores, each core doesn't have to run as hard to get more work done on parallel tasks.
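A minimal sketch of that effect, using the rough dynamic-power rule P ~ cores x frequency x voltage^2; the clocks and voltages below are assumptions for illustration, not measured values:

    # Lower clocks permit lower voltage, so many slower cores can match
    # the power of fewer fast ones while finishing more parallel work.
    def rel_power(cores, ghz, volts):
        return cores * ghz * volts ** 2       # arbitrary units

    few_fast = rel_power(18, 4.5, 1.20)       # assumed 18 cores @ 4.5 GHz
    many_slow = rel_power(32, 3.7, 1.00)      # assumed 32 cores @ 3.7 GHz
    print(f"18 fast cores: {few_fast:.0f}, 32 slow cores: {many_slow:.0f}")
    # ~117 vs ~118: similar power, but ~46% more core-GHz of throughput.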
 
Let's be honest: when Sandy Bridge launched, 6c/12t should have been the mainstream high-end CPU, and HEDT should have gone up to 10c/20t. <snip> But Intel didn't see the value in treating its consumer base well, because it could get away with it while AMD had nothing to challenge it.
Yes, let's be honest here: we can all bash Intel all we want, but if Intel had done all that, there would be no AMD left today... a 6c/12t Sandy Bridge would have overshadowed anything AMD had by so much that nobody would have stayed with them.
The i9-10980XE, on the other hand, still has several wins over the almost-double-the-cores Threadrippers, even outside of gaming, even if you need to overclock the Intel part.

Intel loses on the very scalable stuff, but Intel also has Xeon Phi for that type of workload, and yes, those come as co-processors as well...
 

InvalidError

Titan
Moderator
Yes, let's be honest here: we can all bash Intel all we want, but if Intel had done all that, there would be no AMD left today... a 6c/12t Sandy Bridge would have overshadowed anything AMD had by so much that nobody would have stayed with them.
Sandy Bridge was already a 200+ mm² die, which is getting pretty big by Intel mainstream CPU die-size standards; Intel likes to keep mainstream CPUs closer to 150 mm². While Intel could have axed the IGP to make room, it still isn't giving up the IGP for Comet Lake, so it clearly looks like IGPs are here to stay in mainstream CPUs regardless of how much they may hurt Intel's yield per wafer. Can't really blame them when most PCs ship to offices and other locations that don't require much more than something to connect one or more monitors to.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
<snip>

Yes, if you don't look at the new technologies then you won't see them...
  • Intel® Deep Learning Boost (Intel® DL Boost): Yes
  • Intel® Optane™ Memory Supported ‡: Yes
  • # of AVX-512 FMA Units: 2

We did mention DL Boost in the article. Optane Memory support refers to the storage device, which is confusing due to the way Intel brands Optane; that's been there for ages. Finally, the dual FMA units have been there for three generations now.
 