Intel Core i9-13900K and Core i5-13600K Review: Raptor Lake Beats Ryzen 7000

The Xilinx AI block for mobile chips was interesting. But I'm a bit confused why it was implemented for mobile. AI workloads are specialized and heavy duty. They require a workstation-class card that can handle the heat/power. Mobile is power limited. So why?
AI in mobile is important for designers, photographers/videographers and so on; it allows them to add effects or sort through tons of pics very fast with very little power draw. Intel incorporated AI into their iGPUs/laptops years ago for this reason.
However, AMD's all-performance-core design, compared to Intel's hybrid approach of P- and E-cores, isn't an apples-to-apples comparison. If Intel went 100% P-cores and tried to match AMD's core count, it would be a heat and power disaster.
Based on reviews that use "out-of-the-box" as an excuse to use overclocked settings on the CPUs they review...
Look at the results for 125 W locked. It's tough to figure out how much the E-cores contribute, but you can take that result (of the P-cores only), double it up so it's now at 2×125 = 250 W, and you'd get the same power draw you get now, but with whatever 16 P-cores could deliver instead of 8+16.
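For what it's worth, the doubling argument above can be sketched numerically. This is purely a back-of-envelope illustration: the score value is a made-up normalized number, not a real benchmark result, and linear perf/power scaling with core count is an optimistic assumption that ignores uncore power and clock/voltage effects.

```python
# Sketch of the poster's argument: if 8 P-cores alone draw ~125 W at some
# multithreaded score S, a hypothetical 16 P-core part, assuming roughly
# linear scaling, would land near 2 * 125 = 250 W, i.e. the same power
# budget the stock 8P+16E chip already uses.
P_CORE_COUNT = 8
POWER_LIMIT_W = 125.0        # the 125 W locked result from the review
SCORE_AT_125W = 100.0        # hypothetical normalized score, for illustration only

hypothetical_cores = 16
scale = hypothetical_cores / P_CORE_COUNT
est_power_w = POWER_LIMIT_W * scale   # 250 W, matching the chip's stock draw
est_score = SCORE_AT_125W * scale     # what "16 P-cores" might deliver

print(f"Estimated power: {est_power_w:.0f} W, estimated score: {est_score:.0f}")
# prints: Estimated power: 250 W, estimated score: 200
```

The whole comparison stands or falls on the linear-scaling assumption; in practice, doubling cores at fixed per-core clocks rarely doubles power or performance exactly.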
 

bit_user

Titan
Ambassador
The Xilinx AI block for mobile chips was interesting. But I'm a bit confused why it was implemented for mobile. AI workloads are specialized and heavy duty. They require a workstation-class card that can handle the heat/power. Mobile is power limited. So why?
Precisely because FPGAs can offer tremendous efficiency benefits for algorithms that fit well. In mobile, there are applications like speech recognition, which you potentially want to run continuously, and that means efficiency is at an absolute premium. They probably should've done a better job explaining this.

Intel is addressing these use cases through a special hardware block called the GNA (Gaussian & Neural Accelerator).
 
AI in mobile is important for designers, photographers/videographers and so on; it allows them to add effects or sort through tons of pics very fast with very little power draw. Intel incorporated AI into their iGPUs/laptops years ago for this reason.

Based on reviews that use "out-of-the-box" as an excuse to use overclocked settings on the CPUs they review...
Look at the results for 125 W locked. It's tough to figure out how much the E-cores contribute, but you can take that result (of the P-cores only), double it up so it's now at 2×125 = 250 W, and you'd get the same power draw you get now, but with whatever 16 P-cores could deliver instead of 8+16.

I'm honestly shocked laptops would be the platform of choice for content creators. Limited storage and processing are bottlenecks to productivity.

But I'm old school. I asked for a desktop machine when my builds were taking 30+ minutes. Great excuse to get coffee. Bad excuse when your project isn't done.
 

bit_user

Titan
Ambassador
I'm honestly shocked laptops would be the platform of choice for content creators. Limited storage and processing are bottlenecks to productivity.
Depends on which types of creatives we're talking about. I think many like to take their laptops to coffee shops, and freelancers need a machine they can take on customer visits.

I'm not convinced by Terry's explanation, however. I know AMD is big on having Xilinx support for ROCm, but I rather doubt the AI blocks in these laptop CPUs are big enough to pack any real horsepower comparable to a dGPU. I think it's really just there for speech recognition, ambient noise removal, video conferencing filters, etc. As I mentioned before, Intel's GNA is focused on power-efficient acceleration of these use cases and pretty worthless for anything much heavier.

But I'm old school. I asked for a desktop machine when my builds were taking 30+ minutes. Great excuse to get coffee. Bad excuse when your project isn't done.
Anyone doing heavy-duty rendering will need a desktop/workstation. No question about that.
 
AMD got razzed pretty hard in the free chat about the vapor-chamber cooler issue and the high cost of motherboards. That was spot on. They also got razzed about the number of execs they put on stage. A big snore fest.

The message was "we aren't interested in consumers any more." Funny because it's CES. CONSUMER electronics show.

The Xilinx AI block for mobile chips was interesting. But I'm a bit confused why it was implemented for mobile. AI workloads are specialized and heavy duty. They require a workstation-class card that can handle the heat/power. Mobile is power limited. So why?

The debate over which processor is better (AMD vs. Intel) is a bit bogus. I'm going to be honest: Intel gives you better bang for your buck once a similar-dollar platform is put together.

However, AMD's all-performance-core design, compared to Intel's hybrid approach of P- and E-cores, isn't an apples-to-apples comparison. If Intel went 100% P-cores and tried to match AMD's core count, it would be a heat and power disaster.

But not every process requires P-cores. There are a few, like video transcodes, AI workloads, some DB ops, and compilers, that will use all P-cores to the max. But that is a very small percentage of us. For the average user, Intel is the much better buy this round (even with the extra heat).
Actually, a lot of high-level executives at Dell, HP, Lenovo, etc. are now worried about Intel's future in the mobile CPU space after AMD's mobile presentation. One executive even said he wouldn't be surprised if Intel was license-manufacturing AMD CPUs in 5 years.
 
Gelsinger has publicly stated that he welcomes AMD as a customer of IFS. But that's entirely separate from the issue of whether Intel can be competitive in that space.
I'm not talking about IFS; the executive said Intel will be licensing AMD architectures to produce under the Intel name, similar to how AMD got its start in the 1970s–80s. Big difference.
 

bit_user

Titan
Ambassador
I'm not talking about IFS; the executive said Intel will be licensing AMD architectures to produce under the Intel name, similar to how AMD got its start in the 1970s–80s. Big difference.
Sorry for the confusion; I understood your point.

I just wanted to mention a similar scenario that Intel is already welcoming. But, that's really about drumming up business for IFS (essentially, Intel's manufacturing division), which I think they are preparing to spin off.
 
I will say I got some information from an inside source who uses Intel. "They are refocusing on mainstream consumer and doing less industrial in terms of mfg capacity. Getting hold of supply for certain "active" industrial product lines is difficult." I found this most curious, to be honest.

It's like trying to find a supply of Gemini Lake replacements.

Ironic AMD seems to be doing the opposite.
 
Actually, a lot of high-level executives at Dell, HP, Lenovo, etc. are now worried about Intel's future in the mobile CPU space after AMD's mobile presentation. One executive even said he wouldn't be surprised if Intel was license-manufacturing AMD CPUs in 5 years.
AMD's designs use off-the-shelf modules; there is nothing there that Intel couldn't produce on their own without even touching any of AMD's IP.
They could put the same number and kind of units into the same number of cores and clock them exactly the same.
You could argue that TSMC is making better hardware and that Intel is going to buy its chips from them in the future, as others do; that at least has the facade of making sense.
I will say I got some information from an inside source who uses Intel. "They are refocusing on mainstream consumer and doing less industrial in terms of mfg capacity. Getting hold of supply for certain "active" industrial product lines is difficult." I found this most curious, to be honest.

It's like trying to find a supply of Gemini Lake replacements.

Ironic AMD seems to be doing the opposite.
Look at how much money Intel (or AMD, for that matter) made in recent years from the client group compared to the datacenter group, and it makes sense that they will focus more on client; that's where all the money is.
AMD is still forced to sell to the datacenter to make enough money to get by; Intel doesn't need to, and has the ability to cut back.
 

bit_user

Titan
Ambassador
AMD's designs use off-the-shelf modules; there is nothing there that Intel couldn't produce on their own without even touching any of AMD's IP.
They could put the same number and kind of units into the same number of cores and clock them exactly the same.
LOL, wut? Are you maybe thinking of their ASIC cell libraries? Just using the same cell libraries doesn't say anything about the design or architecture of your cores or chip.
 
AMD's designs use off-the-shelf modules; there is nothing there that Intel couldn't produce on their own without even touching any of AMD's IP.
They could put the same number and kind of units into the same number of cores and clock them exactly the same.
You could argue that TSMC is making better hardware and that Intel is going to buy its chips from them in the future, as others do; that at least has the facade of making sense.

Look at how much money Intel (or AMD, for that matter) made in recent years from the client group compared to the datacenter group, and it makes sense that they will focus more on client; that's where all the money is.
AMD is still forced to sell to the datacenter to make enough money to get by; Intel doesn't need to, and has the ability to cut back.
There’s so much wrong in that first paragraph that you should probably just delete your post.
 