Review Intel Core Ultra 9 285K Review: Intel Throws a Lateral with Arrow Lake


irish_adam

Distinguished
Mar 30, 2010
236
64
18,760
Yes, the low power is very important, I've been making the same argument.

Saying I'm biased doesn't make you right. I've made three points about why the 285K is better than the 9950X. Are any of them wrong? Do reviews show otherwise?


See, this is exactly why you are biased, and it shows. The faster 8200 memory made no difference, yet you had to throw it in to give the impression that it is a more expensive platform when it isn't. That's the textbook definition of bias.



It's nothing to be fixed; that's how it is supposed to work. The AMD drivers literally turn off one CCD during games. That's how AMD fixed the scheduling issue. In the reviews where you see the 7950X3D being slower than the 7800X3D, it's because they didn't follow AMD's guidelines for the CCDs. The 7950X3D, when set up properly, is always faster than the 7800X3D in games.
So I don't see the issue you raised, then. Who cares if the CCD is auto-disabled if it gives better performance? Even if it loses to the 9800X3D, it will still trounce this CPU and the 14900K.

You are literally going against the entire industry in your view of this CPU. You have literally gone to extraordinary lengths to argue with everyone here to prove that this CPU is "the GOAT." You can accuse me of all the bias you like, but no one is buying what you are selling, so I guess knock yourself out, mate.
 
"The new chips come with 24 lanes of PCIe 5.0 support, with..."

... so, we'll all be surprised when Battlemage cards arrive with 16 lanes of PCIe 5.0? The gaming reviews will need to be rewritten.

The doubled data rate of PCIe 5.0 should benefit AI processing even more.
Sorry, but no. For gaming in general, PCIe bandwidth simply isn't a primary factor. The reason is simple: the GPU may have anywhere from ~250 to over 1,000 GB/s of internal VRAM bandwidth, which can be an order of magnitude or more faster than the PCIe interface bandwidth.

There are certain workloads where PCIe bandwidth matters, as you note with it potentially boosting AI performance. But even that doesn't often use PCIe that much, unless you're doing multi-GPU workloads. For a single GPU? Even PCIe 3.0 x16 will generally suffice (just not on Intel Arc, which has some problematic design constraints and drivers that want PCIe 4.0).

And of course this doesn't account for things like PCIe x8 and x4 interfaces instead of the full x16, which further restrict the bandwidth. PCIe 2.0 x16 would probably cause a modest (maybe 5~10 percent?) loss in gaming performance on a fast modern GPU. PCIe 3.0 x8 and PCIe 4.0 x4 have the same bandwidth (basically) as PCIe 2.0 x16 and thus are in the same boat. There's a good reason why it's only the budget GPUs that get saddled with an x8 link connection while maintaining the x16 physical connector.
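If anyone wants to sanity-check the numbers, here's a rough sketch in Python (the per-lane rates are the usual approximations, and the VRAM figures are ballpark examples, not measurements from any specific card):

Code:
# Back-of-the-envelope PCIe link bandwidth vs. GPU VRAM bandwidth.
# Approximate usable throughput per lane, one direction, in GB/s.
PCIE_PER_LANE = {2.0: 0.5, 3.0: 0.985, 4.0: 1.969, 5.0: 3.938}

def pcie_bandwidth(gen, lanes):
    """Approximate one-direction PCIe bandwidth in GB/s."""
    return PCIE_PER_LANE[gen] * lanes

# Illustrative VRAM bandwidth figures (GB/s), not tied to any particular GPU.
vram_examples = {"slower GPU": 250, "fast GPU": 1000}

for name, vram in vram_examples.items():
    for gen, lanes in [(2.0, 16), (3.0, 8), (4.0, 4), (5.0, 16)]:
        link = pcie_bandwidth(gen, lanes)
        print(f"{name}: PCIe {gen:.1f} x{lanes} = {link:.1f} GB/s link "
              f"vs {vram} GB/s VRAM ({vram / link:.0f}x)")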
 

YSCCC

Commendable
Dec 10, 2022
566
460
1,260
So I don't see the issue you raised, then. Who cares if the CCD is auto-disabled if it gives better performance? Even if it loses to the 9800X3D, it will still trounce this CPU and the 14900K.

You are literally going against the entire industry in your view of this CPU. You have literally gone to extraordinary lengths to argue with everyone here to prove that this CPU is "the GOAT." You can accuse me of all the bias you like, but no one is buying what you are selling, so I guess knock yourself out, mate.
And then upgrade to 9950X3D ;)
 

YSCCC

Commendable
Dec 10, 2022
566
460
1,260
So I don't see the issue you raised, then. Who cares if the CCD is auto-disabled if it gives better performance? Even if it loses to the 9800X3D, it will still trounce this CPU and the 14900K.

You are literally going against the entire industry in your view of this CPU. You have literally gone to extraordinary lengths to argue with everyone here to prove that this CPU is "the GOAT." You can accuse me of all the bias you like, but no one is buying what you are selling, so I guess knock yourself out, mate.
And I saw some discussion in a video review like this one; it seems like it is still very unstable and not ready for launch yet.
View: https://www.youtube.com/watch?v=6QoCCFXD0xc
 

Peksha

Prominent
Sep 2, 2023
44
31
560
This is an epic fail: the memory latency increased by 15-20 ns, even though the distance between Intel's tiles is orders of magnitude smaller than between AMD's chiplets over IF.
Even CUDIMM 10000+ won't fix this.
 
Those +20 watts come from the X670E chipset being power hungry.
If you go with B650, idle power is in line with Intel at around 50W.

For some reason this never gets mentioned in the reviews.
This is news to me. I've always seen and assumed it was the interposer/mesh sucking power, since most of those measurements were done at the CPU power lines.

Do you have a source for this, please?

Regards.
 

logainofhades

Titan
Moderator
And there are games like Spider-Man where the 285K dominates everything. Those are what we call outliers; they are meaningless for most people unless they specifically play that particular game 24/7. Most people, I'd assume, play a variety of games, and that's why they look at the averages. On average, the 7800X3D is what, 10-15% faster than the normal 7950X?

Sorry, but you seem to have this assumption that the CPU doesn't really matter. It does, and not just in the so-called outliers.

View: https://www.youtube.com/watch?v=98RR0FVQeqs
 

JamesJones44

Reputable
Jan 22, 2021
851
779
5,760
That only happens on the Asus Hero and/or other Asus mobos. It's not the CPU that's doing it; Asus mingled the CPU VRMs into the ATX 24-pin.

This is taken from an MSI mobo, as explained in their review, precisely because the Hero does its weird shenanigans. So both Tom's and TPU agree that overall power draw in gaming is lower than the 9950X's. Which is good, no?

[Attached image: gaming power-draw comparison chart]
Given the lack of performance specifically for gaming, I would say not good. The average, at a glance, is about 20 watts. If you played games for 4 hours a day, that's only 80 Wh a day; if you played every day for 365 days, it would only be about 30 kWh for the full year (rounded up). Even in crazy places like CA and HI, that would only be a $15 savings a year ($0.50 per kWh). At the US average it's only a $6-a-year savings ($0.19 per kWh).

It's better than where Intel was, and I'm happy they are getting competitive on that front, but Arrow Lake is a hard sell, gaming-performance-wise, at prices that are much higher than its AMD or Raptor Lake counterparts.
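If anyone wants to rerun that math with their own numbers, here's a quick sketch; the 20 W gap, 4 hours a day, and the two electricity rates are just the assumptions above:

Code:
# Rough annual cost of a ~20 W average power-draw gap while gaming.
watts_saved = 20        # average gaming power difference (W), eyeballed from the chart
hours_per_day = 4       # assumed gaming hours per day
days_per_year = 365

kwh_per_year = watts_saved * hours_per_day * days_per_year / 1000
print(f"~{kwh_per_year:.0f} kWh per year")  # ~29 kWh

for region, rate in {"CA/HI": 0.50, "US average": 0.19}.items():
    print(f"{region}: ~${kwh_per_year * rate:.0f} per year at ${rate:.2f}/kWh")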
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
So I don't see the issue you raised, then. Who cares if the CCD is auto-disabled if it gives better performance? Even if it loses to the 9800X3D, it will still trounce this CPU and the 14900K.

You are literally going against the entire industry in your view of this CPU. You have literally gone to extraordinary lengths to argue with everyone here to prove that this CPU is "the GOAT." You can accuse me of all the bias you like, but no one is buying what you are selling, so I guess knock yourself out, mate.
Who cares? Whoever wants to actually use the 16-core CPU they paid for as a 16-core. What kind of a silly question is that, man?

I don't care if I'm going against the industry; my thesis is backed up by the data. Also, your thesis seems to be based on future, as-yet-unreleased CPUs, just so you won't have to admit that, here and now, it's the GOAT. As I've said before, when the 9950X3D is released and tested, we will see what's what. Until then, 285K, baby.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Given the lack of performance specifically for gaming, I would say not good. The average, at a glance, is about 20 watts. If you played games for 4 hours a day, that's only 80 Wh a day; if you played every day for 365 days, it would only be about 30 kWh for the full year (rounded up). Even in crazy places like CA and HI, that would only be a $15 savings a year ($0.50 per kWh). At the US average it's only a $6-a-year savings ($0.19 per kWh).

It's better than where Intel was, and I'm happy they are getting competitive on that front, but Arrow Lake is a hard sell, gaming-performance-wise, at prices that are much higher than its AMD or Raptor Lake counterparts.
Funny that you only hear these types of arguments when AMD is losing in efficiency, though.

There is no lack of gaming performance, btw; it performs identically to its competitor, the 9950X. The only thing is, it does it at much lower power.
 

JamesJones44

Reputable
Jan 22, 2021
851
779
5,760
There is no lack of gaming performance, btw; it performs identically to its competitor, the 9950X. The only thing is, it does it at much lower power.
Who is going to care about $6 to $15 a year (which is likely much less than that) when it costs $100 more than a 14900K and can't beat it in gaming performance? It's on par with a 12900K in many of the gaming benchmarks and is $200 more.

The 285K is simply not a competitive product for gaming; neither is the 9950X, for that matter. I don't know why anyone building a gaming rig would buy the 285K or 9950X when the 7800X3D is the clear efficiency, FPS, and cost option.
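For context, here's a quick sketch of the payback math, assuming the $100 price premium and the $6-$15 per year savings range from above:

Code:
# Years for the power savings to offset an assumed $100 price premium.
price_premium = 100.0
for label, annual_savings in {"CA/HI (~$15/yr)": 15.0, "US average (~$6/yr)": 6.0}.items():
    print(f"{label}: ~{price_premium / annual_savings:.1f} years to break even")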
 

USAFRet

Titan
Moderator
After reading the almost 150 comments in here, all I see is a lot of "Yes, its good" or "No, it sux". Even....'the worstest evar!'
Without a lot to back it up.

Building a new PC in the next couple of months, the Ultra 7 265k at the top of my short list.

Convince me why this is a bad idea or a good idea....

Conditions:
1. I don't really care about no upgrade path for this socket. By the time I need a new/better CPU, I'm changing the whole thing anyway.

2. Not gaming. CAD/photo/video/programming.

3. Probably paired with a 4070 variant and 64GB RAM.


Convince me.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
After reading the almost 150 comments in here, all I see is a lot of "Yes, its good" or "No, it sux". Even....'the worstest evar!'
Without a lot to back it up.

Building a new PC in the next couple of months, the Ultra 7 265k at the top of my short list.

Convince me why this is a bad idea or a good idea....

Conditions:
1. I don't really care about no upgrade path for this socket. By the time I need a new/better CPU, I'm changing the whole thing anyway.

2. Not gaming. CAD/photo/video/programming.

3. Probably paired with a 4070 variant and 64GB RAM.


Convince me.
If you don't care about power draw at all, the 13700K would be a better buy at a very similar price point while being decently faster. Other than that, there's not much to convince you of; the 265K is awesome.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Who is going to care about $6 to $15 a year (which is likely much less than that) when it costs $100 more than a 14900K and can't beat it in gaming performance? It's on par with a 12900K in many of the gaming benchmarks and is $200 more.
Who's going to care? The hordes of people crying about how inefficient Intel has been for the last two years, probably. They sure seemed to care before. Now that Intel has the crown, I guess it's only $6 to $15 a year and it doesn't matter, lol.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
The 285K is simply not a competitive product for gaming; neither is the 9950X, for that matter. I don't know why anyone building a gaming rig would buy the 285K or 9950X when the 7800X3D is the clear efficiency, FPS, and cost option.
Because I'll see a 0% improvement in my gaming by going for the 7800X3D, since my GPU is a huge bottleneck anyway, but I'll notice that the 285K is up to 250% faster in other workloads.
 

rluker5

Distinguished
Jun 23, 2014
901
574
19,760
According to this, the 285K barely edges out the 9950X, but the 245K is no match for the 7000X3D crew.
[Attached image: gaming performance comparison chart]
This bodes well for the 9000X3D models.
You can raise the efficiency a lot by reducing the clocks a little.
I would be willing to lose a couple frames in gaming for the gain of efficiency and cooling...
I've got a good tip on doing that for free with your 13700k.
1. Copy/paste the options you want from the following list into a CMD, PowerShell, or Terminal window:
powercfg -attributes SUB_PROCESSOR 75b0ae3f-bce0-45a7-8c89-c9611c25e100 -ATTRIB_HIDE (max frequency e-core)
powercfg -attributes SUB_PROCESSOR 75b0ae3f-bce0-45a7-8c89-c9611c25e101 -ATTRIB_HIDE (max frequency p-core)
powercfg -attributes SUB_PROCESSOR bc5038f7-23e0-4960-96da-33abaf5935ec -ATTRIB_HIDE (maximum processor state e-core)
powercfg -attributes SUB_PROCESSOR bc5038f7-23e0-4960-96da-33abaf5935ed -ATTRIB_HIDE (maximum processor state p-core)
powercfg -attributes SUB_PROCESSOR 893dee8e-2bef-41e0-89c6-b55d0929964c -ATTRIB_HIDE (minimum processor state e-core)
powercfg -attributes SUB_PROCESSOR 893dee8e-2bef-41e0-89c6-b55d0929964d -ATTRIB_HIDE (minimum processor state p-core)

but not the part in parentheses, as that is just my added description.
These unhide hidden controls in the Windows power plans. The max frequency setting is the most relevant one for power savings, as it caps the maximum frequency your processor will run at in Windows, so long as it is equal to or less than the maximum set in your BIOS.
2. Create one or more new power plans and set a maximum frequency that is lower than your max BIOS frequency. I like to name each plan after the max frequency it contains. Note that often, and particularly on overclocked systems, the frequency number you type in won't match the frequency the processor actually runs at, so you will just have to load the CPU (I like CPU-Z because it's easy) and monitor it (my favorite is HWiNFO) to see which number gets you the desired frequency.
For example, on my PC, for 5.0 GHz p-core and 3.8 GHz e-core I have to type in 5050 MHz for the p-cores and 5200 MHz for the e-cores, and for 4 GHz p and 3.1 GHz e I have to enter 4050 for the p-cores and 4300 for the e-cores. And, for sloppy Windows reasons, the Windows power plan calls p-cores "power efficiency class 1" while e-cores are just the default.
When you create the clock limits in your power plans, you have to drop both the p-cores and the e-cores to get the voltage to drop properly, which is where almost all of your power savings will come from. I like to set my p-cores to the desired clocks, load the system while monitoring voltage, and reduce the e-cores until the voltage stops going down.
3. When you are playing a game that is not CPU-bound, Alt-Tab (or hit the Windows key) to get to the Control Panel, choose a power plan with reduced clocks, and watch your CPU power consumption plummet.
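And if you'd rather script the plan switch than dig through the Control Panel, here's a minimal sketch; the "capped" GUID below is a placeholder, so substitute whatever powercfg /list prints for your own plans:

Code:
import subprocess

# Replace these with the GUIDs "powercfg /list" shows on your machine.
PLANS = {
    "full speed": "381b4222-f694-41f0-9685-ff5bb260df2e",   # Windows' built-in Balanced plan
    "capped": "00000000-0000-0000-0000-000000000000",       # placeholder for your reduced-clock plan
}

def set_plan(name):
    """Activate a Windows power plan by name via powercfg."""
    subprocess.run(["powercfg", "/setactive", PLANS[name]], check=True)

set_plan("full speed")  # e.g. switch back to full clocks after gaming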

Here's an old video of me doing this stuff with the 12700K I used to have, which is slower and less efficient than your 13700K. Edit: but don't use the Power Saver plan anymore; Windows has messed that up, and I at least get increased stuttering in games with a Power Saver-based power plan.
View: https://youtu.be/3t4MSa8H5Bg

And if you care, under the spoiler is a video where I tuned that CPU to the minimum power it needed to run 60fps in benchmarks.
 

JamesJones44

Reputable
Jan 22, 2021
851
779
5,760
Who's going to care? The hordes of people crying about how inefficient intel is for the last 2 years probably. They sure seemed to care before. Now that Intel has the crown, I guess it's only 6 to 15$ a year and it doesn't matter, lol.
Then again, I ask, for gaming, why would one pick the 285K over the 7800x3d?
 

JamesJones44

Reputable
Jan 22, 2021
851
779
5,760
Because I'll see a 0% improvement in my gaming by going for the 7800X3D, since my GPU is a huge bottleneck anyway, but I'll notice that the 285K is up to 250% faster in other workloads.
The entire original post on this subject was about gaming efficiency, for which neither the 285K nor the 9950X wins in any gaming category, so I have no idea what point is being made on that front.
 
