News: Leakers suggest AMD Strix Halo reviews dropping tomorrow — Asus ROG Flow Z13 launches February 25

I expected $3k for anything with the full 128 GB of memory, so $2,700 is at least under that. 128 GB is the amount you want for AI work.

I think the 64 GB models could suffer in the market because it's not enough memory for the LLMs people want to run.

Then when you get down to the 32 GB, 8-core, 32 CU configuration, pricing will be everything in determining whether it matters compared to Nvidia dGPU gaming laptops or other mini PC options, wherever it ends up. That is enough memory for gaming, though; you could split it 20 GB RAM / 12 GB "VRAM".
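To put rough numbers on that (these are my own back-of-the-envelope assumptions: roughly 4.5 bits per weight for a Q4-class quant, a few GB for KV cache and runtime overhead, and some headroom left for the OS and apps on each tier), this is the kind of math I'm doing:

```python
# Rough fit check: which dense models fit in which unified-memory tier.
# All constants below are assumptions, not AMD or Asus specs.

def model_footprint_gb(params_b: float, bits_per_weight: float = 4.5,
                       kv_cache_gb: float = 4.0, overhead_gb: float = 2.0) -> float:
    """Weights at the given quantization plus KV cache and runtime overhead."""
    weights_gb = params_b * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + kv_cache_gb + overhead_gb

# Usable memory after the OS and the CPU/GPU split take their cut (my guesses).
usable_gb = {"32 GB machine": 24, "64 GB machine": 52, "128 GB machine": 110}

for params_b in (8, 32, 70, 123):
    need = model_footprint_gb(params_b)
    fits = [name for name, gb in usable_gb.items() if need <= gb]
    print(f"{params_b:>3}B model @ ~4-bit: ~{need:.0f} GB -> fits: {fits or 'none'}")
```

The 128 GB tier is what lets you load the 100B+ models (or run less aggressive quantizations) that simply won't fit in 64 GB.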

I remember there is at least one tablet with Strix Halo in it; that should be interesting.
 
Crazy that it has taken AMD this long to make a purchasable consumer-grade APU like this when they have been doing it for the last several console generations.

Hoping to scoop up a mini PC for something that isn't stupid expensive.
Barebones mini PC for under $1k would be great.
 
Nah, 64 gigs is the sweet spot. Heck, if the pooled memory works the same way as Apple Silicon, where the CPU and GPU not only dynamically allocate memory but also share direct access to the same memory addresses, then even 32 GB is going to be enough. This avoids the inefficiency of copying model data between separate memory pools, allowing LLM inference to run way faster, since transfer speed is often a bottleneck.

If you want to run a larger model than that, you're going to get unusably low tok/sec anyways, making it kinda pointless. Scaling memory without scaling compute power is unbalanced.
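Here's the back-of-the-envelope version of that tok/sec argument. The bandwidth and efficiency numbers are my assumptions for quad-channel LPDDR5X, not measured Strix Halo figures:

```python
# Single-user decode is mostly memory-bandwidth-bound: each generated token has
# to stream essentially all of the active weights through the memory bus, so
# tokens/second tops out around bandwidth / weight_footprint.

BANDWIDTH_GB_S = 256   # assumed theoretical peak for quad-channel LPDDR5X-8000
EFFICIENCY = 0.6       # assumed fraction of peak actually sustained

def decode_ceiling_tok_s(weight_footprint_gb: float) -> float:
    """Optimistic upper bound on batch-1 tokens/second for a dense model."""
    return BANDWIDTH_GB_S * EFFICIENCY / weight_footprint_gb

for gb in (5, 20, 40, 70, 110):
    print(f"~{gb:>3} GB of weights -> at most ~{decode_ceiling_tok_s(gb):.1f} tok/s")
```

Once the weights pass roughly 40-70 GB you're down in the low single digits of tok/sec, which is exactly the "why bother" zone for models that only fit in the 128 GB tier.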
 
In an interview, the product lead for this chip said that he's been pushing this type of product for a while, but senior management thought it wouldn't be viable (AMD has enough trouble selling any GPUs, let alone laptop GPUs) until Apple came along and showed it was possible for consumer markets.
 
Probably $6-7K in Australia for any high-end 395 Max systems. Prices like it's 2000 again. All for a non-upgradeable laptop that'll seem like slow junk in 3 years.
 
Weren't some of the first Strix Point 370 mini PCs $1k? It will take a long while for Strix Halo to get that cheap.

Also, it seems impossible to get it truly "barebones" if it always comes with soldered memory (no LPCAMM options seen).
 
The previous models from 2023 (GZ301) paired an i9-13900H with an RTX 2050/4050/4060/4070, and pricing was more or less the same.

The $2700 395+ 128GB version seems overkill unless you absolutely need it.
The $2200 395+ 32GB version is the sweet spot. Set it to a 12GB GPU / 20GB CPU split and it should handle 1080p (or 1440p upscaled) stuff just fine, I think.

As for other 395+ products, Sixunited is showing off their AXB35-02 mini PC equipped with a 395+ (it seems to be a little larger than an ATX PSU).
 
Sounds good for anything inference-related. That said, if training LLMs (or even fine-tuning them) is the goal, then 128 GB is probably not a workable amount of DRAM for that kind of task. Then again, one can run inference on much less than 128 GB of RAM.
So I wonder what type of use case these new APU-based designs are actually aiming at.
If someone wants to game, there are probably dedicated gaming notebooks with higher GPU performance, and AI training is probably also out of reach.
In my opinion, such a machine would most closely resemble the strengths of an Apple Silicon (M1-M4) APU, which is extremely energy-efficient at video/graphics/multimedia/content-creation tasks.
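To make the inference-vs-training gap concrete, these are the rule-of-thumb bytes-per-parameter figures I have in mind (textbook estimates, nothing measured on this hardware):

```python
# Approximate memory cost per parameter: inference only needs the weights (plus
# KV cache), while full training/fine-tuning with Adam also keeps gradients,
# fp32 master weights and two optimizer moments (~2+2+4+8 = 16 bytes/param).

BYTES_PER_PARAM = {
    "inference, 4-bit quantized": 0.5,
    "inference, fp16": 2.0,
    "full fine-tune, fp16 + Adam": 16.0,
}

TOTAL_GB = 128
for setup, bpp in BYTES_PER_PARAM.items():
    max_params_b = TOTAL_GB / bpp  # billions of parameters, ignoring activations
    print(f"{setup}: roughly a {max_params_b:.0f}B-parameter model fits in {TOTAL_GB} GB")
```

So 128 GB comfortably holds very large quantized models for inference, but a full fine-tune with Adam already tops out around the 8B class before activations are even counted, which is why I see this more as an inference and content-creation box than a training box.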
 
A barebones mini PC for under $1k? What are you dreaming of at night? xD
Seriously, there is no way these things will start under 2k. If you wanna be really charitable, make that 1.5k, but under that? Lol, nope. 4070 laptops already cost over 1k, and very often 4060 laptops do, too. Even as barebones units, this won't happen any time soon.
 
Just to ease my Linux entry, I would get this. No more messing around with GPU drivers from either AMD or Nvidia.

I looked at getting previous IGP chips, but the GPU side of the A-series was truly pitiful and a waste of time, money, and effort. If I am going to go the IGP route on an APU-type chip, I want to be able to play at more than 1024x768.

These new APUs, on the other hand, seem to be capable.

Now that chiplets are here and can be attached to other chips alongside the CCDs, I truly think the discrete GPU is on its way out.
 