>That said, bandwidth will be lacking on many of these PCs, including the Qualcomm ones. For large language models, and other tasks that rely on transformer or diffusion models, the 128-bit buses are going to hurt far more than whether or not the processing units are lacking in TOPS.
You're referring to LLMs as we know them today, not the models AI vendors are prepping for on-device use.
All players involved, including MS, Apple (rumored), Google, and Meta, are working on small AI models for devices. These obviously don't have the breadth of general knowledge that LLMs have, but hopefully they can provide depth of domain-specific knowledge. Anyway, there'll be no talk of requiring 4090-level compute or bandwidth for these.
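To put rough numbers on the bandwidth argument: token generation in a transformer is roughly bound by how fast you can stream the weights through the memory bus, so here's a quick back-of-envelope sketch. All figures are my own assumptions (a 128-bit LPDDR5X-8533 bus at ~136 GB/s, a hypothetical ~3B-parameter on-device model vs. a ~70B desktop-class LLM, both 4-bit quantized), not anyone's published spec:

```python
# Back-of-envelope sketch: decode in a decoder-only transformer is roughly
# memory-bandwidth-bound, since each generated token streams the full weight
# set through the memory bus (KV cache traffic ignored for simplicity).

def peak_tokens_per_sec(params_billion: float, bits_per_weight: int, bandwidth_gbps: float) -> float:
    """Theoretical ceiling on tokens/sec if decode is purely weight-streaming-bound.

    params_billion : model size in billions of parameters (assumed figure)
    bits_per_weight: quantization level, e.g. 4-bit (assumed figure)
    bandwidth_gbps : sustained memory bandwidth in GB/s (assumed figure)
    """
    weight_gb_per_token = params_billion * bits_per_weight / 8  # GB moved per token
    return bandwidth_gbps / weight_gb_per_token

# Assumed 128-bit LPDDR5X-8533 bus: 16 bytes/transfer * 8.533 GT/s ~= 136 GB/s
BUS_GBPS = 136

print(f"3B  @ 4-bit: ~{peak_tokens_per_sec(3, 4, BUS_GBPS):.0f} tok/s ceiling")   # ~91 tok/s
print(f"70B @ 4-bit: ~{peak_tokens_per_sec(70, 4, BUS_GBPS):.0f} tok/s ceiling")  # ~4 tok/s
```

Under those assumptions, a small on-device model is comfortably usable on a 128-bit bus, while a desktop-class LLM is crawling on the same hardware, which is exactly why the vendors are betting on small models rather than 4090-level bandwidth.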
Win11 24H2 will just be a teaser, judging by the vagueness of the AI features promised in the MS presentation. The real launch will come with Win12, in '25, which not coincidentally is when the large wave of AI PCs will hit.
>To me, I am only interested to see how well the Snapdragon chip works.
Ditto. I'd like to see ARM do well in Windows-land, not because I intend to buy one any time soon, but because that competition will raise all boats.
Windows has been stuck in a rut for some 20-odd years. MS has tried many a time, in its endearingly clumsy way, to get out of this rut, to no avail. Hopefully, this umpteenth time's the charm...and not just because I own MS stock!