News World's first 'thermodynamic computing chip' reaches tape out — Normal Computing's physics-based ASIC changes lanes to train more AI

Interesting, but I'd want to know a lot more about how many terms these systems can have, what the setup and typical convergence times are, what tools exist for programming or using them, and what libraries or emulators exist for experimenting with the approach.

The beauty of quantum is that entanglement is distance-invariant. Suggesting this relies on thermodynamics implies it's using some sort of entropy-based method, and it's not obvious to me how you can scale that up.
 
It's specifically for image and video AI generation. When an AI alters an image, the first step is to introduce noise, which it then progressively filters out and replaces based on the parameters it's given, the model and algorithm it uses, and its training data.
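Roughly like this toy Python sketch of the generation direction; the denoiser here is just a hypothetical placeholder standing in for a trained model, not how any real pipeline works:

```python
import numpy as np

def toy_denoiser(x, t):
    # Hypothetical stand-in: a real model would predict the noise present in x at step t.
    return 0.1 * x

def generate(shape=(64, 64), steps=50, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)      # start from pure noise
    for t in reversed(range(steps)):
        x = x - toy_denoiser(x, t)      # progressively filter the noise out
    return x

img = generate()
```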

Now that I think about it, where this would really come in handy is when it's assimilating the training data. That process works a bit like the inverse of my previous example, except it would start with an image and progressively add noise. You can imagine that using GPUs and VRAM for this is wasteful, to say the least.
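For that training direction, a minimal sketch of progressively adding noise to an image might look like this (the linear schedule is an assumption; real diffusion models use carefully chosen variance schedules):

```python
import numpy as np

def add_noise(x0, t, steps=1000, seed=0):
    # Blend a clean image x0 with Gaussian noise; larger t means more noise.
    rng = np.random.default_rng(seed)
    alpha_bar = 1.0 - t / steps            # crude stand-in for a cumulative noise schedule
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

clean = np.ones((64, 64))                  # pretend this is a training image
half_noised = add_noise(clean, t=500)      # partway through the noising process
fully_noised = add_noise(clean, t=999)     # nearly pure noise
```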

I hope that made some sense; I'm just starting to get my head around all of this gobbledygook.