This uses the Hunyuan architecture under the hood, by the way. If anyone is curious enough to try this one out, I highly suggest checking the "issues" tab on the GitHub repo and picking up the one-click SageAttention install script; you could see a ~30% speed improvement if you do.
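For anyone wondering why that script helps, here's a rough sketch (my own illustration, not the actual script's contents) of what a SageAttention patch typically does in a PyTorch pipeline: swap the stock scaled dot-product attention for sageattn's quantized kernel at the same call site. The actual one-click script handles the install and wiring for you.

```python
# Illustrative only: roughly what a SageAttention drop-in patch does.
import torch
import torch.nn.functional as F
from sageattention import sageattn  # pip install sageattention

_original_sdpa = F.scaled_dot_product_attention

def sdpa_with_sage(query, key, value, attn_mask=None, dropout_p=0.0,
                   is_causal=False, scale=None):
    # sageattn doesn't handle masks, dropout, or a custom scale,
    # so fall back to the stock kernel in those cases.
    if attn_mask is not None or dropout_p != 0.0 or scale is not None:
        return _original_sdpa(query, key, value, attn_mask=attn_mask,
                              dropout_p=dropout_p, is_causal=is_causal)
    # Inputs are (batch, heads, seq_len, head_dim), same layout SDPA uses ("HND").
    return sageattn(query, key, value, tensor_layout="HND", is_causal=is_causal)

# Monkey-patch so every attention call in the model picks up the faster kernel.
F.scaled_dot_product_attention = sdpa_with_sage
```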
Even so, the...