• barsoap@lemm.ee
    11 months ago

    Tough luck running any code published by the people who put out models; it’s research-grade software in every sense of the word. “Works on my machine” and “the source is the configuration file” kind of thing.

    Get yourself ComfyUI; they’re always very fast when it comes to supporting new stuff, and the thing is generally faster and easier on VRAM than A1111. The prerequisite is torch (the Python package) enabled with CUDA (Nvidia) or ROCm (AMD) or whatever Intel uses. Fair warning: getting ROCm to run on cards that aren’t officially supported is an adventure in itself. I’m still on torch-1.13.1+rocm5.2 because newer builds just won’t work: the GPU I’m telling ROCm I have (so that it runs in the first place) supports instructions that my actual GPU doesn’t, and newer builds started using them.
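
    For anyone attempting the same, the setup usually looks something like the sketch below. This is a hedged example, not a recipe: `HSA_OVERRIDE_GFX_VERSION` is the environment variable the ROCm/HSA runtime checks for the “pretend I’m a supported GPU” trick described above, but the exact version string depends on your actual card (`10.3.0` here is just a common choice for RDNA2-family cards and may be wrong for yours), and wheel names on the PyTorch index vary by Python version.

    ```shell
    # Pin the old torch+ROCm build mentioned above; check pytorch.org
    # for the exact wheel matching your Python version.
    pip install torch==1.13.1+rocm5.2 \
        --extra-index-url https://download.pytorch.org/whl/rocm5.2

    # Tell the ROCm/HSA runtime to report an officially supported gfx
    # target so torch runs at all on an unsupported card. The value you
    # need depends on which card you actually have (assumption: 10.3.0
    # as a placeholder for an RDNA2-class target).
    export HSA_OVERRIDE_GFX_VERSION=10.3.0

    # Sanity check: ROCm builds of torch expose the GPU through the
    # torch.cuda API, so this should print True if the override worked.
    python -c 'import torch; print(torch.cuda.is_available())'
    ```

    If the override string is wrong for your hardware you’ll typically see crashes or garbage output rather than a clean error, which is exactly the “adventure” mentioned above.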