• E-Reverance an hour ago

    Post it on r/StableDiffusion

    • streamer45 5 hours ago

      Rad! huggingface link gives 404 on my side though.

      • schopra909 5 hours ago

        Oh damn! Thanks for catching that -- going to ping the HF folks to see what they can do to fix the collection link.

        In the meantime here's the individual links to the models:

        https://huggingface.co/Linum-AI/linum-v2-720p

        https://huggingface.co/Linum-AI/linum-v2-360p

        • streamer45 4 hours ago

          Looks like 20GB VRAM isn't enough for the 360p demo :( need to bump my specs :sweat_smile:

          • schopra909 5 hours ago

            Should be fixed now! Thanks again for the heads up

            • streamer45 5 hours ago

              All good, cheers!

              • schopra909 4 hours ago

                Per the RAM comment, you may be able to get it to run locally with two tweaks (rough sketches below):

                https://github.com/Linum-AI/linum-v2/blob/298b1bb9186b5b9ff6...

                1) Free up the T5 text encoder as soon as the text is encoded, so you reclaim GPU RAM

                2) Manual layer offloading: move layers off the GPU once they're done being used, to free up space for the remaining layers + activations
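
                Rough sketch of (1), assuming the prompt goes through a T5 encoder via transformers -- the checkpoint name and variable names here are just placeholders, not necessarily what the repo uses:

                    import gc
                    import torch
                    from transformers import AutoTokenizer, T5EncoderModel

                    # Load whichever T5 variant the model actually uses (name below is a placeholder)
                    tok = AutoTokenizer.from_pretrained("google/t5-v1_1-xxl")
                    enc = T5EncoderModel.from_pretrained(
                        "google/t5-v1_1-xxl", torch_dtype=torch.bfloat16
                    ).to("cuda")

                    with torch.no_grad():
                        ids = tok("a red fox running through snow", return_tensors="pt").input_ids.to("cuda")
                        text_embeds = enc(ids).last_hidden_state  # keep only the embeddings

                    # Drop the encoder before loading the diffusion transformer, so its weights
                    # aren't sitting on the GPU alongside the denoiser
                    del enc
                    gc.collect()
                    torch.cuda.empty_cache()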
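
                And a toy version of (2) -- the real model's block classes (and a proper hook-based approach) will look different, but this is the gist of walking the layers and evicting each one after use:

                    import torch
                    import torch.nn as nn

                    # Stand-in for the DiT blocks; sizes and classes here are made up
                    layers = nn.ModuleList([nn.Linear(4096, 4096) for _ in range(8)])

                    def forward_with_offload(x, layers, device="cuda"):
                        # Keep only the currently-executing layer on the GPU
                        for layer in layers:
                            layer.to(device)      # stream the layer's weights in
                            x = layer(x)
                            layer.to("cpu")       # evict them once the output is computed
                        torch.cuda.empty_cache()  # release unused cached memory
                        return x

                    x = torch.randn(1, 4096, device="cuda")
                    out = forward_with_offload(x, layers)

                If the pipeline is diffusers-compatible, enable_model_cpu_offload / enable_sequential_cpu_offload do roughly this for you without the manual bookkeeping.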