
This repo contains a Diffusers-format version of the PixArt-Sigma repos PixArt-alpha/pixart_sigma_sdxlvae_T5_diffusers and PixArt-alpha/PixArt-Sigma-XL-2-1024-MS, with the models loaded and saved in fp16 and bf16 formats, roughly halving their size. It is useful where download bandwidth, memory, or disk space is relatively limited, for example on a T4 Colab instance.

To use this in a Diffusers script you currently (15/04/2024) need a source distribution of Diffusers and an extra 'patch' from the PixArt-Alpha team's PixArt-sigma GitHub repo; a setup sketch follows below.
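
A minimal setup sketch, assuming the patch file lives at `scripts/diffusers_patches.py` in the PixArt-alpha/PixArt-sigma GitHub repo (the path is inferred from the import in the script below, not confirmed here):

```bash
# Install Diffusers from source; the release wheels (as of 15/04/2024)
# lack the hook that the patch below replaces.
pip install git+https://github.com/huggingface/diffusers

# Fetch the patch so `from scripts.diffusers_patches import ...` resolves.
# URL is an assumption based on the repo layout implied by that import.
mkdir -p scripts
wget -O scripts/diffusers_patches.py \
  https://raw.githubusercontent.com/PixArt-alpha/PixArt-sigma/main/scripts/diffusers_patches.py
```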

A simple Colab notebook can be found at https://github.com/Vargol/StableDiffusionColabs/blob/main/PixArt/PixArt_Sigma.ipynb

A Diffusers script looks like this:

```python
import random
import sys
import torch
from diffusers import Transformer2DModel
from scripts.diffusers_patches import pixart_sigma_init_patched_inputs, PixArtSigmaPipeline

# Check that this diffusers build exposes the patched-inputs hook (i.e. it is
# new enough), then swap in the PixArt-Sigma version from the patch file.
assert getattr(Transformer2DModel, '_init_patched_inputs', False), "Need to upgrade diffusers: pip install git+https://github.com/huggingface/diffusers"
setattr(Transformer2DModel, '_init_patched_inputs', pixart_sigma_init_patched_inputs)
# 'mps' targets Apple Silicon; use 'cuda' on NVIDIA GPUs or 'cpu' otherwise.
device = 'mps'
weight_dtype = torch.bfloat16

pipe = PixArtSigmaPipeline.from_pretrained(
    "/Vargol/PixArt-Sigma_16bit",
    torch_dtype=weight_dtype,
    variant="fp16",
    use_safetensors=True,
)

# Enable memory optimizations.
# pipe.enable_model_cpu_offload()
pipe.to(device)

prompt = "Cinematic science fiction film still.A cybernetic demon awaits her friend in a bar selling flaming oil drinks.  The barman is a huge tree being, towering over the demon"

for i in range(4):

    seed = random.randint(0, sys.maxsize)
    generator = torch.Generator(device).manual_seed(seed)

    image = pipe(prompt, generator=generator, num_inference_steps=40).images[0]
    image.save(f"pas_{seed}.png")
```
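
The repo also holds bf16 weights. Assuming they are saved under the `bf16` variant name (an assumption based on the card's mention of both formats, not confirmed here), loading them is a one-line change:

```python
# Hypothetical: load the bf16 copy of the weights instead of fp16.
# Assumes the bf16 files are stored under the "bf16" variant name.
pipe = PixArtSigmaPipeline.from_pretrained(
    "Vargol/PixArt-Sigma_16bit",
    torch_dtype=torch.bfloat16,
    variant="bf16",
    use_safetensors=True,
)
```

On very low-memory machines, replacing `pipe.to(device)` with `pipe.enable_model_cpu_offload()` keeps each submodule on the accelerator only while it is in use, at some cost in speed.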