alibaba-pai/Z-Image-Fun-Lora-Distill


Model Card

Name: Z-Image-Fun-Lora-Distill-8-Steps.safetensors
Description: A distill LoRA for Z-Image that distills both the sampling steps and CFG. The model does not require CFG and uses 8 steps for inference.





Model Features

  • This is a distill LoRA for Z-Image that distills both the sampling steps and CFG. It does not use any Z-Image-Turbo-related weights and is trained from scratch. It is compatible with other Z-Image LoRAs and control models (a quick way to inspect the weight file is sketched after this list).
  • This LoRA slightly reduces output quality and changes the output composition of the model. For concrete comparisons, please refer to the Results section. In most cases the distill LoRA performs well; at present, its main issue is that it can make generated results brighter.
  • The goal of this model is to provide fast-generation compatibility for Z-Image derivative models, not to replace Z-Image-Turbo.
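
As a quick sanity check (a minimal sketch, not part of the official instructions), the tensor names in the downloaded file can be listed to confirm it contains only low-rank adapter weights:

# Minimal sketch: print a few tensor names and shapes from the LoRA file.
# Assumes the file has already been downloaded locally and torch is installed.
from safetensors import safe_open

with safe_open("Z-Image-Fun-Lora-Distill-8-Steps.safetensors", framework="pt") as f:
    for name in sorted(f.keys())[:10]:  # a handful of keys is enough to see the LoRA naming
        print(name, f.get_slice(name).get_shape())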





TODO

  • Optimize the output brightness;
  • Train a 4-step LoRA.





Results





Working alone

Four side-by-side comparisons: output at 25 steps vs. output at 8 steps (images not reproduced in this text version).





Work with ControlNet

Comparisons for control-guided generation (images not reproduced in this text version):

  • Pose + Inpaint: output at 25 steps vs. output at 8 steps (two examples)
  • Pose: output at 25 steps vs. output at 8 steps (two examples)
  • Canny: output vs. output at 8 steps
  • Depth: output vs. output at 8 steps





Inference

See the VideoX-Fun repository for more details.

Please clone the VideoX-Fun repository and create the required directories:

# Clone the code
git clone https://github.com/aigc-apps/VideoX-Fun.git

# Enter the VideoX-Fun directory
cd VideoX-Fun

# Create model directories
mkdir -p models/Diffusion_Transformer
mkdir -p models/Personalized_Model

Then download the weights into models/Diffusion_Transformer and models/Personalized_Model. The expected layout is shown below, followed by a download sketch.

📦 models/
├── 📂 Diffusion_Transformer/
│   └── 📂 Z-Image/
├── 📂 Personalized_Model/
│   ├── 📦 Z-Image-Fun-Lora-Distill-8-Steps.safetensors
│   ├── 📦 Z-Image-Fun-Controlnet-Union-2.1.safetensors
│   └── 📦 Z-Image-Fun-Controlnet-Union-2.1-lite.safetensors
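
For example, the distill LoRA itself can be fetched with the huggingface_hub Python API (a minimal sketch; the repository ID is this model card's repository, and the base Z-Image and ControlNet-Union weights must be downloaded from their own repositories):

# Minimal sketch: download the distill LoRA into models/Personalized_Model.
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="alibaba-pai/Z-Image-Fun-Lora-Distill",
    filename="Z-Image-Fun-Lora-Distill-8-Steps.safetensors",
    local_dir="models/Personalized_Model",
)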

Set lora_path="Personalized_Model/Z-Image-Fun-Lora-Distill-8-Steps.safetensors" in examples/z_image_fun/predict_t2i_control_2.1.py and examples/z_image_fun/predict_i2i_inpaint_2.1.py.
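
For orientation, the change amounts to something like the following; apart from lora_path, the variable names and values below are assumptions reflecting the card's "8 steps, no CFG" description, so check the actual scripts for the settings they expose:

# Hypothetical excerpt of the prediction script's configuration section.
lora_path = "Personalized_Model/Z-Image-Fun-Lora-Distill-8-Steps.safetensors"
num_inference_steps = 8  # the distill LoRA targets 8-step inference
guidance_scale = 1.0     # CFG is distilled into the LoRA, so no classifier-free guidance is needed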

Then run examples/z_image_fun/predict_t2i_control_2.1.py and examples/z_image_fun/predict_i2i_inpaint_2.1.py.

