---
title: README
emoji: 🌍
colorFrom: indigo
colorTo: indigo
sdk: static
pinned: false
---
# ZeroGPU Spaces
<img src="https://cdn-uploads.huggingface.co/production/uploads/5f17f0a0925b9863e28ad517/0IX8YyC6P5stLQfpLp8ns.gif" style="width:100%;"/>
*ZeroGPU* is a new kind of hardware for Spaces.
It has two goals:
- Provide **free GPU access** for Spaces
- Allow Spaces to run on **multiple GPUs**

This is achieved by making Spaces hold and release GPUs efficiently, as needed
(as opposed to a classical GPU Space, which holds exactly one GPU at any point in time).
# Compatibility
*ZeroGPU* Spaces should mostly be compatible with any PyTorch-based GPU Space.<br>
Compatibility with high-level HF libraries like `transformers` or `diffusers` is more strongly guaranteed.<br>
That said, ZeroGPU Spaces are not as broadly compatible as classical GPU Spaces, and you might still encounter unexpected bugs.
Also, for now, ZeroGPU Spaces only work with the **Gradio SDK**.
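
As a rough illustration of that compatibility, a typical `transformers` text-generation Space needs little more than the `@spaces.GPU` decorator described in the Usage section below. The model id in this sketch is an illustrative assumption, not something prescribed by ZeroGPU:

```python
# Illustrative sketch only: a transformers text-generation pipeline in a
# ZeroGPU Space. The model id ("gpt2") is an assumption; any PyTorch-based
# checkpoint would be handled the same way.
import spaces
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation", model="gpt2", torch_dtype=torch.float16, device="cuda"
)

@spaces.GPU
def generate(prompt):
    # The GPU is attributed only while this function runs.
    return generator(prompt, max_new_tokens=64)[0]["generated_text"]
```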
# Usage
To make your Space work with ZeroGPU, you need to **decorate** the Python functions that actually require a GPU with `@spaces.GPU`.<br>
While a decorated function is invoked, the Space will be attributed a GPU, and it will release it when the function completes.<br>
Here is a practical example:
```diff
+import spaces
import gradio as gr
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(...)
pipe.to('cuda')

+@spaces.GPU
def generate(prompt):
    return pipe(prompt).images

gr.Interface(
    fn=generate,
    inputs=gr.Text(),
    outputs=gr.Gallery(),
).launch()
```
1. We first `import spaces` (importing it first might prevent some issues but is not mandatory)
2. Then we decorate the `generate` function by adding a `@spaces.GPU` line before its definition
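
For reference, after applying both changes, the complete Space file might look like the following sketch; the concrete model id and the `torch_dtype` setting are illustrative assumptions, not part of the original example:

```python
# Sketch of a full ZeroGPU Space after applying the diff above.
# The model id is an assumption; use any diffusers-compatible checkpoint.
import spaces
import gradio as gr
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
)
pipe.to('cuda')

@spaces.GPU
def generate(prompt):
    # The Space holds a GPU only while this function executes.
    return pipe(prompt).images

gr.Interface(
    fn=generate,
    inputs=gr.Text(),
    outputs=gr.Gallery(),
).launch()
```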
# Early access
Feel free to join this organization if you want to try ZeroGPU as a Space author. ✋ We should accept you shortly after checking your HF profile.