Text to Image
Generate an image based on a given text prompt.
For more details about the text-to-image
task, check out its dedicated page! You will find examples and related materials.
Recommended models
- black-forest-labs/FLUX.1-dev: One of the most powerful image generation models that can generate realistic outputs.
- Kwai-Kolors/Kolors: Text-to-image model for photorealistic generation.
- stabilityai/stable-diffusion-3-medium-diffusers: A powerful text-to-image model.
Explore all available models and find the one that suits you best here.
Using the API
from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="fal-ai",
    api_key="hf_xxxxxxxxxxxxxxxxxxxxxxxx",
)

# output is a PIL.Image object
image = client.text_to_image(
    "Astronaut riding a horse",
    model="black-forest-labs/FLUX.1-dev",
)
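The call returns a standard PIL.Image object, so the usual Pillow methods apply. A minimal follow-up sketch (the filename and the printed size are only illustrations):

# Persist the generated image and inspect its dimensions
image.save("astronaut.png")
print(image.size)  # actual resolution depends on the model and parameters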
API specification
Request
| Headers | | |
| --- | --- | --- |
| authorization | string | Authentication header in the form 'Bearer hf_****', where hf_**** is a personal user access token with "Inference Providers" permission. You can generate one from your settings page. |
| Payload | | |
| --- | --- | --- |
| inputs* | string | The input text data (sometimes called "prompt") |
| parameters | object | Additional inference parameters; the fields below are nested under this key. |
| guidance_scale | number | A higher guidance scale value encourages the model to generate images closely linked to the text prompt, but values that are too high may cause saturation and other artifacts. |
| negative_prompt | string | One prompt to guide what NOT to include in image generation. |
| num_inference_steps | integer | The number of denoising steps. More denoising steps usually lead to a higher quality image at the expense of slower inference. |
| width | integer | The width in pixels of the output image. |
| height | integer | The height in pixels of the output image. |
| scheduler | string | Override the scheduler with a compatible one. |
| seed | integer | Seed for the random number generator. |
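As a rough sketch of how these fields can be supplied through the Python client: InferenceClient.text_to_image exposes the common ones as keyword arguments (whether seed and scheduler are also accepted depends on your huggingface_hub version, so check the client reference). All parameter values below are illustrative.

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="fal-ai",
    api_key="hf_xxxxxxxxxxxxxxxxxxxxxxxx",
)

# Each keyword argument below corresponds to a field of the `parameters` object above.
image = client.text_to_image(
    "Astronaut riding a horse",
    model="black-forest-labs/FLUX.1-dev",
    negative_prompt="blurry, low quality",
    guidance_scale=7.5,
    num_inference_steps=30,
    width=1024,
    height=1024,
)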
Response
| Body | | |
| --- | --- | --- |
| image | unknown | The generated image returned as raw bytes in the payload. |
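If you call the HTTP endpoint directly instead of using the client, the response body contains the raw image bytes, which you can decode with Pillow. A sketch under the assumption that you substitute the actual route for your provider and model (the placeholder URL below is not a real endpoint):

import io

import requests
from PIL import Image

# Placeholder endpoint: replace with the actual route for your provider and model.
API_URL = "https://<inference-endpoint>/<model-route>"
headers = {"Authorization": "Bearer hf_xxxxxxxxxxxxxxxxxxxxxxxx"}
payload = {
    "inputs": "Astronaut riding a horse",
    "parameters": {"guidance_scale": 7.5, "num_inference_steps": 30},
}

response = requests.post(API_URL, headers=headers, json=payload)
response.raise_for_status()

# The response body is the generated image as raw bytes.
image = Image.open(io.BytesIO(response.content))
image.save("astronaut.png")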