Minimum requirements for inference?

#25
by Simon-arctoon - opened

I'm trying to run inference locally using https://github.com/black-forest-labs/flux?tab=readme-ov-file.
On a single A100 (48 GB), I get an OOM error.

I didn't find any recommended specs anywhere; any help would be appreciated :)
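
For context, this is roughly the fallback I'd try next: a minimal sketch using the diffusers FluxPipeline with model CPU offloading instead of the official repo. The checkpoint id, resolution, and offloading choice below are my own assumptions, not recommended settings.

```python
import torch
from diffusers import FluxPipeline

# Load FLUX.1-dev weights in bfloat16 to roughly halve memory vs. float32.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
)

# Keep only the currently active sub-model on the GPU; the rest stays in CPU RAM.
# Slower, but intended to avoid the OOM I'm hitting when everything sits in VRAM.
pipe.enable_model_cpu_offload()

image = pipe(
    "a photo of a forest at dawn",
    height=1024,
    width=1024,
    guidance_scale=3.5,
    num_inference_steps=50,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
image.save("flux_test.png")
```

Offloading should trade speed for VRAM, but I'd still like to know the recommended specs for running the official repo without it.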