
John Smith (John6666) PRO

AI & ML interests: None yet


Organizations: open/acc, FashionStash Group meeting

Posts (3)

Post
@victor @not-lain There has been a sudden and unusual outbreak of spam posts on the HF Forum that appear to be aimed at relaying online videos and commenting on them. For some reason it also spans multiple languages. I've flagged it as well, but I'm not sure the staff will be able to keep up with manual measures going forward.
Post
@victor Sorry for posting repeatedly.

I'm not sure if Posts is the right place to report this kind of error, but it seems to be a server error unrelated to the Zero GPU Space error from the other day, and I don't know where else to report it.

Since this morning, I have been getting a strange error when running inference from a Space built with Gradio 3.x.
Yntec (https://huggingface.co/Yntec) discovered it, but he does not have a Pro subscription, so I am reporting it on his behalf.

The error message is shown below. (Note: 1girl and other common prompts will return cached output, so experiment with unusual prompts to reproduce it.)

Thank you in advance.

John6666/blitz_diffusion_error
John6666/GPU-stresser-t2i-error
ValueError: Could not complete request to HuggingFace API, Status Code: 500, Error: unknown error, Warnings: ['CUDA out of memory. Tried to allocate 30.00 MiB (GPU 0; 14.75 GiB total capacity; 1.90 GiB already allocated; 3.06 MiB free; 1.95 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF', 'There was an inference error: CUDA out of memory. Tried to allocate 30.00 MiB (GPU 0; 14.75 GiB total capacity; 1.90 GiB already allocated; 3.06 MiB free; 1.95 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF']
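For context, here is a minimal sketch of the kind of call such Spaces make, assuming the common Gradio 3.x pattern of wrapping a Hub model with gr.Interface.load (this is not the actual Space code, and the model id is only a placeholder). The ValueError above is what the client sees when the hosted Inference API responds with a 500:

```python
import gradio as gr

# Minimal sketch, assuming the usual Gradio 3.x multi-model Space pattern:
# gr.Interface.load wraps a Hub model and routes generation requests
# through the hosted Inference API.
model = gr.Interface.load("models/Yntec/example-model")  # placeholder repo id

# Calling the loaded interface is where the ValueError above surfaces when
# the Inference API answers with a 500 (here, a server-side CUDA OOM).
image = model("an unusual prompt, to avoid cached outputs")
```

Note that the max_split_size_mb / PYTORCH_CUDA_ALLOC_CONF hint in the error applies to the process that owns the GPU (the Inference API backend), so there is nothing the caller can set on the client side.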