
🧽 GPT-Sponge 🍍

GPT-Sponge is a language model based on GPT-Neo 1.3B and fine-tuned on SpongeBob SquarePants episode transcripts. It can generate dialogue and scene descriptions in the style of the original cartoon. For more background on how text generation works, see How 🤗 Transformers solve tasks.

🤖 Model Details

  • Model Name: GPT-Sponge 🧽
  • Base Model: GPT-Neo 1.3B 🤖
  • Training Steps: 10,000 ⏭️ (for the current v2 model)
  • Training Time: ~7 hours on 2x NVIDIA A40 ⏰ (for the current v2 model)

πŸ“ Example Outputs

Prompt: "Patrick:"

Patrick: Oh, great! Who are they? [it's revealed that the two were phoning each other with jellyfish and Patrick was holding up a bunch of phone books]. 📞📚

Prompt: "The episode starts"

The episode starts with a view of the houses of Patrick, Squidward, and SpongeBob. The screen zooms in to show a large pile of sand on the lawn of all three of them. Sandy, who is sitting on top of her rock, is covered in the sand and has a banner of letters that reads "Sandy Wanna blow some bubbles? Only 25 cents." 🌊🏖️💰
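Outputs like the ones above can be reproduced with the standard 🤗 Transformers text-generation pipeline. This is a minimal sketch: the repo id `krplt/GPT-Sponge` comes from this card, but the sampling parameters (`max_new_tokens`, `temperature`) are illustrative assumptions, not settings used during the examples above.

```python
from transformers import pipeline

# Load the model from the Hugging Face Hub (repo id taken from this card).
generator = pipeline("text-generation", model="krplt/GPT-Sponge")

# Sampling settings here are illustrative, not the card author's values.
outputs = generator(
    "Patrick:",
    max_new_tokens=60,
    do_sample=True,
    temperature=0.9,
)
print(outputs[0]["generated_text"])
```

Because `do_sample=True`, each run produces a different continuation; set a seed with `transformers.set_seed` if you need reproducible output.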

⚠️ Disclaimer

This model is intended for entertainment purposes only and should not be used for any commercial or business purposes. The output of the model may contain errors or offensive content, and I am not responsible for any consequences arising from the use of the model.

