gpt2-brainrot is a version of gpt2 fine-tuned on data I pulled from r/brainrot and r/skibiditoilet. Its purpose is to generate ridiculous text full of internet slang such as:
- skibidi
- gyat
- hawk tuah
- fanum tax
(obviously not an exhaustive list, just a few examples; a quick usage sketch follows).
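Since this is a standard GPT-2 causal language model, it can be loaded with the transformers pipeline API. A minimal generation sketch; the prompt and sampling settings here are illustrative, not tuned:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an ordinary text-generation pipeline.
generator = pipeline("text-generation", model="real-chatgpt/gpt2-brainrot")

# Illustrative prompt and sampling settings; adjust to taste.
outputs = generator(
    "skibidi toilet",
    max_new_tokens=50,
    do_sample=True,
    temperature=0.9,
)
print(outputs[0]["generated_text"])
```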
The model has 124,442,112 parameters. The "about 7.4 million more than gpt2" comparison only holds against the 117M figure originally reported for GPT-2; measured against the actual gpt2 base checkpoint (124,439,808 parameters), the difference is just 2,304, which would correspond to three extra tokens in the 768-wide embedding table.
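A quick way to check these counts yourself, assuming both checkpoints load with the transformers library:

```python
from transformers import AutoModelForCausalLM

# Count parameters by summing tensor sizes; expected values per the text above.
finetuned = AutoModelForCausalLM.from_pretrained("real-chatgpt/gpt2-brainrot")
base = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

print(sum(p.numel() for p in finetuned.parameters()))  # 124,442,112
print(sum(p.numel() for p in base.parameters()))       # 124,439,808
```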
Model tree for real-chatgpt/gpt2-brainrot
- Base model: openai-community/gpt2