---
base_model: openai/gpt2
language:
- en
license: mit
tags:
- text-generation-inference
- gpt2
---

# Uploaded model

- **Developed by:** edg3
- **License:** mit
- **Finetuned from model:** openai/gpt2

This model is trained on the [documentation of Godot 4.3](https://docs.godotengine.org/en/stable/index.html) as of 2024-10-07. The first training run reached 62k of 100k steps at a training loss of 0.460700. Training then completed with the V2 adjustments, bringing the training loss down to roughly below 0.28.

The plan: continue training this model on the dataset I've organised, to build a GPT-style wiki for the Godot game engine.

GPT-2 is released under a [modified MIT license](https://github.com/openai/gpt-2/blob/master/LICENSE).

**Version Notes:**

- V1: Needs substantially more training time.
- V2: A small proportion of answers are closer to correct; a larger proportion of answers introduce more inaccuracy.
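
Below is a minimal usage sketch for querying the fine-tune with the Hugging Face `transformers` library. The repository id `edg3/gpt2-godot-docs` and the example prompt are placeholder assumptions, not confirmed by this card; substitute the actual Hub id for this model.

```python
# Minimal sketch: load the fine-tune and generate a completion.
# NOTE: "edg3/gpt2-godot-docs" is a placeholder repo id (an assumption);
# replace it with this model's actual Hub id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "edg3/gpt2-godot-docs"  # hypothetical id, adjust as needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "How do I connect a signal to a method in Godot 4.3?"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; GPT-2 has a 1024-token context window.
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    top_p=0.9,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Given the V1/V2 notes above, treat generated answers as drafts and verify them against the official Godot documentation.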