# GPT-2 for Skript

Autocomplete your Skript code with a fine-tuned GPT-2 model.

Training loss of 0.57 after roughly 2 epochs in total.

The dataset contains about 1.2 million lines of Skript.

Inference Colab: https://colab.research.google.com/drive/1ujtLt7MOk7Nsag3q-BYK62Kpoe4Lr4PE
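
Besides the Colab, the model can be queried locally. The sketch below is a minimal example using the `transformers` library; the Hub model id `johnpaulbin/gpt2-skript-1m-v5`, the `on join:` prompt, and the sampling settings are assumptions, not confirmed by this README:

```python
# Sketch of autocompleting Skript with the fine-tuned GPT-2 checkpoint.
# Assumption: the model is published on the Hugging Face Hub as
# "johnpaulbin/gpt2-skript-1m-v5"; adjust the id if it differs.

def truncate_completion(generated: str, prompt: str) -> str:
    """Strip the prompt prefix and keep only the first completed line."""
    tail = generated[len(prompt):]
    lines = tail.splitlines()
    return lines[0].strip() if lines else ""

def demo() -> None:
    # Requires: pip install transformers torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    model_id = "johnpaulbin/gpt2-skript-1m-v5"  # assumed Hub id
    tokenizer = GPT2Tokenizer.from_pretrained(model_id)
    model = GPT2LMHeadModel.from_pretrained(model_id)

    prompt = "on join:\n"  # a Skript event header to complete
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    output = model.generate(
        input_ids,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(output[0], skip_special_tokens=True)
    print(prompt + "\t" + truncate_completion(text, prompt))
```

Calling `demo()` downloads the checkpoint from the Hub and prints one sampled completion; `truncate_completion` keeps only the first generated line so the output stays a single Skript statement.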