
This model was trained entirely on historical data up to the cutoff date 31-12-2012. The training data comes from the WMT News Crawl dataset (https://data.statmt.org/news-crawl/en/) and Wikipedia. The exact training dataset for this model is available on Hugging Face at "TiMa/TiMaGPT2-2012".
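The checkpoint can be loaded like any GPT-2-style causal language model. A minimal usage sketch, assuming the Hugging Face `transformers` library; the `generate_text` helper and the example prompt are illustrative, not part of the official release:

```python
# Minimal sketch: load the 2012-cutoff checkpoint and generate a continuation.
# Assumes the `transformers` library is installed; the repo id is from this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TiMa/TiMaGPT2-2012"


def generate_text(prompt: str, max_new_tokens: int = 30) -> str:
    """Continue `prompt` with the time-stratified model (downloads weights on first use)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the model saw no text written after 2012, it is suited to experiments that require a language model free of information leakage from later years.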

Please cite the following paper when using this model in any downstream application:

```bibtex
@inproceedings{drinkall-tima-2024,
    title = "Time Machine GPT",
    author = "Drinkall, Felix and Zohren, Stefan and Pierrehumbert, Janet",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    publisher = "Association for Computational Linguistics"
}
```
