Add link to paper

#3 opened by nielsr (HF staff)
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -17,7 +17,7 @@ GottBERT is the first German-only RoBERTa model, pre-trained on the German porti
  - **Large Model**: 24 layers, 355 million parameters
  - **License**: MIT
 
- ---
+ This was presented in [GottBERT: a pure German Language Model](https://huggingface.co/papers/2012.02110).
 
  ## Pretraining Details