---
language: en
tags:
- t5
widget:
- text: "dekingify: "
example_title: "Translate 17th-century English to modern English"
- text: "kingify: "
example_title: "Translate modern English to 17th-century English"
---
# Kingify 2Way
This is a custom AI model that translates between modern English and 17th-century, or "King James," English in both directions.
## Details of the model
This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large), trained on a dataset of modern Bible translation verses paired with the matching King James Bible verses.
## Intended uses & limitations
Despite sharing the same language and most of the same grammatical rules, English from previous centuries can be easily misunderstood. The purpose of this model is to explore ways to understand 17th-century texts more clearly.
#### How to use
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# AutoModelForSeq2SeqLM replaces the deprecated AutoModelWithLMHead;
# for T5 both resolve to the same conditional-generation model.
tokenizer = AutoTokenizer.from_pretrained("swcrazyfan/Kingify-2Way")
model = AutoModelForSeq2SeqLM.from_pretrained("swcrazyfan/Kingify-2Way")
```
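Once loaded, the translation direction is selected with a text prefix, matching the widget examples above. A minimal generation sketch; the input sentence and decoding settings are illustrative assumptions, not the model card's official recommendation:

```python
# "kingify: "   -> modern English to King James style
# "dekingify: " -> King James style to modern English
text = "kingify: Do not be afraid, for I am with you."  # illustrative input
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=4)  # assumed decoding settings
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```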
#### Limitations and bias
- The model is trained on the King James Version of the Bible, so it will work best with Christian-style language (or even clichés).
- English spelling was not standardized until the 18th and 19th centuries. Because of this, the model often fails to recognize spellings that differ from those in the KJV.
- The model was trained on a relatively small amount of data, so it will not be as accurate as a model trained on a larger data set.
## Training data
The data used to train this model comes from the New English Translation (NET) and the King James Version (KJV) of the Bible.
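The exact preprocessing is not published. The sketch below shows one plausible way aligned verse pairs could be expanded into two-way training examples using the task prefixes from the widget configuration; the verse texts here are stand-ins, not actual dataset rows:

```python
# Illustrative aligned pair (stand-ins, not actual NET/KJV dataset rows).
modern = "Do not be afraid, for I am with you."
kjv = "Fear thou not; for I am with thee."

# Each aligned pair can yield one training example per direction,
# selected at inference time by the same prefix.
examples = [
    {"input": f"kingify: {modern}", "target": kjv},
    {"input": f"dekingify: {kjv}", "target": modern},
]
```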
## Training procedure
The model was trained on Kaggle using the Hugging Face Transformers library.
### Training hyperparameters
The following hyperparameters were used during training:
- num_train_epochs: 4
- learning_rate: 5e-04
- train_batch_size: 2
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
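The hyperparameters above map directly onto the Transformers `Seq2SeqTrainingArguments`. Below is a minimal training sketch under that assumption; the output path and the tiny inline dataset (formatted as in the pairing sketch above) are illustrative, and the actual training script is not published:

```python
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-large")
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-v1_1-large")

# Tiny illustrative dataset in the prefixed format sketched earlier;
# the real training set is NET/KJV verse pairs.
pairs = [
    {"input": "kingify: Do not be afraid.", "target": "Fear not."},
    {"input": "dekingify: Fear not.", "target": "Do not be afraid."},
]

def tokenize(batch):
    enc = tokenizer(batch["input"], truncation=True)
    enc["labels"] = tokenizer(text_target=batch["target"], truncation=True)["input_ids"]
    return enc

train = Dataset.from_list(pairs).map(
    tokenize, batched=True, remove_columns=["input", "target"]
)

# Hyperparameters from the list above; Adam betas/epsilon match the stated values.
args = Seq2SeqTrainingArguments(
    output_dir="kingify-2way",  # hypothetical output path
    num_train_epochs=4,
    learning_rate=5e-4,
    per_device_train_batch_size=2,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```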
## Eval results
The model was evaluated with a blind human test: an evaluator rated the translation quality of sentences without being told which were produced by the model and which were written by a human.
## BibTeX entry and citation info
```bibtex
@misc{kaufmann2022kingify,
title={Kingify 2Way},
author={Joshua Kaufmann},
year={2022},
url={https://huggingface.co/swcrazyfan/Kingify-2Way-T5-Large-v1_1}
}
``` |