# Model documentation & parameters
**Language model**: Type of language model to be used.
**Prefix**: Task-specific prefix that defines the task (see the provided examples for specific tasks).
**Text prompt**: The text input to the model.
**Num beams**: Number of beams used for beam search during text generation.
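These parameters map directly onto a standard seq2seq `generate` call. Below is a minimal sketch using the Hugging Face `transformers` API; the checkpoint name and the prompt wording are assumptions chosen for illustration, not values fixed by this card.

```python
# Minimal sketch of text generation with a Text+Chem T5 checkpoint via
# Hugging Face transformers. The checkpoint name and prompt wording are
# assumptions for illustration; substitute the model/prefix you actually use.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "GT4SD/multitask-text-and-chemistry-t5-base-augm"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Task-specific prefix followed by the text prompt, as described above.
prompt = "Write in SMILES the described molecule: The molecule is an ester of acetic acid and ethanol."
inputs = tokenizer(prompt, return_tensors="pt")

# num_beams controls the beam-search width used during decoding.
outputs = model.generate(**inputs, num_beams=5, max_length=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A larger `num_beams` trades decoding speed for a wider search over candidate sequences.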
# Model card -- Text+Chem T5
**Model Details**: Text+Chem T5: a multi-domain, multi-task language model that solves a wide range of tasks in both the chemical and natural-language domains. Published by [Christofidellis et al.](https://arxiv.org/pdf/2301.12586.pdf)
**Developers**: Dimitrios Christofidellis, Giorgio Giannone, Jannis Born and Matteo Manica from IBM Research and Ole Winther from Technical University of Denmark.
**Distributors**: Model natively integrated into GT4SD.
**Model date**: 2022.
**Model type**: A Transformer-based language model trained on a multi-domain, multi-task dataset built by aggregating available datasets for five tasks: forward reaction prediction, retrosynthesis, molecular captioning, text-conditional de novo generation, and paragraph-to-actions (illustrative prompts for each task are sketched below).
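For orientation, here is a hedged sketch of what prompts for the five tasks could look like. The exact prefix strings are defined by the paper and the GT4SD examples; the wording below is an assumption for illustration only.

```python
# Illustrative prompts for the five tasks (wording assumed for illustration;
# consult the paper / GT4SD examples for the exact task prefixes).
task_prompts = {
    "forward_reaction_prediction": "Predict the product of the following reaction: CC(=O)Cl.CCO>>",
    "retrosynthesis": "Predict the reaction that produces the following product: CC(=O)OCC",
    "molecular_captioning": "Caption the following SMILES: CC(=O)OCC",
    "text_conditional_de_novo": "Write in SMILES the described molecule: an ester of acetic acid and ethanol",
    "paragraph_to_actions": "Which actions are described in the following paragraph: The mixture was stirred at 25 °C for 2 h.",
}
```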
**Information about training algorithms, parameters, fairness constraints or other applied approaches, and features**:
N.A.
**Paper or other resource for more information**: The Text+Chem T5 paper by [Christofidellis et al.](https://arxiv.org/pdf/2301.12586.pdf)
**License**: MIT
**Where to send questions or comments about the model**: Open an issue on [GT4SD repository](https://github.com/GT4SD/gt4sd-core).
**Intended Use. Use cases that were envisioned during development**: N.A.
**Primary intended uses/users**: N.A.
**Out-of-scope use cases**: Production-level inference, producing molecules with harmful properties.
**Metrics**: N.A.
**Datasets**: N.A.
**Ethical Considerations**: Unclear, please consult with original authors in case of questions.
**Caveats and Recommendations**: Unclear, please consult with original authors in case of questions.
Model card prototype inspired by [Mitchell et al. (2019)](https://dl.acm.org/doi/abs/10.1145/3287560.3287596)
## Citation
```bib
@article{christofidellis2023unifying,
  title={Unifying Molecular and Textual Representations via Multi-task Language Modelling},
  author={Christofidellis, Dimitrios and Giannone, Giorgio and Born, Jannis and Winther, Ole and Laino, Teodoro and Manica, Matteo},
  journal={arXiv preprint arXiv:2301.12586},
  year={2023}
}
```