hdallatorre committed 22490da (parent: 22e3875): Update README.md

README.md
The Nucleotide Transformers are a collection of foundational language models that were pre-trained on DNA sequences from whole genomes. Compared to other approaches, our models not only integrate information from single reference genomes, but also leverage DNA sequences from over 3,200 diverse human genomes, as well as 850 genomes from a wide range of species, including model and non-model organisms. Through robust and extensive evaluation, we show that these large models provide extremely accurate molecular phenotype prediction compared to existing methods.

Part of this collection is the **nucleotide-transformer-2.5b-1000g**, a 2.5B-parameter transformer pre-trained on a collection of 3,202 genetically diverse human genomes. The model is available in both TensorFlow and PyTorch.

**Developed by:** InstaDeep, NVIDIA and TUM

### How to use

<!-- Need to adapt this section to our model. Need to figure out how to load the models from huggingface and do inference on them -->

Until its next release, the `transformers` library needs to be installed from source with the following command in order to use the models:

```bash
pip install --upgrade git+https://github.com/huggingface/transformers.git
```

A small snippet of code is given below to retrieve both logits and embeddings from a dummy DNA sequence.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch
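
# NOTE: the remainder of this snippet is a minimal sketch. The Hub repo id below
# is assumed from the model name and may need adjusting.
model_id = "InstaDeepAI/nucleotide-transformer-2.5b-1000g"  # assumed checkpoint path

# Load the tokenizer and the masked-language model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Tokenize a dummy DNA sequence
sequences = ["ATTCCGATTCCGATTCCG"]
tokens_ids = tokenizer.batch_encode_plus(sequences, return_tensors="pt")["input_ids"]

# Forward pass; request hidden states so embeddings can be extracted
attention_mask = tokens_ids != tokenizer.pad_token_id
with torch.no_grad():
    outs = model(
        tokens_ids,
        attention_mask=attention_mask,
        output_hidden_states=True,
    )

# Logits over the token vocabulary at each position
logits = outs.logits

# Per-token embeddings from the last hidden layer, mean-pooled over
# non-padding tokens to get one embedding per sequence
embeddings = outs.hidden_states[-1]
mask = attention_mask.unsqueeze(-1)
mean_embeddings = (embeddings * mask).sum(dim=1) / mask.sum(dim=1)

print(f"Logits shape: {logits.shape}")
print(f"Mean embeddings shape: {mean_embeddings.shape}")
```

Mean-pooling is just one reasonable way to collapse per-token embeddings into a single per-sequence vector; the per-token embeddings remain available directly in `embeddings`.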