---
license: apache-2.0
tags:
- pretrained
- mistral
- DNA
- biology
- genomics
---

# Model Card for Mistral-DNA-v0.2 (Mistral for DNA)

The Mistral-DNA-v0.2 Large Language Model (LLM) is a pretrained generative DNA text model with 17.31M parameters × 8 experts ≈ 138.5M parameters.
It is derived from the Mistral-7B-v0.1 model, which was simplified for DNA: the number of layers and the hidden size were reduced.
The model was pretrained on the human genome assembly hg38, using 10 kb DNA sequences.

For full details of this model, please see our [GitHub repository](https://github.com/raphaelmourad/Mistral-DNA).

## Model Architecture

Like Mistral-7B-v0.1, it is a transformer model, with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
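
The values behind these architecture choices can be read from the model configuration. The following is a minimal sketch; the attribute names (`num_hidden_layers`, `hidden_size`, `num_key_value_heads`, `sliding_window`) are the standard Hugging Face Mistral/Mixtral config fields and are assumptions here, so they are accessed defensively:

```
from transformers import AutoConfig

# Load the configuration only (no weights) to inspect architecture choices.
config = AutoConfig.from_pretrained("RaphaelMourad/Mistral-DNA-v0.2", trust_remote_code=True)

# Standard Mistral/Mixtral config attribute names; they may differ if the model
# ships a custom configuration class, hence the getattr fallbacks.
print(getattr(config, "num_hidden_layers", None))    # number of transformer layers
print(getattr(config, "hidden_size", None))          # hidden size (expected 256, see embedding example below)
print(getattr(config, "num_key_value_heads", None))  # grouped-query attention: number of key/value heads
print(getattr(config, "sliding_window", None))       # sliding-window attention size
```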

## Load the model from Hugging Face

```
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("RaphaelMourad/Mistral-DNA-v0.2", trust_remote_code=True) # uses the same tokenizer as DNABERT-2
model = AutoModel.from_pretrained("RaphaelMourad/Mistral-DNA-v0.2", trust_remote_code=True)
```

## Calculate the embedding of a DNA sequence

```
dna = "TGATGATTGGCGCGGCTAGGATCGGCT"
inputs = tokenizer(dna, return_tensors = 'pt')["input_ids"]
hidden_states = model(inputs)[0] # [1, sequence_length, 256]

# embedding with max pooling
embedding_max = torch.max(hidden_states[0], dim=0)[0]
print(embedding_max.shape) # expected: torch.Size([256])
```
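
Mean pooling over the token dimension is a common alternative aggregation. This is a minimal sketch reusing the `hidden_states` tensor from the example above; which pooling works better for a given downstream task is not specified here:

```
# embedding with mean pooling over the token dimension
embedding_mean = torch.mean(hidden_states[0], dim=0)
print(embedding_mean.shape) # expected: torch.Size([256])
```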

## Troubleshooting

Ensure you are using a stable release of Transformers, version 4.34.0 or newer.
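
A quick way to check the installed version (a minimal sketch):

```
import transformers
print(transformers.__version__)  # should be 4.34.0 or newer
```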

## Notice

Mistral-DNA is a pretrained base model for DNA.

## Contact
 
Raphaël Mourad. raphael.mourad@univ-tlse3.fr