Update README.md
README.md
CHANGED
@@ -8,7 +8,7 @@ license: apache-2.0
This protein language model is an autoregressive transformer model in GPT-style, trained to analyze and predict the mechanical properties of a large number of protein sequences.

-This protein language foundation model was based on the NeoGPT-X architecture and uses rotary positional embeddings (RoPE). It has 16 attention heads, 36 hidden layers and a hidden size of 1024, an intermediate size of
+This protein language foundation model was based on the NeoGPT-X architecture and uses rotary positional embeddings (RoPE). It has 16 attention heads, 36 hidden layers and a hidden size of 1024, an intermediate size of 4096 and uses a GeLU activation function.

The pretraining task is defined as "Sequence<...>" where ... is an amino acid sequence.
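For readers who want to see how the stated hyperparameters fit together, here is a minimal sketch of a matching configuration. It assumes the NeoGPT-X architecture maps onto the GPT-NeoX family in Hugging Face transformers (an assumption; the README does not name the library class), and the vocab_size and max_position_embeddings values are placeholders, not figures from the model card.

```python
# Minimal sketch, assuming a GPT-NeoX-style implementation in transformers.
# Only the commented values come from the README; the rest are placeholders.
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

config = GPTNeoXConfig(
    vocab_size=32,                 # placeholder: ~20 amino acids plus special tokens
    hidden_size=1024,              # hidden size of 1024 (from the README)
    num_hidden_layers=36,          # 36 hidden layers
    num_attention_heads=16,        # 16 attention heads
    intermediate_size=4096,        # intermediate size of 4096
    hidden_act="gelu",             # GeLU activation function
    max_position_embeddings=2048,  # placeholder context length
)

# GPT-NeoX applies rotary positional embeddings (RoPE) internally,
# matching the positional-encoding scheme described above.
model = GPTNeoXForCausalLM(config)
```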
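The "Sequence<...>" template can be made concrete with a small helper; this is a sketch of the stated pretraining format only, and the example amino acid string is illustrative, not from the training data.

```python
def format_pretraining_example(amino_acids: str) -> str:
    """Wrap a raw amino acid string in the pretraining template "Sequence<...>"."""
    return f"Sequence<{amino_acids}>"

# Illustrative amino acid sequence (single-letter codes):
print(format_pretraining_example("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))
# Sequence<MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ>
```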