---
base_model: westlake-repl/SaProt_35M_AF2
library_name: peft
---



# Model Card for Model-demo-35M 
This model is a LoRA adapter of SaProt_35M_AF2, fine-tuned for a demo protein-level classification task.

## Task type
Protein-level Classification

## Model input type
SA (structure-aware) sequence, i.e. the SaProt input format that combines amino-acid tokens with Foldseek 3Di structure tokens.

## Label meanings

- 0: A
- 1: B
- 2: C
- 3: D

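A minimal inference sketch showing how the predicted class index maps to these labels. The adapter path, the `num_labels` value, and the example SA sequence are placeholders/assumptions, not values recorded in this card:

```python
import torch
from transformers import EsmTokenizer, EsmForSequenceClassification
from peft import PeftModel

base_id = "westlake-repl/SaProt_35M_AF2"
tokenizer = EsmTokenizer.from_pretrained(base_id)
base_model = EsmForSequenceClassification.from_pretrained(base_id, num_labels=4)  # 4 classes: A-D

# "adapter-path" is a placeholder for this repository's id or a local adapter directory.
model = PeftModel.from_pretrained(base_model, "adapter-path")
model.eval()

id2label = {0: "A", 1: "B", 2: "C", 3: "D"}

sa_sequence = "MdEvVpKa"  # placeholder SA (structure-aware) sequence
inputs = tokenizer(sa_sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(id2label[int(logits.argmax(dim=-1))])
```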

## LoRA config

- **r:** 8
- **lora_dropout:** 0.0
- **lora_alpha:** 16
- **target_modules:** ['query', 'intermediate.dense', 'key', 'value', 'output.dense']
- **modules_to_save:** ['classifier']

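The values above correspond to a standard `peft` `LoraConfig`. A minimal sketch of how it could be reconstructed (the `task_type` is an assumption based on the protein-level classification task; it is not recorded in this card):

```python
from peft import LoraConfig, TaskType

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,  # assumption: sequence-level classification head
    r=8,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["query", "key", "value", "intermediate.dense", "output.dense"],
    modules_to_save=["classifier"],
)
```
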
## Training config

- **optimizer:**
  - **class:** AdamW
  - **betas:** (0.9, 0.98)
  - **weight_decay:** 0.01
- **learning rate:** 0.001
- **epoch:** 1
- **batch size:** 2
- **precision:** 16-mixed
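
A sketch of the optimizer setup described above, assuming a plain PyTorch training loop in which `model` is the PEFT-wrapped classifier from the earlier sketch; the 16-mixed precision and batching are assumed to be handled by the surrounding training framework (e.g. a Lightning `Trainer` with `precision="16-mixed"`):

```python
import torch

# Only the trainable parameters (LoRA matrices and the saved classifier head)
# receive updates; the frozen base-model weights are filtered out.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
    betas=(0.9, 0.98),
    weight_decay=0.01,
)
```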