soniajoseph committed (verified)
Commit 71ec615 · Parent: b839102

Update README.md

Files changed (1): README.md (+49, -0)

README.md (new file, 49 lines)
# CLIP Sparse Autoencoder Checkpoint

This model is a sparse autoencoder trained on CLIP's internal representations.

## Model Details

### Architecture
- **Layer**: 10
- **Layer Type**: hook_mlp_out
- **Model**: open-clip:laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K
- **Dictionary Size**: 49152
- **Input Dimension**: 768
- **Expansion Factor**: 64
- **CLS Token Only**: True

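The dictionary size follows from the input dimension and expansion factor: 768 × 64 = 49152. Below is a minimal PyTorch sketch of the geometry these hyperparameters imply; the class and parameter names are illustrative assumptions, not the code this checkpoint was trained with.

```python
import torch
import torch.nn as nn

class SparseAutoencoderSketch(nn.Module):
    """Minimal SAE over 768-d CLIP activations (illustrative; not the original implementation)."""

    def __init__(self, d_in: int = 768, expansion_factor: int = 64):
        super().__init__()
        d_sae = d_in * expansion_factor  # 768 * 64 = 49152 dictionary features
        self.W_enc = nn.Parameter(torch.empty(d_in, d_sae))
        self.b_enc = nn.Parameter(torch.zeros(d_sae))
        self.W_dec = nn.Parameter(torch.empty(d_sae, d_in))
        self.b_dec = nn.Parameter(torch.zeros(d_in))
        nn.init.kaiming_uniform_(self.W_enc)
        nn.init.kaiming_uniform_(self.W_dec)

    def forward(self, x: torch.Tensor):
        # x: CLS-token activations from layer 10 hook_mlp_out, shape (batch, 768)
        acts = torch.relu((x - self.b_dec) @ self.W_enc + self.b_enc)  # sparse feature activations
        recon = acts @ self.W_dec + self.b_dec                         # reconstruction in input space
        return recon, acts
```
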
### Training
- **Training Images**: 114683904
- **Learning Rate**: 0.0002
- **L1 Coefficient**: 0.3000
- **Batch Size**: 4096
- **Context Size**: 1

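A hedged sketch of the objective these hyperparameters suggest, reusing the `SparseAutoencoderSketch` class above: MSE reconstruction plus an L1 penalty on feature activations weighted by the coefficient listed. Details such as activation normalization and dead-feature resampling are not documented here, so treat this as an assumption rather than the actual training loop.

```python
import torch

# Assumed objective: MSE reconstruction + l1_coefficient * L1 on feature activations.
sae = SparseAutoencoderSketch()                           # class sketched in the section above
optimizer = torch.optim.Adam(sae.parameters(), lr=2e-4)   # learning rate from the table above
l1_coefficient = 0.3

def training_step(batch: torch.Tensor) -> torch.Tensor:
    # batch: (4096, 768) CLS-token activations; context size 1 means one token per image
    recon, acts = sae(batch)
    mse_loss = torch.nn.functional.mse_loss(recon, batch)
    l1_loss = acts.abs().sum(dim=-1).mean()
    loss = mse_loss + l1_coefficient * l1_loss
    optimizer.zero_grad()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(sae.parameters(), max_norm=1.0)  # gradient clipping = 1
    optimizer.step()
    return loss.detach()
```
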
## Performance Metrics

### Sparsity
- **L0 (Active Features)**: 64
- **Dead Features**: 4
- **Mean Log10 Feature Sparsity**: -3.9917
- **Features Below 1e-5**: 2607
- **Features Below 1e-6**: 251
- **Mean Passes Since Fired**: 72.9733

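For readers unfamiliar with these statistics, the sketch below shows how such sparsity metrics are commonly computed for SAEs (an interpretation, not necessarily the exact evaluation code used): L0 is the mean count of non-zero features per example, and feature sparsity is each feature's firing frequency over an evaluation set, reported in log10.

```python
import torch

@torch.no_grad()
def sparsity_stats(acts: torch.Tensor, threshold: float = 1e-5):
    # acts: (n_examples, 49152) feature activations collected over an evaluation set
    l0 = (acts > 0).float().sum(dim=-1).mean()          # mean active features per example (~64 here)
    firing_freq = (acts > 0).float().mean(dim=0)        # per-feature firing frequency
    log10_sparsity = torch.log10(firing_freq + 1e-10)   # mean reported above: -3.9917
    n_below = (firing_freq < threshold).sum()           # e.g. 2607 features below 1e-5
    return l0, log10_sparsity.mean(), n_below
```
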
### Reconstruction
- **Explained Variance**: 0.8889
- **Explained Variance Std**: 0.0565
- **MSE Loss**: 0.0007
- **L1 Loss**: 0
- **Overall Loss**: 0.0007

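Explained variance is typically defined as the fraction of activation variance captured by the reconstruction; the sketch below shows one common formulation, which is assumed here rather than confirmed.

```python
import torch

@torch.no_grad()
def explained_variance(x: torch.Tensor, recon: torch.Tensor) -> torch.Tensor:
    # 1 - Var(residual) / Var(input), computed per example and then averaged
    residual_var = (x - recon).var(dim=-1)
    total_var = x.var(dim=-1)
    return (1.0 - residual_var / total_var).mean()   # reported value for this checkpoint: 0.8889
```
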
## Training Details
- **Training Duration**: 17909.1347 seconds
- **Final Learning Rate**: 0.0002
- **Warm Up Steps**: 200
- **Gradient Clipping**: 1

## Additional Information
- **Weights & Biases Run**: https://wandb.ai/perceptual-alignment/clip/runs/vzh0cikg
- **Original Checkpoint Path**: /network/scratch/s/sonia.joseph/checkpoints/clip-b
- **Random Seed**: 42
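
The original checkpoint path above is cluster-local and not directly usable. If the weights in this repository are a standard PyTorch state dict (an assumption), loading looks roughly like this; the file name is a placeholder.

```python
import torch

# "sae_weights.pt" is a placeholder name; substitute the actual weights file from this repo.
state_dict = torch.load("sae_weights.pt", map_location="cpu")
sae = SparseAutoencoderSketch()        # illustrative class sketched earlier in this README
sae.load_state_dict(state_dict)        # may need key remapping depending on the saved format
sae.eval()
```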