soniajoseph committed on
Commit 73611b7
1 Parent(s): eaf537c

Update README.md

Files changed (1): README.md (+49 -0)

# CLIP Sparse Autoencoder Checkpoint

This model is a sparse autoencoder (SAE) trained on the internal representations of CLIP's vision transformer: the residual stream after layer 11, restricted to the CLS token.

## Model Details

### Architecture
- **Layer**: 11
- **Layer Type**: hook_resid_post
- **Model**: open-clip:laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K
- **Dictionary Size**: 49152
- **Input Dimension**: 768
- **Expansion Factor**: 64
- **CLS Token Only**: True
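
The dictionary size, input dimension, and expansion factor above (49152 = 64 × 768) fix the encoder and decoder shapes. The following is a minimal sketch of a standard ReLU sparse autoencoder with those shapes; the exact formulation used in training (bias handling, initialization, decoder normalization) is an assumption, not taken from the original code.

```python
import torch
import torch.nn as nn


class SparseAutoencoder(nn.Module):
    """Sketch of a standard ReLU SAE: f = ReLU((x - b_dec) @ W_enc + b_enc), x_hat = f @ W_dec + b_dec."""

    def __init__(self, d_in: int = 768, d_sae: int = 49152):  # expansion factor 64
        super().__init__()
        self.W_enc = nn.Parameter(nn.init.kaiming_uniform_(torch.empty(d_in, d_sae)))
        self.W_dec = nn.Parameter(nn.init.kaiming_uniform_(torch.empty(d_sae, d_in)))
        self.b_enc = nn.Parameter(torch.zeros(d_sae))
        self.b_dec = nn.Parameter(torch.zeros(d_in))

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 768) CLS-token residual-stream activations from layer 11
        return torch.relu((x - self.b_dec) @ self.W_enc + self.b_enc)

    def decode(self, feats: torch.Tensor) -> torch.Tensor:
        return feats @ self.W_dec + self.b_dec

    def forward(self, x: torch.Tensor):
        feats = self.encode(x)
        return self.decode(feats), feats
```

Because the SAE was trained on the CLS token only (context size 1), each training example is a single 768-dimensional vector rather than a full patch sequence.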

### Training
- **Training Images**: 110178304
- **Learning Rate**: 0.0002
- **L1 Coefficient**: 0.3000
- **Batch Size**: 4096
- **Context Size**: 1
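
The learning rate, L1 coefficient, and batch size above correspond to the usual SAE objective of reconstruction error plus an L1 sparsity penalty. The sketch below assumes the common MSE + L1 form, an Adam optimizer, and gradient-norm clipping; the precise loss normalization and learning-rate schedule of the original run are not specified in this card.

```python
import torch


def sae_loss(x: torch.Tensor, x_hat: torch.Tensor, feats: torch.Tensor,
             l1_coefficient: float = 0.3) -> torch.Tensor:
    """Reconstruction MSE plus an L1 penalty on the feature activations."""
    mse = (x_hat - x).pow(2).mean()
    l1 = feats.abs().sum(dim=-1).mean()
    return mse + l1_coefficient * l1


# Hypothetical single training step using the hyperparameters listed above
# and the SparseAutoencoder class sketched earlier.
sae = SparseAutoencoder(d_in=768, d_sae=49152)
optimizer = torch.optim.Adam(sae.parameters(), lr=2e-4)
batch = torch.randn(4096, 768)                                  # placeholder for a batch of CLS activations
x_hat, feats = sae(batch)
loss = sae_loss(batch, x_hat, feats)
loss.backward()
torch.nn.utils.clip_grad_norm_(sae.parameters(), max_norm=1.0)  # clipping value 1 from the card (assumed norm clipping)
optimizer.step()
```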

## Performance Metrics

### Sparsity
- **L0 (Active Features)**: 64
- **Dead Features**: 0
- **Mean Log10 Feature Sparsity**: -3.4080
- **Features Below 1e-5**: 10
- **Features Below 1e-6**: 0
- **Mean Passes Since Fired**: 13.0446
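
These sparsity statistics can be recomputed from feature activations collected over an evaluation set. A sketch of one common way to compute them; the evaluation set, window, and epsilon used for this card are assumptions.

```python
import torch


def sparsity_metrics(feats: torch.Tensor, eps: float = 1e-10):
    """feats: (n_examples, d_sae) feature activations over an evaluation set."""
    active = (feats > 0).float()
    l0 = active.sum(dim=-1).mean()                  # mean active features per example, cf. L0 = 64
    firing_freq = active.mean(dim=0)                # per-feature firing frequency
    dead_features = int((firing_freq == 0).sum())   # features that never fire
    mean_log10_sparsity = torch.log10(firing_freq + eps).mean()
    below_1e5 = int((firing_freq < 1e-5).sum())
    return l0.item(), dead_features, mean_log10_sparsity.item(), below_1e5
```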

### Reconstruction
- **Explained Variance**: 0.8423
- **Explained Variance Std**: 0.0443
- **MSE Loss**: 0.0025
- **L1 Loss**: 0
- **Overall Loss**: 0.0025
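
Likewise for the reconstruction numbers. The sketch below assumes explained variance is computed per example as one minus the ratio of residual to total variance, which is a common convention but may not match the evaluation code exactly.

```python
import torch


def reconstruction_metrics(x: torch.Tensor, x_hat: torch.Tensor):
    """x, x_hat: (n_examples, 768) original and reconstructed activations."""
    mse = (x_hat - x).pow(2).mean()
    residual_var = (x_hat - x).pow(2).sum(dim=-1)
    total_var = (x - x.mean(dim=0)).pow(2).sum(dim=-1)
    explained_variance = 1.0 - residual_var / total_var   # cf. 0.8423 ± 0.0443 above
    return explained_variance.mean().item(), explained_variance.std().item(), mse.item()
```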

## Training Details
- **Training Duration**: 17866.3376 seconds (≈ 5 hours)
- **Final Learning Rate**: 0.0002
- **Warm Up Steps**: 200
- **Gradient Clipping**: 1

## Additional Information
- **Weights & Biases Run**: https://wandb.ai/perceptual-alignment/clip/runs/b5q0wr11
- **Original Checkpoint Path**: /network/scratch/s/sonia.joseph/checkpoints/clip-b
- **Random Seed**: 42
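
For context, here is a hedged end-to-end sketch of how one might extract the activations this SAE expects and run them through the model sketched above. The open_clip loading call follows the base-model identifier listed in this card; the forward-hook placement, tensor-shape handling, image path, and checkpoint filename are illustrative assumptions rather than the original pipeline, which used a hooked model exposing hook_resid_post.

```python
import torch
import open_clip
from PIL import Image

# Load the base CLIP model named in this card.
model, _, preprocess = open_clip.create_model_and_transforms(
    "hf-hub:laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K"
)
model.eval()

# Capture the residual stream after transformer block 11 of the vision tower,
# the plain-PyTorch analogue of the hook_resid_post point named above.
captured = {}
model.visual.transformer.resblocks[11].register_forward_hook(
    lambda module, inputs, output: captured.update(resid_post=output)
)

image = preprocess(Image.open("example.jpg")).unsqueeze(0)   # hypothetical image path
with torch.no_grad():
    model.encode_image(image)

# Depending on the open_clip version the hook output is (seq, batch, 768) or
# (batch, seq, 768); the CLS token sits at index 0 of the sequence axis.
acts = captured["resid_post"]
cls_act = acts[0] if acts.shape[0] != image.shape[0] else acts[:, 0]

sae = SparseAutoencoder(d_in=768, d_sae=49152)               # class sketched above
sae.load_state_dict(torch.load("sae_weights.pt"))            # hypothetical filename
with torch.no_grad():
    reconstruction, features = sae(cls_act.reshape(-1, 768))
```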