soniajoseph committed
Commit 5fc8afd
Parent: f04fb80

Update README.md

Files changed (1): README.md (added, +49 lines)
# CLIP Sparse Autoencoder Checkpoint

This model is a sparse autoencoder trained on CLIP's internal representations.

## Model Details

### Architecture
- **Layer**: 10
- **Layer Type**: hook_resid_post
- **Model**: open-clip:laion/CLIP-ViT-B-32-DataComp.XL-s13B-b90K
- **Dictionary Size**: 49152
- **Input Dimension**: 768
- **Expansion Factor**: 64
- **CLS Token Only**: True
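
For orientation, here is a minimal PyTorch sketch of an SAE with these dimensions (49152 = 768 × 64). This is an illustrative reconstruction rather than the actual training code for this checkpoint; the class name, parameter names, initialization, and the pre-encoder bias subtraction are assumptions.

```python
import torch
import torch.nn as nn


class SparseAutoencoder(nn.Module):
    """Sketch of a standard SAE: linear encoder with ReLU, linear decoder (768 -> 49152 -> 768)."""

    def __init__(self, d_in: int = 768, d_sae: int = 49152):
        super().__init__()
        self.W_enc = nn.Parameter(torch.empty(d_in, d_sae))
        self.b_enc = nn.Parameter(torch.zeros(d_sae))
        self.W_dec = nn.Parameter(torch.empty(d_sae, d_in))
        self.b_dec = nn.Parameter(torch.zeros(d_in))
        nn.init.kaiming_uniform_(self.W_enc)
        nn.init.kaiming_uniform_(self.W_dec)

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 768) CLS-token activations from hook_resid_post at layer 10
        return torch.relu((x - self.b_dec) @ self.W_enc + self.b_enc)

    def decode(self, feats: torch.Tensor) -> torch.Tensor:
        return feats @ self.W_dec + self.b_dec

    def forward(self, x: torch.Tensor):
        feats = self.encode(x)
        return self.decode(feats), feats
```
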
### Training
- **Training Images**: 114683904
- **Learning Rate**: 0.0002
- **L1 Coefficient**: 0.3000
- **Batch Size**: 4096
- **Context Size**: 1
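
These hyperparameters correspond to the usual SAE objective: reconstruction MSE plus an L1 penalty on feature activations, weighted by the L1 coefficient. Below is a hedged sketch of one training step, reusing the `SparseAutoencoder` class sketched above; the choice of Adam and the exact normalization of each loss term are assumptions.

```python
import torch

sae = SparseAutoencoder(d_in=768, d_sae=49152)
optimizer = torch.optim.Adam(sae.parameters(), lr=2e-4)
l1_coefficient = 0.3


def training_step(batch: torch.Tensor) -> float:
    # batch: (4096, 768) CLS-token activations; context size 1 means one token per image
    recon, feats = sae(batch)
    mse_loss = (recon - batch).pow(2).mean()
    l1_loss = feats.abs().sum(dim=-1).mean()
    loss = mse_loss + l1_coefficient * l1_loss
    optimizer.zero_grad()
    loss.backward()
    # Gradient clipping at 1.0, per the Training Details section below
    torch.nn.utils.clip_grad_norm_(sae.parameters(), max_norm=1.0)
    optimizer.step()
    return loss.item()
```
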
## Performance Metrics

### Sparsity
- **L0 (Active Features)**: 64
- **Dead Features**: 0
- **Mean Log10 Feature Sparsity**: -3.5579
- **Features Below 1e-5**: 83
- **Features Below 1e-6**: 2
- **Mean Passes Since Fired**: 22.8832
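
A sketch of how these sparsity statistics are commonly computed from a held-out batch of activations. The evaluation protocol for this checkpoint is not documented in this card, so the firing-frequency thresholds and the single-batch formulation below are assumptions.

```python
import torch


@torch.no_grad()
def sparsity_metrics(sae: SparseAutoencoder, acts: torch.Tensor, eps: float = 1e-10) -> dict:
    # acts: (n_examples, 768) held-out CLS-token activations
    feats = sae.encode(acts)                   # (n_examples, 49152)
    fired = (feats > 0).float()
    l0 = fired.sum(dim=-1).mean()              # mean number of active features per example
    fire_freq = fired.mean(dim=0)              # per-feature firing frequency
    return {
        "l0": l0.item(),
        "mean_log10_feature_sparsity": torch.log10(fire_freq + eps).mean().item(),
        "features_below_1e-5": (fire_freq < 1e-5).sum().item(),
        "features_below_1e-6": (fire_freq < 1e-6).sum().item(),
        "dead_features": (fire_freq == 0).sum().item(),
    }
```
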
### Reconstruction
- **Explained Variance**: 0.8698
- **Explained Variance Std**: 0.0456
- **MSE Loss**: 0.0016
- **L1 Loss**: 0
- **Overall Loss**: 0.0016
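
Likewise, a sketch of the reconstruction metrics; whether explained variance is computed per example against the batch mean, as here, is an assumption.

```python
import torch


@torch.no_grad()
def reconstruction_metrics(sae: SparseAutoencoder, acts: torch.Tensor) -> dict:
    recon, _ = sae(acts)
    mse = (recon - acts).pow(2).mean()
    resid_var = (acts - recon).pow(2).sum(dim=-1)              # per-example residual variance
    total_var = (acts - acts.mean(dim=0)).pow(2).sum(dim=-1)   # per-example total variance
    explained_var = 1.0 - resid_var / total_var
    return {
        "mse_loss": mse.item(),
        "explained_variance": explained_var.mean().item(),
        "explained_variance_std": explained_var.std().item(),
    }
```
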
## Training Details
- **Training Duration**: 17901.0753 seconds (≈5 hours)
- **Final Learning Rate**: 0.0002
- **Warm Up Steps**: 200
- **Gradient Clipping**: 1
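
The warm-up steps and final learning rate above are consistent with a linear warm-up to the base rate over the first 200 steps and no decay afterwards. A small sketch using the optimizer from the training-step sketch above; the exact schedule used for this run is an assumption.

```python
import torch

warmup_steps = 200
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,  # the Adam optimizer from the training-step sketch above
    lr_lambda=lambda step: min(1.0, (step + 1) / warmup_steps),
)
# After each call to training_step(...), advance the schedule:
scheduler.step()
```
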
## Additional Information
- **Weights & Biases Run**: https://wandb.ai/perceptual-alignment/clip/runs/10hacih1
- **Original Checkpoint Path**: /network/scratch/s/sonia.joseph/checkpoints/clip-b
- **Random Seed**: 42