---
language: en
library_name: mlsae
license: mit
tags:
  - model_hub_mixin
  - pytorch_model_hub_mixin
datasets:
  - monology/pile-uncopyrighted
---

# mlsae-pythia-70m-deduped-x64-k16-tfm

A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream
activation vectors from every layer of
[EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped)
with an expansion factor of 64 and k = 16, over 1 billion tokens from
[monology/pile-uncopyrighted](https://huggingface.co/datasets/monology/pile-uncopyrighted).
This model includes the underlying transformer.
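To make the hyperparameters concrete, the sketch below (plain arithmetic, not the `mlsae` API) computes the autoencoder's latent dimensionality and per-token sparsity, assuming Pythia-70m's residual-stream width of 512:

```python
# Illustrative arithmetic only; variable names are not the library's API.
d_model = 512          # hidden size of EleutherAI/pythia-70m-deduped
expansion_factor = 64  # ratio of latent dimension to input dimension
k = 16                 # latents kept active per token (TopK activation)

n_latents = d_model * expansion_factor  # total latent features
sparsity = k / n_latents                # fraction of latents active per token

print(n_latents)  # 32768
print(f"{sparsity:.8f}")  # 0.00048828
```

So each token's residual-stream vector is encoded into 32,768 latent features, of which only the 16 largest remain nonzero.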

For more details, see:

- Paper: <https://arxiv.org/abs/2409.04185>
- GitHub repository: <https://github.com/tim-lawson/mlsae>
- Weights & Biases project: <https://wandb.ai/timlawson-/mlsae>