---
license: mit
datasets:
- EleutherAI/pile
language:
- en
---

These SAEs were trained on the output of each of the MLPs in [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m). We used 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
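
For orientation, here is a minimal sketch of the encode/decode computation a sparse autoencoder of this shape performs on an MLP output (pythia-160m's hidden size is 768, and each SAE here has 32,768 latents). The random weights, the ReLU activation, and the pre-encoder bias subtraction are illustrative assumptions for a generic sparse autoencoder, not this repo's trained weights or exact architecture:

```python
import torch

# Illustrative only: the shape of a sparse autoencoder over pythia-160m
# MLP outputs. pythia-160m has a hidden size of 768; these SAEs use
# 32,768 latents. All weights below are random placeholders.
d_model, n_latents = 768, 32_768

W_enc = torch.randn(d_model, n_latents) / d_model ** 0.5
b_enc = torch.zeros(n_latents)
W_dec = torch.randn(n_latents, d_model) / n_latents ** 0.5
b_dec = torch.zeros(d_model)

def encode(x: torch.Tensor) -> torch.Tensor:
    # Map an MLP output into the latent space. After training, most
    # latents are zero for any given input (sparsity). ReLU is an
    # assumption here; the trained SAEs may use a different activation.
    return torch.relu((x - b_dec) @ W_enc + b_enc)

def decode(z: torch.Tensor) -> torch.Tensor:
    # Reconstruct the MLP output from the latent activations.
    return z @ W_dec + b_dec

mlp_out = torch.randn(4, d_model)   # stand-in for real MLP activations
recon = decode(encode(mlp_out))
print(recon.shape)                  # torch.Size([4, 768])
```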