
A randomly initialized checkpoint of a 252M-parameter custom transformer architecture, with two linear projections: one mapping the 8192-dimensional llama2-70b embeddings down to a 1024-dimensional space, and one mapping the 1024-dimensional hidden states back up to 8192-d for the llama2-70b language modelling head.
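
Below is a minimal sketch (PyTorch) of how the projection scheme described above could look. Only the 8192 → 1024 → 8192 linear adapters around the llama2-70b embedding width and LM head follow the description; the inner transformer shown here is a placeholder, not the actual 252M custom architecture, and all names are hypothetical.

```python
import torch
import torch.nn as nn

LLAMA2_70B_DIM = 8192   # hidden size of llama2-70b embeddings / LM head
INNER_DIM = 1024        # hidden size of the custom transformer (assumed)


class ProjectedTransformer(nn.Module):
    """Wraps an inner transformer with down/up linear projections."""

    def __init__(self, inner_transformer: nn.Module):
        super().__init__()
        # Project llama2-70b embeddings (8192-d) down to the inner width (1024-d).
        self.down_proj = nn.Linear(LLAMA2_70B_DIM, INNER_DIM)
        # The custom transformer operating in 1024-dimensional space.
        self.inner = inner_transformer
        # Project back up to 8192-d so the llama2-70b LM head can be reused.
        self.up_proj = nn.Linear(INNER_DIM, LLAMA2_70B_DIM)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, 8192) from the llama2-70b embedding table
        hidden = self.down_proj(embeddings)   # (batch, seq_len, 1024)
        hidden = self.inner(hidden)           # custom transformer blocks
        return self.up_proj(hidden)           # (batch, seq_len, 8192) for the LM head


# Usage with a placeholder inner stack (not the actual 252M architecture):
inner = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=INNER_DIM, nhead=16, batch_first=True),
    num_layers=12,
)
model = ProjectedTransformer(inner)
```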

To be trained.

Safetensors · Model size: 776M params · Tensor type: BF16
Note: the Inference API (serverless) does not yet support model repos that contain custom code.