---
pipeline_tag: text-generation
inference: true
widget:
  - text: Hello!
    example_title: Hello world
    group: Python
library_name: transformers
---

# yujiepan/llama-2-tiny-random

This model is randomly initialized, using the config from meta-llama/Llama-2-7b-chat-hf but with the following modifications:

```json
{
  "hidden_size": 8,
  "intermediate_size": 32,
  "num_attention_heads": 2,
  "num_hidden_layers": 1,
  "num_key_value_heads": 2
}
```
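A minimal sketch of how such a tiny random model can be built locally with `transformers`, assuming only the overrides listed above are applied and all other `LlamaConfig` fields keep their defaults (the actual card may have copied additional fields, such as the vocabulary size, from `meta-llama/Llama-2-7b-chat-hf`):

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Shrink the Llama architecture using the overrides from the config above;
# every other field falls back to the LlamaConfig defaults (assumption).
config = LlamaConfig(
    hidden_size=8,
    intermediate_size=32,
    num_attention_heads=2,
    num_hidden_layers=1,
    num_key_value_heads=2,
)

# Instantiating the model from a config (rather than from_pretrained)
# gives randomly initialized weights, as the card describes.
model = LlamaForCausalLM(config)
print(model.num_parameters())
```

A model this small is only useful for smoke-testing pipelines and tokenizer/model plumbing, not for generating meaningful text.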