---
library_name: transformers
pipeline_tag: text-generation
inference: true
widget:
- text: Hello!
  example_title: Hello world
  group: Python
---

# yujiepan/llama-2-tiny-random

This model is **randomly initialized**, using the config from [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) but with the following modifications:

```json
{
  "hidden_size": 8,
  "intermediate_size": 32,
  "num_attention_heads": 2,
  "num_hidden_layers": 1,
  "num_key_value_heads": 2
}
```
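
For reference, here is a minimal sketch of how a tiny random checkpoint like this can be produced with `transformers`. This is an illustrative reconstruction, not the exact creation script, and it assumes you have access to the `meta-llama/Llama-2-7b-chat-hf` config on the Hub:

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Start from the base config and shrink it to the sizes listed above.
config = LlamaConfig.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
config.hidden_size = 8
config.intermediate_size = 32
config.num_attention_heads = 2
config.num_hidden_layers = 1
config.num_key_value_heads = 2

# Instantiating a model from a config (rather than from_pretrained)
# gives randomly initialized weights.
model = LlamaForCausalLM(config)
model.save_pretrained("llama-2-tiny-random")
```

A model this small is only useful for testing pipelines, tokenizer plumbing, and integration code; its outputs are not meaningful text.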