bert-6L-768H / config.json
{"hidden_size": 768, "hidden_act": "gelu", "initializer_range": 0.02, "vocab_size": 30522, "hidden_dropout_prob": 0.1, "num_attention_heads": 12, "type_vocab_size": 2, "max_position_embeddings": 512, "num_hidden_layers": 6, "intermediate_size": 3072, "attention_probs_dropout_prob": 0.1, "model_type": "bert"}