This model consists of only the first 4 transformer layers of DeepSeek-V3, with the weights stored in bf16.
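The card does not say how the checkpoint was produced. Below is a minimal sketch of one way to build such a truncated model with `transformers`, assuming a bf16 copy of the full DeepSeek-V3 weights is available locally (the `src` path is illustrative, not from this repo):

```python
# Minimal sketch, not the author's confirmed procedure.
# Assumes a local bf16 copy of the full DeepSeek-V3 weights at `src`.
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

src = "/path/to/DeepSeek-V3-bf16"   # illustrative path to the full bf16 model
dst = "DeepSeek-V3-bf16-4layers"

config = AutoConfig.from_pretrained(src, trust_remote_code=True)
config.num_hidden_layers = 4  # keep only the first 4 transformer layers

# from_pretrained builds a 4-layer model from the modified config and loads
# only the matching tensors; weights for the dropped layers are skipped
# (transformers reports them as unused).
model = AutoModelForCausalLM.from_pretrained(
    src, config=config, torch_dtype=torch.bfloat16, trust_remote_code=True
)

model.save_pretrained(dst)
AutoTokenizer.from_pretrained(src, trust_remote_code=True).save_pretrained(dst)
```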

To load and run this model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

pretrained_model_id = "tflsxyy/DeepSeek-V3-bf16-4layers"

tokenizer = AutoTokenizer.from_pretrained(pretrained_model_id, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(pretrained_model_id, trust_remote_code=True, device_map="auto")

# Run a short generation to check that the model loads and runs end to end.
inputs = tokenizer("gptqmodel is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=10)[0]))
```
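Note that `trust_remote_code=True` is required because DeepSeek-V3 ships its own modeling code on the Hub. With only 4 of the original 61 layers, the generated text will generally not be coherent; a truncated checkpoint like this is mainly useful as a small stand-in for debugging and quantization experiments (the "gptqmodel is" prompt suggests GPTQModel).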
Model size: 15.1B parameters (BF16, safetensors).
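At 2 bytes per bf16 parameter, 15.1B parameters works out to roughly 30 GB of weights, so loading may require multiple GPUs or CPU offload; `device_map="auto"` in the snippet above handles that placement automatically.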