Plaban81/Moe-4x7b-math-reason-code
Text Generation · Transformers · Safetensors · English · mixtral · code · QA · reasoning · maths · conversational · text-generation-inference · Inference Endpoints
arxiv: 1910.09700
License: apache-2.0
main · Moe-4x7b-math-reason-code / generation_config.json
Plaban81 · Upload MixtralForCausalLM · commit af3d34b (verified) · 10 months ago
111 Bytes
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.37.1"
}
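As a minimal sketch of how this file is consumed, the snippet below parses the config shown above (inlined as a string so the example is self-contained; in practice the file is fetched from the Hub alongside the model weights) and checks the special-token ids. The claim that ids 1 and 2 correspond to the Mixtral tokenizer's `<s>` and `</s>` tokens is an assumption based on the standard Mistral/Mixtral vocabulary, not stated in this file.

```python
import json

# generation_config.json as it appears in this repo, inlined for a
# self-contained check. In a real workflow this file would be downloaded
# from the Hugging Face Hub with the rest of the model files.
raw = """
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.37.1"
}
"""

config = json.loads(raw)

# Sanity-check the special-token ids the generation code will use.
# (Assumed mapping: id 1 = <s>, id 2 = </s> in the Mixtral vocabulary.)
assert config["bos_token_id"] == 1
assert config["eos_token_id"] == 2

# "_from_model_config" marks that this generation config was derived
# automatically from the model config when the checkpoint was saved.
print(config["transformers_version"])
```

The file carries no sampling parameters (temperature, top-p, etc.), so `transformers` falls back to its defaults at generation time; only the BOS/EOS ids and the library version that wrote the file are recorded.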