πŸ’» Cipher-20B
[πŸ“œ License](https://helpingai.co/license) | [🌐 Website](https://helpingai.co)

🌟 About Cipher-20B

Cipher-20B is a 20.6-billion-parameter causal language model, distributed as BF16 safetensors, designed for code generation.

πŸ’» Implementation

Using Transformers

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load Cipher-20B (BF16 weights; roughly 40 GB of accelerator memory)
model = AutoModelForCausalLM.from_pretrained(
    "HelpingAI/Cipher-20B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("HelpingAI/Cipher-20B")

# Example usage: a chat-style coding request
code_task = [
    {"role": "system", "content": "You are Cipher"},
    {"role": "user", "content": "Write a Python function to calculate the Fibonacci sequence."}
]

# Build the prompt with the model's chat template and move it to the model's device
inputs = tokenizer.apply_chat_template(
    code_task,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)

# do_sample=True is required for temperature/top_p to take effect
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

# Prints the prompt followed by the generated completion
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
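
The same chat-formatted input also works with the transformers text-generation pipeline. A minimal sketch, assuming a recent transformers version that accepts chat-style message lists:

import torch
from transformers import pipeline

# High-level alternative: the pipeline handles tokenization, chat templating, and decoding
pipe = pipeline(
    "text-generation",
    model="HelpingAI/Cipher-20B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

result = pipe(code_task, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.9)

# The last message in generated_text is the assistant's reply
print(result[0]["generated_text"][-1]["content"])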

βš™οΈ Training Details

Training Data

  • Trained on a large dataset of code, programming tasks, and technical documentation.
  • Fine-tuned for code in multiple programming languages, including Python, JavaScript, and C++.

Capabilities

  • Generates code in multiple languages.
  • Detects and corrects common coding errors (see the sketch after this list).
  • Provides clear explanations of code.
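
To illustrate the error-correction capability, a minimal sketch that re-uses the model and tokenizer loaded in the Implementation section (the buggy function and prompt wording are hypothetical examples):

# Ask Cipher-20B to find and fix a bug in a deliberately broken example function
buggy_code = '''
def factorial(n):
    result = 0              # bug: should be initialised to 1
    for i in range(1, n + 1):
        result *= i
    return result
'''

review_task = [
    {"role": "system", "content": "You are Cipher"},
    {"role": "user", "content": f"Find and fix the bug in this function:\n{buggy_code}"},
]

inputs = tokenizer.apply_chat_template(
    review_task,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.9)

# Decode only the newly generated tokens (the correction and its explanation)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))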

⚠️ Limitations

  • May generate verbose code depending on the input.
  • Long code generation may exceed token limits (see the streaming sketch after this list).
  • Ambiguous instructions can lead to incomplete or incorrect code.
  • Prioritizes efficiency in code generation.
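
For long generations, one mitigation (a sketch, re-using the model, tokenizer, and prompt tensor from the Implementation section) is to raise max_new_tokens and stream the output with transformers' TextStreamer, so partial code is visible even if the completion is eventually cut off:

from transformers import TextStreamer

# Stream tokens to stdout as they are generated; skip_prompt hides the echoed prompt
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

outputs = model.generate(
    inputs,                 # a prompt tensor built with apply_chat_template, as above
    max_new_tokens=1024,    # larger budget for long code generations
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    streamer=streamer,
)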

Safety

  • Avoids generating harmful or malicious code.
  • Will not assist with illegal or unethical activities.

πŸ“š Citation

@misc{cipher2024,
  author = {Abhay Koul},
  title = {Cipher-20B: Your Ultimate Code Buddy},
  year = {2024},
  publisher = {HelpingAI},
  journal = {HuggingFace},
  howpublished = {\url{https://huggingface.co/HelpingAI/Cipher-20B}}
}

Built with dedication, precision, and passion by HelpingAI

Website β€’ GitHub β€’ Discord β€’ HuggingFace
