---
license: other
license_name: helpingai
license_link: https://helpingai.co/license
pipeline_tag: text-generation
language:
- en
tags:
- HelpingAI
- Cipher
- Code Generation
- Programming
- AI Assistant
library_name: transformers
---
# 💻 Cipher-20B
[📜 License](https://helpingai.co/license) | [🌐 Website](https://helpingai.co)
## 🌟 About Cipher-20B

**Cipher-20B** is a 20-billion-parameter causal language model designed for code generation.

## 💻 Implementation

### Using Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load Cipher-20B and its tokenizer
model = AutoModelForCausalLM.from_pretrained("HelpingAI/Cipher-20B")
tokenizer = AutoTokenizer.from_pretrained("HelpingAI/Cipher-20B")

# Build a chat-style prompt for a coding task
code_task = [
    {"role": "system", "content": "You are Cipher"},
    {"role": "user", "content": "Write a Python function to calculate the Fibonacci sequence."}
]

inputs = tokenizer.apply_chat_template(
    code_task,
    add_generation_prompt=True,
    return_tensors="pt"
)

# do_sample=True is required for temperature and top_p to take effect
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## ⚙️ Training Details

### Training Data

* Trained on a large corpus of code, programming tasks, and technical documentation.
* Fine-tuned across multiple programming languages, including Python, JavaScript, and C++.

### Capabilities

* Generates code in multiple languages.
* Detects and corrects common coding errors.
* Provides clear explanations of code.

## ⚠️ Limitations

* May generate verbose code depending on the prompt.
* Long generations may exceed the token limit and be cut off.
* Ambiguous instructions can lead to incomplete or incorrect code.
* Prioritizes efficiency in generated code, which may not suit every use case.

### Safety

* Avoids generating harmful or malicious code.
* Will not assist with illegal or unethical activities.

## 📚 Citation

```bibtex
@misc{cipher2024,
  author = {Abhay Koul},
  title = {Cipher-20B: Your Ultimate Code Buddy},
  year = {2024},
  publisher = {HelpingAI},
  journal = {HuggingFace},
  howpublished = {\url{https://huggingface.co/HelpingAI/Cipher-20B}}
}
```

*Built with dedication, precision, and passion by HelpingAI*

[Website](https://helpingai.co) • [GitHub](https://github.com/HelpingAI) • [Discord](https://discord.gg/YweJwNqrnH) • [HuggingFace](https://huggingface.co/HelpingAI)