LLM-jp-3 Fine-tuned Models Collection Fine-tuned models in the LLM-jp-3 model series • 5 items • Updated Nov 15 • 1
LLM-jp-3 Pre-trained Models Collection Pre-trained models in the LLM-jp-3 model series • 5 items • Updated Nov 15 • 4
Article How to generate text: using different decoding methods for language generation with Transformers (a short decoding sketch follows this list) Mar 1, 2020 • 127
PLaMo-100B: A Ground-Up Language Model Designed for Japanese Proficiency Paper • 2410.07563 • Published Oct 10 • 2
gemma-2-baku Collection The baku model series is based on the gemma-2 series and has been continually pre-trained on Japanese-specific corpora. • 4 items • Updated 16 days ago • 3
Gemma 2 JPN Release Collection A Gemma 2 2B model fine-tuned on Japanese text. It supports the Japanese language at the same level of performance as English-only queries on Gemma 2. • 3 items • Updated 8 days ago • 26
Japanese SimCSE Collection Tsukagoshi et al., Japanese SimCSE Technical Report, arXiv 2023. https://arxiv.org/abs/2310.19349 • 5 items • Updated Sep 4 • 2
llama-3-youko Collection The youko model series is based on the llama-3 series and has been continually pre-trained on Japanese-specific corpora. • 9 items • Updated 16 days ago • 1
Llama 3.1 GPTQ, AWQ, and BNB Quants Collection Optimised Quants for high-throughput deployments! Compatible with Transformers, TGI & vLLM 🤗 (a 4-bit loading sketch follows this list) • 9 items • Updated Sep 26 • 56
Sarashina Collection Large Language Models developed by SB Intuitions • 8 items • Updated 9 days ago • 5
LLM-jp: A Cross-organizational Project for the Research and Development of Fully Open Japanese LLMs Paper • 2407.03963 • Published Jul 4 • 15
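As a companion to the decoding-methods article listed above, here is a minimal sketch of the strategies it compares (greedy decoding, beam search, and top-k / top-p sampling) using the Transformers generate API. GPT-2 is used only as a small placeholder model; any causal LM from the collections above would work the same way.

```python
# Minimal sketch of common decoding strategies with the Transformers generate API.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("I enjoy walking with my cute dog", return_tensors="pt")

# Greedy decoding: always pick the highest-probability next token.
greedy = model.generate(**inputs, max_new_tokens=40)

# Beam search: keep the 5 most likely partial sequences at each step.
beams = model.generate(**inputs, max_new_tokens=40, num_beams=5, early_stopping=True)

# Top-k / top-p (nucleus) sampling: sample from a truncated next-token distribution.
sampled = model.generate(
    **inputs, max_new_tokens=40, do_sample=True, top_k=50, top_p=0.95
)

print(tokenizer.decode(sampled[0], skip_special_tokens=True))
```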
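For the quantized Llama 3.1 collection above, the sketch below shows one way to load a checkpoint in 4-bit with Transformers and bitsandbytes. The repo id is illustrative (the base Meta checkpoint is gated on the Hub); substitute whichever GPTQ, AWQ, or BNB variant from the collection you intend to deploy.

```python
# Sketch: loading a Llama 3.1 checkpoint in 4-bit with Transformers + bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # illustrative; gated, requires Hub access

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on available GPU(s)
)
```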