Uploaded model

  • Developed by: Sakalti
  • License: apache-2.0
  • Finetuned from model: Sakalti/Saba1-1.8B

This Qwen model was trained 2x faster with Unsloth and Hugging Face's TRL library.

This model was fine-tuned on "kunishou/databricks-dolly-15k-ja". The dataset is licensed under CC BY-SA 3.0.
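The card does not publish the training recipe. The snippet below is a minimal sketch of how a fine-tune like this could be set up with Unsloth and TRL's SFTTrainer on the dataset above; the LoRA configuration, prompt template, and all hyperparameters are illustrative assumptions, not the settings actually used for Saba1.5.

```python
# Minimal SFT sketch with Unsloth + TRL (illustrative only; not the actual recipe).
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

# Load the base model named in this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Sakalti/Saba1-1.8B",
    max_seq_length=2048,
    load_in_4bit=True,  # assumption: QLoRA-style 4-bit training
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Dataset referenced in this card (CC BY-SA 3.0).
dataset = load_dataset("kunishou/databricks-dolly-15k-ja", split="train")

def to_text(example):
    # Hypothetical instruction/input/output concatenation template.
    prompt = example["instruction"]
    if example.get("input"):
        prompt += "\n" + example["input"]
    return {"text": prompt + "\n" + example["output"]}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="saba1.5-sft",
    ),
)
trainer.train()
```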

Last update: 2023-05-28. Parameters (without embedding layer): 1.31B; parameters (with embedding layer): 1.54B. Layers: 28. Tensor type: FP16.

Overview

Saba1.5 is a model fine-tuned from Saba1. Its performance has not been measured yet, but it should have improved.
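
A minimal usage sketch for loading the published FP16 weights of Sakalti/Saba1.5-1.5B with transformers; the plain-text Japanese prompt is an assumption, since the card does not specify a prompt format.

```python
# Minimal inference sketch (assumed plain-text prompting, FP16 weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sakalti/Saba1.5-1.5B")
model = AutoModelForCausalLM.from_pretrained(
    "Sakalti/Saba1.5-1.5B",
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```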

