This model was trained with the LoRA technique on Bulgarian recipe data from: https://www.kaggle.com/datasets/auhide/bulgarian-recipes-dataset/
The base model is a 4-bit quantized version of the 16-bit LLaMA 2 7B model.
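To give an intuition for what "4-bit quantized" means here, the sketch below shows a simplified absmax quantization scheme in NumPy. This is an illustration of the general idea only, not the exact NF4 scheme that libraries such as bitsandbytes apply to LLaMA weights:

```python
import numpy as np

# Simplified 4-bit absmax quantization (illustrative only; the actual
# quantization of this model uses a more sophisticated scheme).
# Each block of fp weights is mapped to signed 4-bit codes in [-8, 7]
# plus one shared floating-point scale factor.

def quantize_4bit(weights: np.ndarray):
    """Quantize a block of weights to signed 4-bit integer codes."""
    scale = np.max(np.abs(weights)) / 7.0  # absmax scaling
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_4bit(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate floating-point weights from the codes."""
    return q.astype(np.float32) * scale

# Example: four fp32 weights shrink to four 4-bit codes + one scale.
w = np.array([0.12, -0.50, 0.33, 0.07], dtype=np.float32)
q, scale = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale)
print("codes:", q, "max reconstruction error:", np.max(np.abs(w - w_hat)))
```

Storing each weight in 4 bits instead of 16 cuts the memory footprint of the 7B base model roughly fourfold, at the cost of the small reconstruction error shown above.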