---
language:
- ko
- en
library_name: transformers
pipeline_tag: text-generation
---
This model was developed by KAIST ALIN Lab and OMNIOUS.AI (Hyunseok Lee, Taeyoung Kim).
**Input**
The model accepts text input only.
**Output**
The model generates text output only.
**Model Architecture**
ko-en-llama2-13b-aligned is an auto-regressive language model based on the LLaMA2 transformer architecture.
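Because the model exposes the standard `transformers` text-generation interface, it can presumably be loaded as a causal LM. A minimal inference sketch follows; the repository id is a placeholder, not confirmed by this card.

```python
# Minimal inference sketch -- the repo id below is assumed; substitute the actual
# Hugging Face repository id for ko-en-llama2-13b-aligned.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ko-en-llama2-13b-aligned"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # load 13B weights in fp16
    device_map="auto",
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```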
**Base Model**
hyunseoki/ko-en-llama2-13b
**Training Dataset**
Open datasets from Wikipedia and AI Hub (English + Korean).
Supervised fine-tuned on an instruction dataset and aligned with a human-preference dataset using DPO.
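As an illustration of the DPO alignment step, the sketch below uses the `trl` library's `DPOTrainer` on a toy preference pair. This is not the authors' training script; the dataset, hyperparameters, and output directory are assumptions, and exact argument names (e.g. `processing_class` vs. `tokenizer`) vary across `trl` versions.

```python
# Hypothetical DPO alignment sketch with trl (not the actual training code).
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_id = "hyunseoki/ko-en-llama2-13b"  # SFT checkpoint to align
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Preference data: each record pairs a prompt with a preferred ("chosen")
# and a dispreferred ("rejected") response.
pref_data = Dataset.from_list([
    {"prompt": "서울을 한 문장으로 소개해 주세요.",
     "chosen": "서울은 대한민국의 수도이자 최대 도시입니다.",
     "rejected": "모르겠습니다."},
])

config = DPOConfig(
    output_dir="ko-en-llama2-13b-dpo",  # assumed output path
    beta=0.1,                           # DPO preference-strength coefficient
    per_device_train_batch_size=1,
    num_train_epochs=1,
)

trainer = DPOTrainer(
    model=model,
    args=config,
    train_dataset=pref_data,
    processing_class=tokenizer,
)
trainer.train()
```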