---
language: fa
tags:
- question-answering
- llama3
- Persian
- QA
license: apache-2.0
model_name: Llama-3.1-PersianQA
---

Model Card for Llama-3.1-PersianQA

Model Description

The Llama-3.1-PersianQA model is a fine-tuned version of Llama 3.1 for Persian question-answering tasks. Given a context passage and a question in Persian, it is designed to return an accurate answer grounded in that context. It has been trained on a Persian-language QA dataset to enhance its ability to understand and generate responses in Persian.

Intended Use

This model is intended for use in applications requiring Persian language question answering. It can be integrated into chatbots, virtual assistants, and other systems where users interact in Persian and need accurate responses to their questions based on a given context.

Use Cases

  • Customer Support: Automate responses to customer queries in Persian.
  • Educational Tools: Provide assistance and answer questions on Persian-language educational platforms.
  • Content Retrieval: Extract relevant information from Persian texts based on user queries.

Training Data

The model was fine-tuned on a Persian question-answering dataset, which includes various domains and topics to ensure generalization across different types of questions. The dataset used for training contains question-context pairs and corresponding answers in Persian.
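The exact schema of the training data is not published with this card; question-context-answer triples of this kind are commonly stored in SQuAD-style records. The sketch below illustrates that layout using the example passage from this card, with field names and the answer span as assumptions:

```python
# Illustrative SQuAD-style record; field names and the answer span are
# assumptions, since the card does not publish the dataset schema.
context = (
    "شرکت فولاد مبارکۀ اصفهان، بزرگ‌ترین واحد صنعتی خصوصی در ایران و "
    "بزرگ‌ترین مجتمع تولید فولاد در خاورمیانه است."
)
answer = "اصفهان"  # "Isfahan" — a span taken verbatim from the context above

record = {
    "id": "example-0001",
    "context": context,
    "question": "شرکت فولاد مبارکه در کجا واقع شده است؟",  # "Where is Mobarakeh Steel Company located?"
    "answers": {
        "text": [answer],
        "answer_start": [context.index(answer)],  # character offset of the span in context
    },
}

def is_valid(rec: dict) -> bool:
    """Check that every answer span actually occurs at its recorded offset."""
    ctx = rec["context"]
    return all(
        ctx[start:start + len(text)] == text
        for text, start in zip(rec["answers"]["text"],
                               rec["answers"]["answer_start"])
    )
```

Storing character offsets alongside the answer text lets training code recover the exact span inside the context, which is how extractive QA datasets are typically validated.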

Model Architecture

  • Base Model: Llama 3.1 (8.03B parameters; GGUF builds available in 4-bit and 16-bit)
  • Task: Question Answering
  • Language: Persian

Performance

The model has been evaluated on a set of Persian QA benchmarks and performs well across standard metrics, though performance may vary depending on the specific domain and the nature of the questions.
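The card does not report concrete scores; extractive QA systems are conventionally scored with exact match (EM) and token-level F1 against a reference answer. A minimal sketch of those two metrics, using simple whitespace tokenization in place of proper Persian text normalization:

```python
from collections import Counter

def exact_match(pred: str, gold: str) -> float:
    """1.0 if prediction and reference are identical after trimming, else 0.0."""
    return float(pred.strip() == gold.strip())

def f1(pred: str, gold: str) -> float:
    """Token-level F1 between prediction and reference.

    Uses whitespace tokenization for simplicity; a real Persian evaluation
    would normalize characters (e.g. ی/ي, ک/ك) and punctuation first.
    """
    pred_tokens, gold_tokens = pred.split(), gold.split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

EM rewards only verbatim matches, while F1 gives partial credit for overlapping tokens, which is why both are usually reported together.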

How to Use

You can use the Llama-3.1-PersianQA model with the Hugging Face transformers library. Here is sample code to get started:

```python
from transformers import pipeline

# Load the model
qa_pipeline = pipeline("question-answering", model="zpm/Llama-3.1-PersianQA")

# Example usage
# Context: "Mobarakeh Steel Company of Isfahan is the largest private
# industrial unit in Iran and the largest steel production complex in
# the Middle East."
context = "شرکت فولاد مبارکۀ اصفهان، بزرگ‌ترین واحد صنعتی خصوصی در ایران و بزرگ‌ترین مجتمع تولید فولاد در خاورمیانه است."
# Question: "Where is Mobarakeh Steel Company located?"
question = "شرکت فولاد مبارکه در کجا واقع شده است؟"

result = qa_pipeline(question=question, context=context)
print(result)
```
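Because Llama 3 is a causal language model, an alternative to the question-answering pipeline is to run the model as text generation, combining the context and question into a single prompt. The template below is an illustrative assumption, not the template the model was actually fine-tuned with:

```python
def build_qa_prompt(context: str, question: str) -> str:
    """Combine a Persian context and question into one generation prompt.

    The instruction wording and layout here are assumptions; adapt them
    to whatever prompt format the model was fine-tuned with.
    """
    return (
        "با توجه به متن زیر به پرسش پاسخ بده.\n"  # "Answer the question based on the text below."
        f"متن: {context}\n"                        # "Text: ..."
        f"پرسش: {question}\n"                      # "Question: ..."
        "پاسخ:"                                    # "Answer:"
    )

prompt = build_qa_prompt(
    "شرکت فولاد مبارکۀ اصفهان، بزرگ‌ترین واحد صنعتی خصوصی در ایران است.",
    "شرکت فولاد مبارکه در کجا واقع شده است؟",
)
```

The resulting string can then be passed to a `pipeline("text-generation", model="zpm/Llama-3.1-PersianQA")` call, with the model's completion after the final "پاسخ:" ("Answer:") marker taken as the answer.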