Tasks: Text Classification
Modalities: Text
Formats: json
Languages: Luxembourgish
Size: 10K - 100K
License:
# Dataset Card for Luxembourgish Entailment-based Topic classification via Zero-shot learning (LETZ)
## Dataset Summary
The datasets for **L**uxembourgish **E**ntailment-based **T**opic classification via **Z**ero-shot learning (**LETZ**) can be used to adapt language models to zero-shot classification in Luxembourgish. They leverage data from the [*Luxembourg Online Dictionary*](https://lod.lu) to provide relevant topic classification examples in Luxembourgish. The LETZ datasets were created to address the limitations of using Natural Language Inference (NLI) datasets for zero-shot classification in low-resource languages. Specifically, they aim to improve topic classification performance by providing more relevant and accessible data through dictionary entries.
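The entailment-based zero-shot setup that LETZ targets can be sketched as follows: each candidate topic label is paired with the input text, a model scores how strongly the text entails the label, and the highest-scoring label is predicted. Below is a minimal sketch; the keyword-overlap scorer and the English label set are illustrative stand-ins, not the actual LETZ models or labels:

```python
def entailment_score(text: str, label: str) -> float:
    """Stand-in for a fine-tuned entailment model's relevance score.

    This toy version just counts label words that appear in the text;
    a real setup would instead query a model fine-tuned on LETZ pairs.
    """
    return sum(word.lower() in text.lower() for word in label.split())


def zero_shot_classify(text: str, labels: list[str]) -> str:
    """Pick the label whose entailment score against the text is highest."""
    return max(labels, key=lambda label: entailment_score(text, label))


labels = ["sport", "music", "politics"]
print(zero_shot_classify("The team won the sport championship", labels))
```

In practice the scorer is the fine-tuned language model; the argmax-over-candidate-labels step stays the same regardless of how the entailment score is produced.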
## Columns in the Dataset
Each dataset includes the following columns:
* **Label**: The potentially associated topic label.
* **Class**: A binary indicator where “1” denotes relevance (entailment) and “0” denotes irrelevance (non-entailment).
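The Label/Class encoding above can be illustrated by constructing such pairs from a small labeled example: one positive pair with the true topic (Class = 1) and one negative pair with a different, randomly sampled topic (Class = 0). The column name `Sentence`, the topic names, and the example sentence below are hypothetical placeholders, not actual dataset entries:

```python
import random


def build_pairs(sentence: str, true_topic: str, all_topics: list[str],
                rng: random.Random) -> list[dict]:
    """Build one positive (Class=1) and one negative (Class=0) pair,
    mirroring the Label/Class columns described above."""
    negative = rng.choice([t for t in all_topics if t != true_topic])
    return [
        {"Sentence": sentence, "Label": true_topic, "Class": 1},
        {"Sentence": sentence, "Label": negative, "Class": 0},
    ]


topics = ["Sport", "Musek", "Politik"]
pairs = build_pairs("Den FC Esch huet gewonnen.", "Sport", topics,
                    random.Random(0))
for p in pairs:
    print(p["Label"], p["Class"])
```

A model trained on such binary pairs can then score any unseen (text, label) combination, which is what enables the zero-shot use of the dataset.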

## Dataset Description

- **Repository:** [fredxlpy/LETZ](https://github.com/fredxlpy/LETZ)
- **Paper:** [Forget NLI, Use a Dictionary: Zero-Shot Topic Classification for Low-Resource Languages with Application to Luxembourgish (Philippy et al., SIGUL-WS 2024)](https://aclanthology.org/2024.sigul-1.13/)

## Citation Information
```
@inproceedings{philippy-etal-2024-forget,
address = "Torino, Italia",
publisher = "ELRA and ICCL",
url = "https://aclanthology.org/2024.sigul-1.13",
pages = "97--104"
}
```