---
license: apache-2.0
language:
- tr
tags:
- deprem-clf-v1
metrics:
- accuracy
- recall
- f1
library_name: transformers
pipeline_tag: text-classification
model-index:
- name: deprem_v12
  results:
  - task:
      type: text-classification
    dataset:
      type: deprem_private_dataset_v1_2
      name: deprem_private_dataset_v1_2
    metrics:
    - type: recall
      value: 0.75
      verified: false
    - type: f1
      value: 0.75
      verified: false
widget:
- text: >-
    HATAY DEFNE İLÇESİNE yardımlar gitmiyor Özellikle çadıra battaniyeye yiyeceğe ihtiyaç var. Antakyanın dışında olduğu için tüm yardimlar
    İSKENDERUNA ANTAKYAYA gidiyor. Bu bölgeye gitmiyor..DEFNE İLÇESİNE GİDECEK ERZAK Çadır yardımlarını
  example_title: Örnek
---
- **Train-Test Set:** `intent-multilabel-v1-2.zip`
- **Model:** `dbmdz/bert-base-turkish-cased`
## Tokenizer Params
```
max_length=128
padding="max_length"
truncation=True
```
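A minimal tokenization sketch with these parameters, assuming the tokenizer of the `dbmdz/bert-base-turkish-cased` checkpoint is loaded through `transformers.AutoTokenizer` (the helper name is illustrative):
```
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")

def tokenize_batch(texts):
    # Pad or truncate every example to a fixed length of 128 tokens.
    return tokenizer(
        texts,
        max_length=128,
        padding="max_length",
        truncation=True,
        return_tensors="pt",
    )
```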
## Training Params
```
evaluation_strategy = "epoch"
save_strategy = "epoch"
per_device_train_batch_size = 16
per_device_eval_batch_size = 16
num_train_epochs = 4
load_best_model_at_end = True
```
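A sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` value is an assumption, not taken from the original run:
```
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deprem_v12",          # assumed output path
    evaluation_strategy="epoch",      # evaluate at the end of every epoch
    save_strategy="epoch",            # checkpoint on the same cadence
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=4,
    load_best_model_at_end=True,      # reload the best epoch's checkpoint after training
)
```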
## Train-Val Splitting Configuration
```
from sklearn.model_selection import train_test_split

# 90/10 train-validation split with a fixed seed (split variable names are illustrative)
df_tr, df_val = train_test_split(df_train,
                                 test_size=0.1,
                                 random_state=1111)
```
## Class Loss Weights
- **Alakasiz:** 1.0
- **Barinma:** 1.5167249178108022
- **Elektronik:** 1.7547338578655642
- **Giysi:** 1.9610520059358458
- **Kurtarma:** 1.269341370129623
- **Lojistik:** 1.8684086209021484
- **Saglik:** 1.8019018017117145
- **Su:** 2.110648663094536
- **Yagma:** 3.081208739200435
- **Yemek:** 1.7994815143101963
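One common way to apply such per-class weights in a multi-label setup is a `BCEWithLogitsLoss` with `pos_weight` inside a custom `Trainer`; the sketch below assumes that approach and is not the original training code (class order follows the list above, weights rounded):
```
import torch
from transformers import Trainer

# Per-class weights in label order (Alakasiz ... Yemek), rounded from the list above.
CLASS_WEIGHTS = torch.tensor([
    1.0000, 1.5167, 1.7547, 1.9611, 1.2693,
    1.8684, 1.8019, 2.1106, 3.0812, 1.7995,
])

class WeightedTrainer(Trainer):  # illustrative subclass
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        loss_fct = torch.nn.BCEWithLogitsLoss(
            pos_weight=CLASS_WEIGHTS.to(outputs.logits.device)
        )
        loss = loss_fct(outputs.logits, labels.float())
        return (loss, outputs) if return_outputs else loss
```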
## Training Log (Class-Scaled)
```
Epoch Training Loss Validation Loss
1 No log 0.216295
2 0.260000 0.171498
3 0.142700 0.175608
4 0.142700 0.169851
```
## Threshold Optimization
- **Best Threshold:** 0.15
- **F1 @ Threshold:** 0.7503
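A sketch of the kind of sweep that yields such a threshold, assuming sigmoid probabilities and binary labels on the validation set and macro F1 from `sklearn.metrics.f1_score` (function and variable names are illustrative):
```
import numpy as np
from sklearn.metrics import f1_score

def find_best_threshold(probs, labels, grid=np.arange(0.05, 0.95, 0.05)):
    """Return the probability cut-off that maximizes macro F1 across all classes."""
    best_t, best_f1 = 0.5, 0.0
    for t in grid:
        preds = (probs >= t).astype(int)
        score = f1_score(labels, preds, average="macro", zero_division=0)
        if score > best_f1:
            best_t, best_f1 = t, score
    return best_t, best_f1
```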
## Eval Results
```
precision recall f1-score support
Alakasiz 0.91 0.87 0.89 734
Barinma 0.85 0.81 0.83 207
Elektronik 0.72 0.78 0.75 130
Giysi 0.73 0.67 0.70 94
Kurtarma 0.86 0.81 0.83 362
Lojistik 0.68 0.56 0.62 112
Saglik 0.72 0.81 0.76 108
Su 0.61 0.69 0.65 78
Yagma 0.67 0.65 0.66 31
Yemek 0.79 0.85 0.82 117
micro avg 0.82 0.81 0.81 1973
macro avg 0.75 0.75 0.75 1973
weighted avg 0.83 0.81 0.81 1973
samples avg 0.84 0.84 0.83 1973
```
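A report in this layout can be reproduced with `sklearn.metrics.classification_report` on predictions binarized at the tuned 0.15 threshold; the arrays below are random placeholders standing in for the real validation outputs:
```
import numpy as np
from sklearn.metrics import classification_report

LABELS = ["Alakasiz", "Barinma", "Elektronik", "Giysi", "Kurtarma",
          "Lojistik", "Saglik", "Su", "Yagma", "Yemek"]

# Placeholder arrays standing in for sigmoid outputs and gold labels on 1973 samples.
rng = np.random.default_rng(0)
probs = rng.random((1973, len(LABELS)))
gold = rng.integers(0, 2, size=(1973, len(LABELS)))

# Binarize with the tuned threshold, then print a per-class report.
preds = (probs >= 0.15).astype(int)
print(classification_report(gold, preds, target_names=LABELS, zero_division=0))
```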