---
license: apache-2.0
language:
- tr
library_name: transformers
pipeline_tag: text-classification
---
# Akbank Hackathon: DisasterTech - Our Contribution
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/62bdd8065f304e8ea762287f/raHCZDUuHPckwwrKDRz-A.png)
---
## 🎯 Introduction
**Akbank LAB** and **imece** teamed up to launch the **Akbank Hackathon: DisasterTech**, an event for innovators using technology to improve disaster management and relief. The hackathon began online on October 14 and concluded at the Sabancı Center on October 22, with teams developing solutions for disaster alerts, preparedness, and post-disaster assistance.
Our team answered that call, and this repository contains the solution we built during the event.
For an in-depth look at the hackathon, feel free to visit [Akbank Hackathon: DisasterTech](https://www.akbanklab.com/tr/akbank-hackathon-disastertech#section-4).
---
### 🌪️ **Disaster Management Classification Overview** 🚨
📊 Our model, boasting a commendable accuracy of **89.09%**, is adept at swiftly classifying textual data into pivotal categories, proving invaluable during crisis management and relief efforts.
- 🏠 **Shelter Needs (Barınma İhtiyacı)**
- 🔌 **Electricity Source (Elektrik Kaynağı)**
- 💧 **Water Needs (Su İhtiyacı)**
- 🍲 **Food Needs (Yemek İhtiyacı)**
- 🚧 **Debris Removal Alerts (Enkaz Kaldırma İhbarı)**
- 🚑 **Emergency Health Assistance Requests (Acil Sağlık Yardımı Talebi)**
Our vigilant model doesn't stop there:
- ❌ It discerns non-relevant alerts, categorizing them as **Unrelated Reports (Alakasız İhbar)**.
- ⚠️ It stays alert to potential threats, recognizing **Looting Incident Reports (Yağma Olay Bildirimi)**.
Whether it's about ensuring 🚚 logistical support, 👕 clothing provisions, or 🔥 heating essentials, our model stands as a holistic solution for discerning and categorizing diverse requirements amidst disaster scenarios.
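As a rough sketch, the eleven categories above can be collected into a Python label map. Note this ordering is an assumption for illustration only; in practice the authoritative id-to-label mapping should be read from `model.config.id2label` after loading the model:

```python
# Illustrative label set for the classifier; the true id -> label mapping
# should be taken from model.config.id2label at load time (order assumed here).
LABELS = [
    "Alakasız İhbar", "Barınma İhtiyacı", "Elektrik Kaynağı",
    "Enkaz Kaldırma İhbarı", "Giysi İhtiyacı", "Isınma İhtiyacı",
    "Lojistik Destek Talebi", "Acil Sağlık Yardımı Talebi",
    "Su İhtiyacı", "Yağma Olay Bildirimi", "Yemek İhtiyacı",
]
id2label = dict(enumerate(LABELS))

print(len(id2label))  # 11 classes, matching the classification report
```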
---
## 📊 Model Performance & Usage
In this document, you can find detailed insights regarding our classification model's performance.
- 🤗 [View Model on Hugging Face](https://huggingface.co/tarikkaankoc7/zeltech-akbank-hackathon)
#### 🎯 Overall Accuracy
- **Accuracy Metric**: 📈 89.09%
## 📝 Classification Report
| Class | Precision | Recall | F1-Score | Support |
|--------------------|-----------|--------|----------|---------|
| Alakasız İhbar | 0.90 | 0.92 | 0.91 | 327 |
| Barınma İhtiyacı | 0.90 | 0.90 | 0.90 | 124 |
| Elektrik Kaynağı | 0.82 | 0.93 | 0.87 | 58 |
| Enkaz Kaldırma İhbarı | 0.88 | 0.85 | 0.86 | 202 |
| Giysi İhtiyacı | 0.88 | 0.80 | 0.84 | 45 |
| Isınma İhtiyacı | 0.94 | 0.90 | 0.92 | 171 |
| Lojistik Destek Talebi | 0.90 | 0.86 | 0.88 | 63 |
| Acil Sağlık Yardımı Talebi | 0.88 | 0.82 | 0.85 | 34 |
| Su İhtiyacı | 0.86 | 0.91 | 0.89 | 220 |
| Yağma Olay Bildirimi | 1.00 | 1.00 | 1.00 | 15 |
| Yemek İhtiyacı | 0.90 | 0.88 | 0.89 | 226 |
| **Total/Avg** | **0.89** | **0.89**| **0.89** | **1485**|
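The **Total/Avg** row can be reproduced from the per-class scores as a support-weighted average. A small sanity check, with the F1 scores and supports copied from the table above:

```python
# Per-class F1 scores and supports, copied from the classification report above
f1_scores = [0.91, 0.90, 0.87, 0.86, 0.84, 0.92, 0.88, 0.85, 0.89, 1.00, 0.89]
supports  = [327, 124, 58, 202, 45, 171, 63, 34, 220, 15, 226]

# Weighted average: each class's F1 weighted by how many test examples it has
total = sum(supports)
weighted_f1 = sum(f * s for f, s in zip(f1_scores, supports)) / total

print(total)                   # 1485
print(round(weighted_f1, 2))  # 0.89
```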
## 🖥️ How to use the model
Here is a Python example demonstrating how to use the model to predict the class of a given text:
```python
from transformers import BertTokenizer, BertForSequenceClassification
from torch.nn.functional import softmax
import torch
model_name = "tarikkaankoc7/zeltech-akbank-hackathon"
model = BertForSequenceClassification.from_pretrained(model_name)
tokenizer = BertTokenizer.from_pretrained(model_name)
model.eval()
def predict(text):
    # Tokenize the input and run a forward pass without gradient tracking
    inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True, max_length=512)
    with torch.no_grad():
        outputs = model(**inputs)
    # Convert logits to probabilities and pick the most likely class
    probs = softmax(outputs.logits, dim=-1)
    predicted_class_id = torch.argmax(probs, dim=-1).item()
    predicted_class_name = model.config.id2label[predicted_class_id]
    return predicted_class_name
text = "Hatay/Antakya odabaşı atatürk bulvarı ahmet gürses apartmanı arkadasım ilayda kürkçü enkaz altında paylaşır mısınız"
predicted_class_name = predict(text)
print(f"Predicted Class: {predicted_class_name}")
```
## Expected Output:
```bash
Predicted Class: Enkaz Kaldırma İhbarı
```
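The decoding step inside `predict` (a softmax over the logits followed by an argmax) can be illustrated with plain Python. The logits and the three-class label map below are hypothetical, chosen only to show the mechanics:

```python
import math

# Hypothetical logits for three of the model's classes (illustrative only)
logits = [0.1, 2.5, -1.0]
id2label = {0: "Alakasız İhbar", 1: "Enkaz Kaldırma İhbarı", 2: "Su İhtiyacı"}

# Softmax: exponentiate each logit and normalize so the scores sum to 1
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# Argmax picks the most probable class id, which maps to a label
predicted_id = max(range(len(probs)), key=probs.__getitem__)
print(id2label[predicted_id])  # Enkaz Kaldırma İhbarı
```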
### 📘 **Step-by-Step Notebook** 🚀
For a step-by-step guide to using the model, see our example notebook:
🔗 [Check out the DisasterTech_BERT_Classification Notebook!](https://github.com/Zeltech-Akbank/DisasterTech_BERT_Classification/blob/main/DisasterTech_BERT_Classification.ipynb)
## 🖋️ Authors
- **Şeyma SARIGIL** - [📧 Email](mailto:seymasargil@gmail.com)
- **Tarık Kaan KOÇ** - [📧 Email](mailto:tarikkaan1koc@gmail.com)
- **Alaaddin Erdinç DAL** - [📧 Email](mailto:aerdincdal@icloud.com)
- **Anıl YAĞIZ** - [📧 Email](mailto:anill.yagiz@gmail.com) |