---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- xshubhamx/legal-bert-base-uncased
- xshubhamx/InLegalBERT
---
## Metrics
- loss: 0.6616
- accuracy: 0.8443
- precision: 0.8473
- recall: 0.8443
- precision_macro: 0.8200
- recall_macro: 0.7913
- macro_fpr: 0.0134
- weighted_fpr: 0.0130
- weighted_specificity: 0.9802
- macro_specificity: 0.9883
- weighted_sensitivity: 0.8443
- macro_sensitivity: 0.7913
- f1_micro: 0.8443
- f1_macro: 0.7980
- f1_weighted: 0.8435
- runtime: 28.7426
- samples_per_second: 44.9160
- steps_per_second: 5.6360
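
The specificity and false-positive-rate entries above are not standard scikit-learn outputs; they are typically derived from per-class (one-vs-rest) confusion matrices. The sketch below shows one way such values can be computed; `y_true` and `y_pred` are placeholder arrays, not the evaluation data behind this card.

```python
# Sketch only: deriving macro specificity / FPR from per-class confusion matrices.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    f1_score,
    multilabel_confusion_matrix,
    precision_score,
    recall_score,
)

y_true = np.array([0, 1, 2, 2, 1, 0])  # placeholder labels
y_pred = np.array([0, 1, 2, 1, 1, 0])  # placeholder predictions

accuracy = accuracy_score(y_true, y_pred)
f1_macro = f1_score(y_true, y_pred, average="macro")
f1_weighted = f1_score(y_true, y_pred, average="weighted")
precision_macro = precision_score(y_true, y_pred, average="macro")
recall_macro = recall_score(y_true, y_pred, average="macro")

# Specificity and FPR are read off each class's one-vs-rest 2x2 confusion matrix.
mcm = multilabel_confusion_matrix(y_true, y_pred)  # shape: (n_classes, 2, 2)
tn, fp = mcm[:, 0, 0], mcm[:, 0, 1]
macro_specificity = np.mean(tn / (tn + fp))
macro_fpr = np.mean(fp / (fp + tn))

print(f1_macro, macro_specificity, macro_fpr)
```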
# legal-InLegal-merge-ties
legal-InLegal-merge-ties is a TIES merge of the following models, created with [mergekit](https://github.com/cg123/mergekit):
* [xshubhamx/legal-bert-base-uncased](https://huggingface.co/xshubhamx/legal-bert-base-uncased)
* [xshubhamx/InLegalBERT](https://huggingface.co/xshubhamx/InLegalBERT)
## 🧩 Configuration
```yaml
models:
  - model: xshubhamx/legal-bert-base-uncased
    parameters:
      density: 0.5
      weight: 0.5
  - model: xshubhamx/InLegalBERT
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: xshubhamx/legal-bert-base-uncased
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
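
## 💻 Usage

The configuration above is a standard mergekit TIES setup (it can be re-run with mergekit's `mergekit-yaml` CLI, given the config file and an output directory). Below is a minimal inference sketch, assuming the merged checkpoint is published as `xshubhamx/legal-InLegal-merge-ties` (repo id inferred from this card's title) and retains the sequence-classification head of its parent models; swap in `AutoModel` if the checkpoint is a plain encoder.

```python
# Minimal sketch: load the merged model for text classification with 🤗 Transformers.
# The repo id and the presence of a classification head are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "xshubhamx/legal-InLegal-merge-ties"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "The appellant challenges the order passed by the High Court."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = int(logits.argmax(dim=-1))
print(model.config.id2label.get(predicted_class_id, predicted_class_id))
```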