metadata
library_name: setfit
tags:
  - setfit
  - absa
  - sentence-transformers
  - text-classification
  - generated_from_setfit_trainer
base_model: sentence-transformers/all-MiniLM-L6-v2
metrics:
  - accuracy
widget:
  - text: >-
      hp:game yg grafiknya standar boros batrai bikin hp cepat panas game
      satunya brawlstar ga
  - text: >-
      game:game cocok indonesia gw main game dibilang berat squad buster
      jaringan game berat bagus squad buster main koneksi terputus koneksi aman
      aman aja mohon perbaiki jaringan
  - text: >-
      sinyal:prmainannya bagus sinyal diperbaiki maen game online gak bagus2 aja
      pingnya eh maen squad busters jaringannya hilang2 pas match klok sinyal
      udah hilang masuk tulisan server konek muat ulang gak masuk in game saran
      tolong diperbaiki ya min klok grafik gameplay udah bagus
  - text: >-
      saran semoga game:gamenya bagus kendala game nya kadang kadang suka
      jaringan jaringan bagus saran semoga game nya ditingkatkan disaat update
  - text: >-
      gameplay:gameplay nya bagus gk match nya optimal main kadang suka lag gitu
      sinyal nya bagus tolong supercell perbaiki sinyal
pipeline_tag: text-classification
inference: false
model-index:
  - name: SetFit Aspect Model with sentence-transformers/all-MiniLM-L6-v2
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: Unknown
          type: unknown
          split: test
        metrics:
          - type: accuracy
            value: 0.8316929133858267
            name: Accuracy

SetFit Aspect Model with sentence-transformers/all-MiniLM-L6-v2

This is a SetFit model that can be used for Aspect Based Sentiment Analysis (ABSA). It uses sentence-transformers/all-MiniLM-L6-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head. Within the ABSA pipeline, this particular model is responsible for filtering aspect span candidates.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

This model was trained within the context of a larger system for ABSA, which consists of the following steps (a minimal inference sketch follows the list):

  1. Use a spaCy model to select possible aspect span candidates.
  2. Use this SetFit model to filter these possible aspect span candidates.
  3. Use a SetFit model to classify the filtered aspect span candidates.
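
As a rough orientation, the sketch below shows how these three components come together at inference time through SetFit's AbsaModel wrapper. The spacy_model argument value (en_core_web_sm) is an illustrative assumption; the card does not state which spaCy model was used for candidate extraction.

from setfit import AbsaModel

# Step 1 is handled by the spaCy model that proposes span candidates;
# steps 2 and 3 are the two SetFit models loaded below.
model = AbsaModel.from_pretrained(
    "Funnyworld1412/ABSA_mpnet_MiniLM-L6-aspect",    # step 2: filter span candidates
    "Funnyworld1412/ABSA_mpnet_MiniLM-L6-polarity",  # step 3: classify the filtered spans
    spacy_model="en_core_web_sm",                    # step 1: propose candidates (assumed name)
)

# One list of {"span": ..., "polarity": ...} dicts is expected per input text.
preds = model.predict(["gameplay nya bagus gk match nya optimal main kadang suka lag"])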

Model Details

Model Description
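
  • Model Type: SetFit (aspect filtering model for ABSA)
  • Sentence Transformer body: sentence-transformers/all-MiniLM-L6-v2
  • Classification head: LogisticRegression instance
  • SetFitABSA Aspect Model: Funnyworld1412/ABSA_mpnet_MiniLM-L6-aspect
  • SetFitABSA Polarity Model: Funnyworld1412/ABSA_mpnet_MiniLM-L6-polarity
  • Number of Classes: 2 (aspect, no aspect)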

Model Sources
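
  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055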

Model Labels

Label: aspect
  • 'pencarian lawan:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'game:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'bugnya:bugnya nakal banget y coc cr aja sukanya ngebug pas match suka hitam match relog kalo udah relog lawan udah 1 2 mahkota kecewa sih bintang nya 1 aja bug nya diurus bintang lawannya kadang g setara levelnya dahlah gk suka banget kalo main 2 vs 2 temen suka banget afk coba fitur report'
Label: no aspect
  • 'player trophy mahkotanya jaraknya:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'peleton akun perbedaan level:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'y coc cr:bugnya nakal banget y coc cr aja sukanya ngebug pas match suka hitam match relog kalo udah relog lawan udah 1 2 mahkota kecewa sih bintang nya 1 aja bug nya diurus bintang lawannya kadang g setara levelnya dahlah gk suka banget kalo main 2 vs 2 temen suka banget afk coba fitur report'
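
Each example above is the candidate span, a colon, and the full review it was drawn from (span:review); the widget inputs in the metadata use the same encoding.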

Evaluation

Metrics

Label: all
Accuracy: 0.8317 (aspect filtering accuracy on the test split)

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit
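
For ABSA the spaCy dependency is also required; installing the ABSA extras should pull it in (this assumes the setfit[absa] extra shipped with SetFit 1.0.x), and a spaCy pipeline must be available for span candidate extraction:

pip install "setfit[absa]"
python -m spacy download en_core_web_sm  # example pipeline only; pick one suited to your text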

Then you can load this model and run inference.

from setfit import AbsaModel

# Download from the 🤗 Hub
model = AbsaModel.from_pretrained(
    "Funnyworld1412/ABSA_mpnet_MiniLM-L6-aspect",
    "Funnyworld1412/ABSA_mpnet_MiniLM-L6-polarity",
)
# Run inference
preds = model("The food was great, but the venue is just way too busy.")
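
The result is expected to be one list of {'span': ..., 'polarity': ...} dicts per input sentence (assuming the SetFit 1.0.x AbsaModel output format). Since the training reviews are Indonesian, the English example sentence above is only the template default; a spaCy pipeline better suited to Indonesian can be passed at load time via the spacy_model argument.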

Training Details

Training Set Metrics

Word count per training sample: min 2, median 29.9357, max 80

Training samples per label:
  • no aspect: 3834
  • aspect: 1266

Training Hyperparameters

  • batch_size: (4, 4)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 5
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
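
As a rough illustration, these hyperparameters map onto SetFit's TrainingArguments and AbsaTrainer as sketched below. The dataset path and the spaCy model name are assumptions for illustration only (the card lists the dataset as "Unknown"); values not set here keep their library defaults.

from datasets import load_dataset
from setfit import AbsaModel, AbsaTrainer, TrainingArguments

# Hypothetical training data: an ABSA training set is expected to provide
# "text", "span", "label" and "ordinal" columns.
train_dataset = load_dataset("json", data_files="reviews_absa.jsonl", split="train")

# Start from the same base embedding model; the spaCy model name is an assumption.
model = AbsaModel.from_pretrained(
    "sentence-transformers/all-MiniLM-L6-v2",
    spacy_model="en_core_web_sm",
)

# Hyperparameters taken from the list above; everything else keeps its default.
args = TrainingArguments(
    batch_size=(4, 4),
    num_epochs=(1, 1),
    num_iterations=5,
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    warmup_proportion=0.1,
    seed=42,
)

trainer = AbsaTrainer(model, args=args, train_dataset=train_dataset)
trainer.train()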

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.2801 -
0.0039 50 0.2365 -
0.0078 100 0.1068 -
0.0118 150 0.3401 -
0.0157 200 0.2112 -
0.0196 250 0.3529 -
0.0235 300 0.2338 -
0.0275 350 0.2039 -
0.0314 400 0.2006 -
0.0353 450 0.2939 -
0.0392 500 0.2053 -
0.0431 550 0.2036 -
0.0471 600 0.2229 -
0.0510 650 0.105 -
0.0549 700 0.2222 -
0.0588 750 0.1815 -
0.0627 800 0.2915 -
0.0667 850 0.276 -
0.0706 900 0.1682 -
0.0745 950 0.2328 -
0.0784 1000 0.2422 -
0.0824 1050 0.2753 -
0.0863 1100 0.2292 -
0.0902 1150 0.0791 -
0.0941 1200 0.3849 -
0.0980 1250 0.0964 -
0.1020 1300 0.1612 -
0.1059 1350 0.2755 -
0.1098 1400 0.1133 -
0.1137 1450 0.038 -
0.1176 1500 0.3195 -
0.1216 1550 0.0091 -
0.1255 1600 0.3148 -
0.1294 1650 0.1693 -
0.1333 1700 0.2411 -
0.1373 1750 0.2463 -
0.1412 1800 0.2807 -
0.1451 1850 0.112 -
0.1490 1900 0.2623 -
0.1529 1950 0.2465 -
0.1569 2000 0.4591 -
0.1608 2050 0.0556 -
0.1647 2100 0.0962 -
0.1686 2150 0.4525 -
0.1725 2200 0.2674 -
0.1765 2250 0.1513 -
0.1804 2300 0.3457 -
0.1843 2350 0.1415 -
0.1882 2400 0.0454 -
0.1922 2450 0.0156 -
0.1961 2500 0.2741 -
0.2 2550 0.1334 -
0.2039 2600 0.1838 -
0.2078 2650 0.1346 -
0.2118 2700 0.1022 -
0.2157 2750 0.3999 -
0.2196 2800 0.0953 -
0.2235 2850 0.1201 -
0.2275 2900 0.111 -
0.2314 2950 0.1081 -
0.2353 3000 0.1926 -
0.2392 3050 0.1047 -
0.2431 3100 0.2367 -
0.2471 3150 0.2034 -
0.2510 3200 0.0824 -
0.2549 3250 0.0338 -
0.2588 3300 0.2468 -
0.2627 3350 0.0082 -
0.2667 3400 0.0023 -
0.2706 3450 0.1106 -
0.2745 3500 0.1315 -
0.2784 3550 0.004 -
0.2824 3600 0.0836 -
0.2863 3650 0.2716 -
0.2902 3700 0.1873 -
0.2941 3750 0.4066 -
0.2980 3800 0.1448 -
0.3020 3850 0.0137 -
0.3059 3900 0.3471 -
0.3098 3950 0.1144 -
0.3137 4000 0.0596 -
0.3176 4050 0.0377 -
0.3216 4100 0.3316 -
0.3255 4150 0.0709 -
0.3294 4200 0.0515 -
0.3333 4250 0.2029 -
0.3373 4300 0.1191 -
0.3412 4350 0.2397 -
0.3451 4400 0.492 -
0.3490 4450 0.1178 -
0.3529 4500 0.3647 -
0.3569 4550 0.0098 -
0.3608 4600 0.2114 -
0.3647 4650 0.2392 -
0.3686 4700 0.2194 -
0.3725 4750 0.0578 -
0.3765 4800 0.0771 -
0.3804 4850 0.1582 -
0.3843 4900 0.0643 -
0.3882 4950 0.1372 -
0.3922 5000 0.0308 -
0.3961 5050 0.1247 -
0.4 5100 0.3076 -
0.4039 5150 0.1152 -
0.4078 5200 0.2112 -
0.4118 5250 0.0042 -
0.4157 5300 0.0869 -
0.4196 5350 0.0196 -
0.4235 5400 0.2406 -
0.4275 5450 0.3306 -
0.4314 5500 0.2328 -
0.4353 5550 0.008 -
0.4392 5600 0.0388 -
0.4431 5650 0.3812 -
0.4471 5700 0.6268 -
0.4510 5750 0.4426 -
0.4549 5800 0.1407 -
0.4588 5850 0.297 -
0.4627 5900 0.2657 -
0.4667 5950 0.1767 -
0.4706 6000 0.0152 -
0.4745 6050 0.2344 -
0.4784 6100 0.0447 -
0.4824 6150 0.0675 -
0.4863 6200 0.3086 -
0.4902 6250 0.5258 -
0.4941 6300 0.0826 -
0.4980 6350 0.0079 -
0.5020 6400 0.1817 -
0.5059 6450 0.0767 -
0.5098 6500 0.0221 -
0.5137 6550 0.0419 -
0.5176 6600 0.2452 -
0.5216 6650 0.0232 -
0.5255 6700 0.0804 -
0.5294 6750 0.1752 -
0.5333 6800 0.0127 -
0.5373 6850 0.0454 -
0.5412 6900 0.1759 -
0.5451 6950 0.0435 -
0.5490 7000 0.0109 -
0.5529 7050 0.0162 -
0.5569 7100 0.0133 -
0.5608 7150 0.2363 -
0.5647 7200 0.4987 -
0.5686 7250 0.1149 -
0.5725 7300 0.4613 -
0.5765 7350 0.3837 -
0.5804 7400 0.2439 -
0.5843 7450 0.0014 -
0.5882 7500 0.0177 -
0.5922 7550 0.0051 -
0.5961 7600 0.0418 -
0.6 7650 0.0061 -
0.6039 7700 0.2205 -
0.6078 7750 0.1769 -
0.6118 7800 0.0071 -
0.6157 7850 0.2271 -
0.6196 7900 0.3049 -
0.6235 7950 0.0016 -
0.6275 8000 0.2263 -
0.6314 8050 0.0057 -
0.6353 8100 0.1408 -
0.6392 8150 0.0303 -
0.6431 8200 0.0026 -
0.6471 8250 0.1743 -
0.6510 8300 0.2078 -
0.6549 8350 0.1764 -
0.6588 8400 0.0127 -
0.6627 8450 0.2435 -
0.6667 8500 0.0527 -
0.6706 8550 0.247 -
0.6745 8600 0.002 -
0.6784 8650 0.0087 -
0.6824 8700 0.1866 -
0.6863 8750 0.0087 -
0.6902 8800 0.1589 -
0.6941 8850 0.1848 -
0.6980 8900 0.0298 -
0.7020 8950 0.0081 -
0.7059 9000 0.3057 -
0.7098 9050 0.2059 -
0.7137 9100 0.2154 -
0.7176 9150 0.0013 -
0.7216 9200 0.1961 -
0.7255 9250 0.0129 -
0.7294 9300 0.0021 -
0.7333 9350 0.2106 -
0.7373 9400 0.0008 -
0.7412 9450 0.1261 -
0.7451 9500 0.1948 -
0.7490 9550 0.013 -
0.7529 9600 0.208 -
0.7569 9650 0.2382 -
0.7608 9700 0.0054 -
0.7647 9750 0.1869 -
0.7686 9800 0.0334 -
0.7725 9850 0.0197 -
0.7765 9900 0.0057 -
0.7804 9950 0.0056 -
0.7843 10000 0.0043 -
0.7882 10050 0.0025 -
0.7922 10100 0.6808 -
0.7961 10150 0.043 -
0.8 10200 0.0536 -
0.8039 10250 0.2435 -
0.8078 10300 0.0051 -
0.8118 10350 0.0653 -
0.8157 10400 0.017 -
0.8196 10450 0.0036 -
0.8235 10500 0.1561 -
0.8275 10550 0.001 -
0.8314 10600 0.1975 -
0.8353 10650 0.2378 -
0.8392 10700 0.1276 -
0.8431 10750 0.0719 -
0.8471 10800 0.1951 -
0.8510 10850 0.0446 -
0.8549 10900 0.2045 -
0.8588 10950 0.0598 -
0.8627 11000 0.0094 -
0.8667 11050 0.1117 -
0.8706 11100 0.0528 -
0.8745 11150 0.0047 -
0.8784 11200 0.1492 -
0.8824 11250 0.2204 -
0.8863 11300 0.0089 -
0.8902 11350 0.0709 -
0.8941 11400 0.1111 -
0.8980 11450 0.0048 -
0.9020 11500 0.0173 -
0.9059 11550 0.2862 -
0.9098 11600 0.2745 -
0.9137 11650 0.0054 -
0.9176 11700 0.0074 -
0.9216 11750 0.0036 -
0.9255 11800 0.0869 -
0.9294 11850 0.2333 -
0.9333 11900 0.15 -
0.9373 11950 0.066 -
0.9412 12000 0.1742 -
0.9451 12050 0.0009 -
0.9490 12100 0.1246 -
0.9529 12150 0.1674 -
0.9569 12200 0.1937 -
0.9608 12250 0.0724 -
0.9647 12300 0.0044 -
0.9686 12350 0.0013 -
0.9725 12400 0.0313 -
0.9765 12450 0.0925 -
0.9804 12500 0.1742 -
0.9843 12550 0.2294 -
0.9882 12600 0.1073 -
0.9922 12650 0.038 -
0.9961 12700 0.1866 -
1.0 12750 0.0141 0.2274

Framework Versions

  • Python: 3.10.13
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.1
  • spaCy: 3.7.5
  • Transformers: 4.36.2
  • PyTorch: 2.1.2
  • Datasets: 2.19.2
  • Tokenizers: 0.15.2

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}