
SetFit with sentence-transformers/all-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer (a conceptual sketch of this second step follows below).
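
Conceptually, the second step reduces to fitting a lightweight classifier on top of sentence embeddings. The sketch below illustrates that idea in isolation with the base encoder and scikit-learn; the example texts and labels are made-up placeholders, not drawn from the actual training set, and the real pipeline first fine-tunes the encoder with contrastive learning (a fuller training sketch appears under Training Hyperparameters).

from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Illustrative placeholder data, not taken from this model's training set
texts = [
    "Editor crashes when opening large JSON files",
    "Please add a setting to hide the minimap by default",
    "How do I change the integrated terminal font size?",
]
labels = ["bug", "feature", "question"]

# Step 2 in isolation: embed the texts, then fit a LogisticRegression head.
# (SetFit would first fine-tune this encoder contrastively, omitted here.)
encoder = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
embeddings = encoder.encode(texts)
head = LogisticRegression()
head.fit(embeddings, labels)

print(head.predict(encoder.encode(["Window freezes after the latest update"])))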

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 3 (bug, feature, question)

Model Sources

  • Paper: https://arxiv.org/abs/2209.11055 (Efficient Few-Shot Learning Without Prompts)
  • Repository: https://github.com/huggingface/setfit

Model Labels

Label Examples
feature
  • 'Add Accessibility View and Accessibility Help Menu for Copilot Inline Suggestions Testing #186214\r\n\r\nIf you already have an issue for future plans feel free to close this one.\r\nI think the Copilot Inline (Ghost Text) Suggestions could also profit from the Accessibility View. Currently users are depending on the Copilot View on the side which might get deprecated in the future. And I think we need a native solution for this, so IntelliCode also profits.\r\nI am not sure how exactly to integrate this with the Inline Suggestion Hover.\r\n\r\nA potential start would be. Whenever there is an Inline Suggestion, we play an audio cue (as we do today), and a user can open Accessibility View to inspect the exact suggestion.\r\n\r\nfyi @hediet\r\n'
  • 'Windows is ignore terminal titles unless they look like paths From @NoelAbrahams in https://github.com/microsoft/vscode/issues/172821#issuecomment-1712935578\n\n> I've been trying to programmatically set the title of a bash terminal without success, and glad that I chanced upon this issue.\n> \n> Here is the repro:\n> \n> System: Windows 11.\n> Git Bash installed.\n> VSCode version: 1.82.0\n> \n> VSCode Workspace Settings:\n> \n> image\n> \n> \n> .bashrc\n> shell\n> echo -en "\\\\033]0;New terminal title\\\\a"\n> \n> \n> The title of the bash terminal window remains stubbornly unchanged as bash myworkspacefolder.\n> \n> I'm assuming this issue will fix my use case.\n> \n> Thanks\n\n---\n\nThis section is setting it to undefined:\n\nhttps://github.com/microsoft/vscode/blob/41e940f76f5deda197bc5930b044c55607ba1cbc/src/vs/workbench/contrib/terminal/browser/terminalInstance.ts#L1933-L1946\n\nWe want to keep the path trimming behavior above, but only when it looks like a path.'
  • "Toggle Inline Diff not discoverable in inline chat Followup to https://github.com/microsoft/vscode/issues/185040#issuecomment-1612399427\r\n\r\nI spent quite some time messing with the inline chat preview mode setting to get the diff to appear, without success. It didn't occur to me to look in the Discard dropdown for the Toggle Inline Diff action, since I don't associate diffing with discarding. Have we considered surfacing this another way, maybe as a separate icon or through a keybinding with some helper text below the result?"
question
  • 'error \nType: Bug\n\n\n\nUser\nMicrosoft Windows [Versión 10.0.22621.1992]\n(c) Microsoft Corporation. Todos los derechos reservados.\n\nC:\\Users\\primo\\OneDrive\\Escritorio\\otro\\ClonTwitter>rails s\n=> Booting Puma\n=> Rails 7.0.6 application starting in development \n=> Run bin/rails server --help for more startup options\n*** SIGUSR2 not implemented, signal based restart unavailable!\n*** SIGUSR1 not implemented, signal based restart unavailable!\n*** SIGHUP not implemented, signal based logs reopening unavailable!\nPuma starting in single mode...\n* Version 5.0.0 (ruby 3.1.3-p185), codename: Spoony Bard\n* Min threads: 5, max threads: 5\n* Environment: development\nExiting\nC:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:242:in initialize\': Only one usage of each socket address (protocol/network address/port) is normally permitted. - bind(2) for "127.0.0.1" port 3000 (Errno::EADDRINUSE)\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:242:in new'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:242:in add_tcp_listener\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:236:in block in add_tcp_listener'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:235:in each\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:235:in add_tcp_listener'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:122:in block in parse\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:106:in each'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/binder.rb:106:in parse\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/runner.rb:137:in load_and_bind'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/single.rb:43:in run\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/puma/launcher.rb:171:in run'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/puma-5.0.0/lib/rack/handler/puma.rb:71:in run\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/rack-2.2.7/lib/rack/server.rb:327:in start'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/commands/server/server_command.rb:38:in start\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/commands/server/server_command.rb:143:in block in perform'\n from internal:kernel:90:in tap\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/commands/server/server_command.rb:134:in perform'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/thor-1.2.2/lib/thor/command.rb:27:in run\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/thor-1.2.2/lib/thor/invocation.rb:127:in invoke_command'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/thor-1.2.2/lib/thor.rb:392:in dispatch\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/command/base.rb:87:in perform'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/command.rb:48:in invoke\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/railties-7.0.6/lib/rails/commands.rb:18:in
    '\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/bootsnap-1.16.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in require\'\n from C:/Ruby31-x64/lib/ruby/gems/3.1.0/gems/bootsnap-1.16.0/lib/bootsnap/load_path_cache/core_ext/kernel_require.rb:32:in require'\n from bin/rails:4:in `
    '\n\nVS Code version: Code 1.80.2 (2ccd690cbff1569e4a83d7c43d45101f817401dc, 2023-07-27T20:40:28.909Z)\nOS version: Windows_NT x64 10.0.22621\nModes:\n\n
    \nSystem Info\n\n
bug

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("setfit_model_id")
# Run inference
preds = model("""Notebook toolbar foreground color cannot be modified by custom styles: The unavailable foreground color has been marked with a red arrow, please see the image

Expected behavior:
Notebook toolbar foreground color can be modified through custom styles.

Unexpected behavior:
Notebook toolbar foreground color cannot be modified through custom styles.

VS Code Version: 1.81 | 1.82
OS Version: Windows10
""")

Training Details

Training Set Metrics

Training set Min Median Max
Word count 5 118.4567 1482

Label Training Sample Count
bug 200
feature 200
question 200

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
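
As a rough sketch of how the hyperparameters above map onto a training run, the snippet below wires a subset of them into setfit's TrainingArguments and Trainer. The tiny inline dataset is a placeholder for illustration only and is not the dataset this model was trained on.

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot dataset with "text" and "label" columns
train_dataset = Dataset.from_dict({
    "text": [
        "Crash when opening a workspace with 10k files",
        "Add a command to duplicate the current editor group",
        "Why does the debugger skip my breakpoints?",
    ],
    "label": ["bug", "feature", "question"],
})

# Start from the base Sentence Transformer; SetFit attaches a LogisticRegression head by default
model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

args = TrainingArguments(
    batch_size=(16, 2),                 # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    num_iterations=20,
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()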

Training Results

Epoch Step Training Loss Validation Loss
0.0007 1 0.2896 -
0.0067 10 0.262 -
0.0133 20 0.2299 -
0.02 30 0.2345 -
0.0267 40 0.235 -
0.0333 50 0.2213 -
0.04 60 0.3084 -
0.0467 70 0.2107 -
0.0533 80 0.1596 -
0.06 90 0.1916 -
0.0667 100 0.2366 -
0.0733 110 0.1513 -
0.08 120 0.1281 -
0.0867 130 0.2217 -
0.0933 140 0.1859 -
0.1 150 0.1835 -
0.1067 160 0.1312 -
0.1133 170 0.1415 -
0.12 180 0.1287 -
0.1267 190 0.1377 -
0.1333 200 0.1116 -
0.14 210 0.0769 -
0.1467 220 0.0548 -
0.1533 230 0.0647 -
0.16 240 0.0348 -
0.1667 250 0.0165 -
0.1733 260 0.0043 -
0.18 270 0.0038 -
0.1867 280 0.0673 -
0.1933 290 0.0458 -
0.2 300 0.0048 -
0.2067 310 0.0054 -
0.2133 320 0.0019 -
0.22 330 0.0052 -
0.2267 340 0.0103 -
0.2333 350 0.0163 -
0.24 360 0.0022 -
0.2467 370 0.0009 -
0.2533 380 0.0006 -
0.26 390 0.001 -
0.2667 400 0.0011 -
0.2733 410 0.0005 -
0.28 420 0.0007 -
0.2867 430 0.0006 -
0.2933 440 0.0005 -
0.3 450 0.0012 -
0.3067 460 0.0006 -
0.3133 470 0.0004 -
0.32 480 0.0006 -
0.3267 490 0.0009 -
0.3333 500 0.001 -
0.34 510 0.0003 -
0.3467 520 0.0003 -
0.3533 530 0.0005 -
0.36 540 0.0002 -
0.3667 550 0.0004 -
0.3733 560 0.0603 -
0.38 570 0.0014 -
0.3867 580 0.0007 -
0.3933 590 0.0005 -
0.4 600 0.0004 -
0.4067 610 0.0053 -
0.4133 620 0.0002 -
0.42 630 0.0002 -
0.4267 640 0.0008 -
0.4333 650 0.0001 -
0.44 660 0.0002 -
0.4467 670 0.0001 -
0.4533 680 0.0002 -
0.46 690 0.0002 -
0.4667 700 0.0001 -
0.4733 710 0.0003 -
0.48 720 0.0001 -
0.4867 730 0.0001 -
0.4933 740 0.0002 -
0.5 750 0.0001 -
0.5067 760 0.0002 -
0.5133 770 0.0002 -
0.52 780 0.0001 -
0.5267 790 0.0001 -
0.5333 800 0.0001 -
0.54 810 0.0001 -
0.5467 820 0.0002 -
0.5533 830 0.0001 -
0.56 840 0.0001 -
0.5667 850 0.0001 -
0.5733 860 0.0002 -
0.58 870 0.0001 -
0.5867 880 0.0002 -
0.5933 890 0.0002 -
0.6 900 0.0002 -
0.6067 910 0.0001 -
0.6133 920 0.0001 -
0.62 930 0.0001 -
0.6267 940 0.0001 -
0.6333 950 0.0001 -
0.64 960 0.0001 -
0.6467 970 0.0001 -
0.6533 980 0.0001 -
0.66 990 0.0001 -
0.6667 1000 0.0001 -
0.6733 1010 0.0001 -
0.68 1020 0.0001 -
0.6867 1030 0.0001 -
0.6933 1040 0.0001 -
0.7 1050 0.0001 -
0.7067 1060 0.0002 -
0.7133 1070 0.0001 -
0.72 1080 0.0001 -
0.7267 1090 0.0001 -
0.7333 1100 0.0001 -
0.74 1110 0.0002 -
0.7467 1120 0.0001 -
0.7533 1130 0.0001 -
0.76 1140 0.0001 -
0.7667 1150 0.0001 -
0.7733 1160 0.0001 -
0.78 1170 0.0001 -
0.7867 1180 0.0001 -
0.7933 1190 0.0002 -
0.8 1200 0.0001 -
0.8067 1210 0.0001 -
0.8133 1220 0.0001 -
0.82 1230 0.0001 -
0.8267 1240 0.0 -
0.8333 1250 0.0 -
0.84 1260 0.0002 -
0.8467 1270 0.0001 -
0.8533 1280 0.0001 -
0.86 1290 0.0001 -
0.8667 1300 0.0001 -
0.8733 1310 0.0001 -
0.88 1320 0.0001 -
0.8867 1330 0.0 -
0.8933 1340 0.0001 -
0.9 1350 0.0001 -
0.9067 1360 0.0001 -
0.9133 1370 0.0001 -
0.92 1380 0.0001 -
0.9267 1390 0.0001 -
0.9333 1400 0.0001 -
0.94 1410 0.0 -
0.9467 1420 0.0001 -
0.9533 1430 0.0001 -
0.96 1440 0.0001 -
0.9667 1450 0.0001 -
0.9733 1460 0.0001 -
0.98 1470 0.0001 -
0.9867 1480 0.0001 -
0.9933 1490 0.0001 -
1.0 1500 0.0001 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.1
  • Transformers: 4.39.0
  • PyTorch: 2.3.0+cu121
  • Datasets: 2.20.0
  • Tokenizers: 0.15.2
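
To approximately reproduce this environment, the listed versions can be pinned at install time. The PyTorch build above is a CUDA 12.1 wheel, so installing an exactly matching torch build may require PyTorch's own package index; the plain torch==2.3.0 pin below is the closest PyPI equivalent.

pip install setfit==1.0.3 sentence-transformers==3.0.1 transformers==4.39.0 datasets==2.20.0 tokenizers==0.15.2 torch==2.3.0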

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}