---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 'Alexis it Doesn’t Have To End Georgiawas invaded by Russia and lost its territoryof
    Ossetia and Abkhazia. What did USAdo? It condemned the invasion by issuinga statement.
    George Bush and Putin, bothguests at Beijing Olympic opening ceremony,argued.
    Georgia appreciates.

    '
- text: 'DLI believe she also married Aristotle Onassis, who owned the world''s largest
    private shipping fleet -- that may have helped finance her other life choices...

    '
- text: 'Remember watching this movie with my wife as newly weds in 1995. Wonderful
    evergreen film. Shahrukh was the son every father wants. And every girl wants
    as a boyfriend or husband. True love. The relationship between Anupam Kher and
    his son Shahrukh is pleasant and different than usual Punjabi father-son distant
    relationships. Music is beautiful! My children love this movie as well. I could
    watch it anytime-does not seem old or dated. Thank you Yash Chopra, Aditya Chopra,
    Shahrukh, Kajol and all of the team who brought us this beautiful human drama!

    '
- text: 'In the photo of the D''Alesandro family with Pres. Kennedy, I think it is
    telling that Mrs. D''Alesandro is doing the "adoring" look at Mr. D''Alesandro.
    Par for the course for a 1961 pol''s wife.Meanwhile their 21-year-old daughter
    Nancy already has her piercing eyes unabashedly fixed right on Kennedy. You can
    almost see her thinking, "This powerful man can do great things for the country.
    How do I get there?"And she did get there -- to within a couple heartbeats of
    the Presidency, and arguably a position far more powerful and effective over her
    career than if she''d taken a term in the White House.

    '
- text: 'Why is it that grown men feel free to do these sorts of things to young girls
    and that societies tolerate it?  Why is the girl the one who is put on trial instead
    of the man/men who are responsible for what they did to her?  Why is her life
    ruined?  Why are women forced to prove their virtue over and over after they''ve
    been sexually assaulted by a husband, a relative, a male friend, or a stranger?  The
    worst of all is that the girls, who are too young to marry, can still become pregnant
    and be forced to carry the pregnancy to term.  What does it do to both the children
    when one is the result of rape?  How does one deal with a child who exists through
    no fault of its own? We know this happens all over the world.  It happens here
    too.  Even if we''re a rich country and have "enlightened" attitudes, when we
    deny women of any age the right to control their reproductive lives, we are showing
    exactly how little we think of women.  On a personal note, my parents didn''t
    want to have me when they did.  When I was 16 my mother told me, in a fit of anger,
    that if it weren''t for the abortion laws (in the 1950s) I wouldn''t be here.  But
    I was not a child of rape.  I can''t imagine how that feels for the victim or
    the child (who is also a victim).  Is the answer education for both boys and girls?  Or
    is it forcing a real change in the attitudes societies have towards half of their
    population, the half that does much of the caring, loving, and raising of children?

    '
inference: true
model-index:
- name: SetFit with sentence-transformers/all-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.9
      name: Accuracy
---

# SetFit with sentence-transformers/all-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 384 tokens
- **Number of Classes:** 2 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| yes   | <ul><li>'There is an epic, romantic story between Daniel Barenboim and Jacqueline du Pré (one of the greatest cellists of all time) that goes back to the late 1960’s. She was a disciple of the great Russian cellist Mstislav Rostropovich, who was so impressed with her immense talent that he viewed the much younger Ms. du Pré as his equal and successor.On Christmas Eve of 1966 Jacqueline  du Pré met Daniel Barenboim in London, promptly converted to Judaism and married him in Israel in 1967. They went on to record exquisite music together and thus became “the golden couple” of classical music at that time.For all the romantics out there, they left a trail of recordings which includes what I consider the best-ever performance of Robert Schumann’s Cello Concerto. The combination of the young Barenboim and du Pré, both not yet 30 years old, and Schumann, the great romantic, was stunning. The cello (a 1712 Stradivarius) seemed to come alive, speaking directly to the heart, Baremboim was equally impeccable, and we all cried from beauty so sublime. I am now 84, and still get misty when I play it.Tragically, du Pré died at the young age of 42, making this chapter of Mr. Baremboim’s life incredibly poignant. The recording lives on and is still available.\n'</li><li>'Santos was once married to a woman, despite being gay. Did he do that to obtain American citizenship?He received campaign money from a businessman, Andrew Intrater, who cultivated close links with a onetime Trump confidant and who is the cousin of a sanctioned Russian oligarch, Russian billionaire Viktor Vekselberg, who has been sanctioned by the U.S. government for his role in the Russian energy industry. according to video footage and court documents.Harbor City, the company Santos worked for and is under investigation for a money scheme, was able to land a $625,000 deposit from a company registered in Mississippi that identifies Intrater as its lone officer, according to an exhibit included in the SEC’s complaint against Harbor City.After Harbor City’s assets were frozen, and with assistance from a fellow former Harbor City employee, Santos in 2021 formed a company, the Devolder Organization, that paid him at least $3.5 million over the next two years, according to Florida business records and financial disclosure forms he filed as a candidate. Santos loaned his campaign more than $700,000 but did not report any income from Harbor City despite having been paid by the company as recently as April 2021.Did that money come from Harbor City’s ponzu scheme or did it come from Russia through Intrater and is Santos in the pocket of Russia?Lots we don’t know, lots to investigate.\n'</li><li>"Yes, indeed, making close friends at work is a wonderful idea. I met a woman at work 48 years ago and we became great friends. She and her husband invited me to dinner one evening to meet an engineer who worked with her husband.  They both thought we might like each other.  They were certainly right about that.  We were engaged 3 months later and married three months after that.  We'll be celebrating our 47th wedding anniversary the end of this month.  Yup, close friends at work can be wonderful!\n"</li></ul> |
| no    | <ul><li>'Not surprisingly, this is one of the most astute columns I\'ve read recently about the ubiquity of guns in America and lack of common sense gun control laws. I\'ve experienced a situation where I saw a guy with a holstered gun on his hip walking toward the entry of a grocery where I was intending to go. (There was no indication at all that he was a member of law enforcement.) His whole posture was one of intimidation and when I perceived that I turned right around and left for a different store. Was my reaction fear? Instinctively it certainly was, so I took precaution. And as Bouie points out, I was deprived of my freedom: my choice and ability to shop at that store without fear, and so a forced resignation and imposed requirement that I change my shopping plans. (I think it\'s noteworthy too that the only people I\'ve seen open carry have all been white men. I\'ve never seen a black man open carry or a hispanic man, nor a woman. I think we probably know why: racism. If a black man walked into a store with a gun on his hip, in this country, he would immediately cause panic.)There is no reason why anyone needs to open carry in a public space unless they are law enforcement.Jokes have been made about the hubris of "duck & cover" drills from the 1950s-60s because of threat of nuclear war. Gun proliferation in America causes more death & greater threat to society than the possibility of nuclear war. The 2nd amendment needs to be amended to reflect common sense gun laws.\n'</li><li>'"At the same time, 45 percent said the pornography provided helpful information about sex. L.G.B.T.Q. teenagers, in particular, said it helped them discover more about their sexuality.“\'We have to be careful about saying all porn is good or bad,\' said Emily Rothman, a professor of community health sciences at Boston University. \'There is nuance here.\'”Gross. Somehow, since the beginning of time, young people, especially LGBTQ teens, have managed to discover more about their sexuality without themselves or all of us being inundated with pornography--and what we see today is not just porn but ubiquitous violence. Attitudes like Rothman\'s are why parents are fighting against school libraries offering sexuality explicit books about LGBTQ teens. You won\'t find sexually explicit books about straight sex in those libraries. There\'s no library market for those books. In the name of helping LGBTQ kids "discover" their sexuality, librarians and teachers justify exposing all teens to porn. Too much porn is too much porn. Because of all the porn, girls think it\'s normal for their boyfriends to choke them. Boys masterbate so often that they damage their brains\' abilities to regulate pleasure and wind up impotent. The normalization of porn has negatively impacted how younger people see relationships and marriage. Too much porn has also damaged how girls see themselves as embodied females.Enough. Justifying porn for teens as a tool for discovering sexuality hurts all teens.\n'</li><li>'CT1001  I hope that\'s not a rhetorical question, expecting "you don\'t" for an answer. Because people are doing it. Existing written records can reveal more than they ever intended about the lives of the oppressed... oral material can be looked at seriously... and "archeology" can merge smoothly into history if it involves, for instance, paying as much attention to the remnants of slave quarters, as to the slave-owners quarters... it\'s very appropriate to accuse the people who disappeared the slave quarters, while prettying up the owners residence as an attractive venue for weddings etc, during the hundred years of historical erasure that went on in this country.\n'</li></ul> |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.9      |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-9")
# Run inference
preds = model("DLI believe she also married Aristotle Onassis, who owned the world's largest private shipping fleet -- that may have helped finance her other life choices...")
```

<!--
### Downstream Use

*List how someone could finetune this model on their own dataset.*
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count   | 37  | 170.9  | 276 |

| Label | Training Sample Count |
|:------|:----------------------|
| no    | 18                    |
| yes   | 22                    |

### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 120
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
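These settings also account for the 600 optimization steps per epoch in the results table: with this sampling scheme, each of the 40 training samples contributes `num_iterations` positive and `num_iterations` negative sentence pairs. This arithmetic is an interpretation of SetFit's pair generation, not taken from the training logs:

```python
num_iterations = 120   # from the hyperparameters above
num_samples = 18 + 22  # "no" + "yes" training sample counts
batch_size = 16

total_pairs = 2 * num_iterations * num_samples  # positive + negative pairs
steps_per_epoch = total_pairs // batch_size
print(steps_per_epoch)  # 600
```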

### Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0017 | 1    | 0.5127        | -               |
| 0.0833 | 50   | 0.2133        | -               |
| 0.1667 | 100  | 0.0057        | -               |
| 0.25   | 150  | 0.0002        | -               |
| 0.3333 | 200  | 0.0001        | -               |
| 0.4167 | 250  | 0.0001        | -               |
| 0.5    | 300  | 0.0001        | -               |
| 0.5833 | 350  | 0.0001        | -               |
| 0.6667 | 400  | 0.0001        | -               |
| 0.75   | 450  | 0.0001        | -               |
| 0.8333 | 500  | 0.0001        | -               |
| 0.9167 | 550  | 0.0           | -               |
| 1.0    | 600  | 0.0           | -               |

### Framework Versions
- Python: 3.10.13
- SetFit: 1.1.0
- Sentence Transformers: 3.0.1
- Transformers: 4.45.2
- PyTorch: 2.4.0+cu124
- Datasets: 2.21.0
- Tokenizers: 0.20.0

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->