SetFit with mini1013/master_domain

This is a SetFit model that can be used for Text Classification. It uses mini1013/master_domain as the Sentence Transformer embedding model, with a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves two steps (a minimal conceptual sketch follows the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
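
The sketch below is a minimal, conceptual illustration of this two-step recipe using sentence-transformers and scikit-learn directly, not the SetFit training code itself; the texts and labels are hypothetical placeholders.

# Conceptual sketch only: step 1 (contrastive fine-tuning of the Sentence Transformer)
# is handled by SetFit's Trainer and is merely hinted at here.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

body = SentenceTransformer("mini1013/master_domain")   # embedding model (step 1 fine-tunes this body)
texts = ["placeholder product title A", "placeholder product title B"]  # hypothetical few-shot examples
labels = [0, 1]

features = body.encode(texts)                      # sentence embeddings from the body
head = LogisticRegression().fit(features, labels)  # step 2: train the classification head
print(head.predict(body.encode(["placeholder product title C"])))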

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: mini1013/master_domain
  • Base Model: klue/roberta-base
  • Classification head: a LogisticRegression instance
  • Number of Classes: 5

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
Label 2
  • '이니스프리 노세범 선쿠션 SPF50+ PA++++ 14g × 2개 (#M)위메프 > 뷰티 > 메이크업 > 베이스 메이크업 > 파우더/팩트 위메프 > 뷰티 > 메이크업 > 베이스 메이크업 > 파우더/팩트'
  • '스킨 세팅 톤업 선 쿠션(리필포함) + 추가구성품 톤업 선 쿠션 LotteOn > 백화점 > 뷰티 > 상단 배너 (Mobile) LotteOn > 뷰티 > 메이크업 > 베이스메이크업 > 쿠션/팩트'
  • '이니스프리 노세범 선쿠션 리필 14g 1 +1 (#M)쿠팡 홈>뷰티>스킨케어>선케어/태닝>선케어>선스틱 Coupang > 뷰티 > 로드샵 > 스킨케어 > 선케어/태닝'
Label 1
  • 'SUNDANCE 썬댄스 햇빛 차단+태닝 선스프레이 LSF 50, 200ml ssg > 뷰티 > 스킨케어 > 선케어 > 선스프레이 ssg > 뷰티 > 스킨케어 > 선케어 > 선스프레이'
  • '리더스 여름자외선 썬버디 올 오버 선 스프레이 180ml MinSellAmount (#M)화장품/향수>선케어>선스프레이 Gmarket > 뷰티 > 화장품/향수 > 선케어 > 선스프레이'
  • '온더바디 헬로키티 에코 썬 스프레이 120ml+120ml 기획세트 (#M)홈>화장품/미용>선케어>선케어세트 Naverstore > 화장품/미용 > 선케어 > 선케어세트'
Label 0
  • '[피지오겔] [정가 85,000원] 레드 수딩 AI 에어리 썬스틱 1+1 특별기획 롯데홈쇼핑 > 뷰티 > 남성화장품 LotteOn > 뷰티 > 남성화장품 > 선크림'
  • '[빌리프][2106] 해피 보 이지워시 선스틱 18g 세트(타임스퀘어점패션관) (#M)11st>선케어>선밤>선밤 11st > 뷰티 > 선케어 > 선밤 > 선밤'
  • '피지오겔 레드 수딩 AI 에어리 썬스틱 7g 1+1(2개) (#M)홈>스킨케어>선케어 HMALL > 뷰티 > 스킨케어 > 선케어'
Label 4
  • '오스트레일리안골드 헴프네이션 오리지널 탠 익스텐더 바디로션 535ml (#M)SSG.COM/스킨케어/선케어/태닝 ssg > 뷰티 > 스킨케어 > 선케어 > 태닝'
  • '수딩앤모이스처 알로에베라92%수딩젤300ml (#M)홈>화장품/미용>바디케어>바디로션 Naverstore > 화장품/미용 > 바디케어 > 바디로션'
  • '세인트 트로페즈 셀프 탠 익스프레스 어드밴스드 브론징 무스 200ml (#M)SSG.COM/스킨케어/선케어/태닝 ssg > 뷰티 > 스킨케어 > 선케어 > 태닝'
Label 3
  • '[맥퀸뉴욕] 1+ 1 UV 데일리 모이스처(수분) 선크림 1+1 UV 데일리 모이스처 선크림 (#M)SSG.COM/메이크업/립메이크업/립글로스 ssg > 뷰티 > 메이크업 > 아이메이크업 > 아이라이너'
  • '[공식] 더마비 10주년 바디로션/기획세트/멀티오일/프레쉬/크림/워시 1+1 S11.(애브리데이) 대용량 선블록 200ml×2개_S1.튜브견본(랜덤) 쇼킹딜 홈;쇼킹딜 홈>뷰티>바디/향수>바디케어;11st>뷰티>바디/향수>바디케어;11st>바디케어>바디로션>바디로션;11st > 뷰티 > 바디케어 > 바디로션 11st Hour Event > 패션/뷰티 > 뷰티 > 바디/향수 > 바디케어'
  • '[20%찜+T11%+묶음+당일 ] 롬앤 11번가 런칭! 모든 취향 취급 중! 밀크 그로서리 외 BEST 1+1 옵션31. 제로 선 클린 단품_01 프레쉬 쇼킹딜 홈>뷰티>선케어/메이크업>립/치크메이크업;11st>메이크업>립메이크업>립틴트;11st>뷰티>선케어/메이크업>립/치크메이크업;11st>뷰티>선케어/메이크업>아이메이크업;11st>메이크업>아이메이크업>마스카라;11st Hour Event > 패션/뷰티 > 뷰티 > 선케어/메이크업 > 립/치크메이크업 11st Hour Event > 패션/뷰티 > 뷰티 > 선케어/메이크업 > 아이메이크업'

Evaluation

Metrics

Label Accuracy
all 0.4903
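
A minimal sketch of how this accuracy could be recomputed, assuming a held-out evaluation set with gold label ids; the texts and labels below are hypothetical placeholders.

from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("mini1013/master_cate_bt_top8_test")
eval_texts = ["placeholder product title A", "placeholder product title B"]  # evaluation inputs
eval_labels = [2, 4]                                                         # gold label ids (0-4)
preds = model.predict(eval_texts)
print(accuracy_score(eval_labels, [int(p) for p in preds]))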

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_bt_top8_test")
# Run inference
preds = model("엔프라니 옴므 선블록 썬크림 남성용 선크림  (#M)화장품/미용>남성화장품>선크림 Naverstore > 화장품/미용 > 남성화장품 > 선크림")
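
The call above returns the predicted label id (0-4 for this model). For batch prediction or per-class probabilities from the LogisticRegression head, a minimal sketch (the titles below are taken from the label examples above):

texts = [
    "이니스프리 노세범 선쿠션 리필 14g 1 +1",
    "세인트 트로페즈 셀프 탠 익스프레스 어드밴스드 브론징 무스 200ml",
]
preds = model.predict(texts)        # predicted label ids
probs = model.predict_proba(texts)  # per-class probabilities from the LogisticRegression head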

Training Details

Training Set Metrics

Training set Min Median Max
Word count 11 21.656 72
Label Training Sample Count
0 50
1 50
2 50
3 50
4 50

Training Hyperparameters

  • batch_size: (64, 64)
  • num_epochs: (30, 30)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 100
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
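
A hedged sketch of how these hyperparameters map onto the SetFit 1.x Trainer and TrainingArguments API; the training dataset here is a hypothetical placeholder with "text" and "label" columns.

from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical few-shot training set (the real one has 50 examples per label).
train_dataset = Dataset.from_dict({
    "text": ["placeholder product title A", "placeholder product title B"],
    "label": [0, 1],
})

model = SetFitModel.from_pretrained("mini1013/master_domain")

args = TrainingArguments(
    batch_size=(64, 64),
    num_epochs=(30, 30),
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()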

Training Results

Epoch Step Training Loss Validation Loss
0.0026 1 0.4513 -
0.1279 50 0.4435 -
0.2558 100 0.4063 -
0.3836 150 0.3413 -
0.5115 200 0.2997 -
0.6394 250 0.2434 -
0.7673 300 0.1724 -
0.8951 350 0.1334 -
1.0230 400 0.1078 -
1.1509 450 0.0997 -
1.2788 500 0.0937 -
1.4066 550 0.0933 -
1.5345 600 0.0909 -
1.6624 650 0.0897 -
1.7903 700 0.0842 -
1.9182 750 0.0741 -
2.0460 800 0.0764 -
2.1739 850 0.0745 -
2.3018 900 0.0733 -
2.4297 950 0.0748 -
2.5575 1000 0.0718 -
2.6854 1050 0.0568 -
2.8133 1100 0.0415 -
2.9412 1150 0.0256 -
3.0691 1200 0.0233 -
3.1969 1250 0.0128 -
3.3248 1300 0.0088 -
3.4527 1350 0.0066 -
3.5806 1400 0.0058 -
3.7084 1450 0.006 -
3.8363 1500 0.0058 -
3.9642 1550 0.0039 -
4.0921 1600 0.0043 -
4.2199 1650 0.0033 -
4.3478 1700 0.0059 -
4.4757 1750 0.0065 -
4.6036 1800 0.0061 -
4.7315 1850 0.0052 -
4.8593 1900 0.0054 -
4.9872 1950 0.0043 -
5.1151 2000 0.0064 -
5.2430 2050 0.0042 -
5.3708 2100 0.0046 -
5.4987 2150 0.0038 -
5.6266 2200 0.0031 -
5.7545 2250 0.0021 -
5.8824 2300 0.0006 -
6.0102 2350 0.0003 -
6.1381 2400 0.0001 -
6.2660 2450 0.0002 -
6.3939 2500 0.0 -
6.5217 2550 0.0 -
6.6496 2600 0.0001 -
6.7775 2650 0.0 -
6.9054 2700 0.0 -
7.0332 2750 0.0 -
7.1611 2800 0.0 -
7.2890 2850 0.0 -
7.4169 2900 0.0 -
7.5448 2950 0.0 -
7.6726 3000 0.0 -
7.8005 3050 0.0 -
7.9284 3100 0.0 -
8.0563 3150 0.0 -
8.1841 3200 0.0 -
8.3120 3250 0.0 -
8.4399 3300 0.0 -
8.5678 3350 0.0 -
8.6957 3400 0.0 -
8.8235 3450 0.0 -
8.9514 3500 0.0 -
9.0793 3550 0.0 -
9.2072 3600 0.0 -
9.3350 3650 0.0 -
9.4629 3700 0.0 -
9.5908 3750 0.0 -
9.7187 3800 0.0 -
9.8465 3850 0.0 -
9.9744 3900 0.0 -
10.1023 3950 0.0 -
10.2302 4000 0.0 -
10.3581 4050 0.0 -
10.4859 4100 0.0 -
10.6138 4150 0.0 -
10.7417 4200 0.0 -
10.8696 4250 0.0 -
10.9974 4300 0.0 -
11.1253 4350 0.0 -
11.2532 4400 0.0 -
11.3811 4450 0.0 -
11.5090 4500 0.0 -
11.6368 4550 0.0 -
11.7647 4600 0.0 -
11.8926 4650 0.0 -
12.0205 4700 0.0 -
12.1483 4750 0.0 -
12.2762 4800 0.0 -
12.4041 4850 0.0 -
12.5320 4900 0.0 -
12.6598 4950 0.0 -
12.7877 5000 0.0 -
12.9156 5050 0.0 -
13.0435 5100 0.0 -
13.1714 5150 0.0 -
13.2992 5200 0.0 -
13.4271 5250 0.0 -
13.5550 5300 0.0 -
13.6829 5350 0.0 -
13.8107 5400 0.0 -
13.9386 5450 0.0 -
14.0665 5500 0.0 -
14.1944 5550 0.0 -
14.3223 5600 0.0 -
14.4501 5650 0.0 -
14.5780 5700 0.0 -
14.7059 5750 0.0 -
14.8338 5800 0.0 -
14.9616 5850 0.0 -
15.0895 5900 0.0 -
15.2174 5950 0.0 -
15.3453 6000 0.0 -
15.4731 6050 0.0 -
15.6010 6100 0.0 -
15.7289 6150 0.0 -
15.8568 6200 0.0 -
15.9847 6250 0.0 -
16.1125 6300 0.0 -
16.2404 6350 0.0 -
16.3683 6400 0.0 -
16.4962 6450 0.0 -
16.6240 6500 0.0 -
16.7519 6550 0.0 -
16.8798 6600 0.0 -
17.0077 6650 0.0 -
17.1355 6700 0.0 -
17.2634 6750 0.0 -
17.3913 6800 0.0 -
17.5192 6850 0.0 -
17.6471 6900 0.0 -
17.7749 6950 0.0 -
17.9028 7000 0.0 -
18.0307 7050 0.0 -
18.1586 7100 0.0 -
18.2864 7150 0.0 -
18.4143 7200 0.0 -
18.5422 7250 0.0 -
18.6701 7300 0.0 -
18.7980 7350 0.0 -
18.9258 7400 0.0 -
19.0537 7450 0.0 -
19.1816 7500 0.0 -
19.3095 7550 0.0 -
19.4373 7600 0.0 -
19.5652 7650 0.0 -
19.6931 7700 0.0 -
19.8210 7750 0.0 -
19.9488 7800 0.0 -
20.0767 7850 0.0 -
20.2046 7900 0.0 -
20.3325 7950 0.0 -
20.4604 8000 0.0 -
20.5882 8050 0.0 -
20.7161 8100 0.0 -
20.8440 8150 0.0 -
20.9719 8200 0.0 -
21.0997 8250 0.0 -
21.2276 8300 0.0 -
21.3555 8350 0.0 -
21.4834 8400 0.0 -
21.6113 8450 0.0 -
21.7391 8500 0.0 -
21.8670 8550 0.0 -
21.9949 8600 0.0 -
22.1228 8650 0.0 -
22.2506 8700 0.0 -
22.3785 8750 0.0 -
22.5064 8800 0.0 -
22.6343 8850 0.0 -
22.7621 8900 0.0 -
22.8900 8950 0.0 -
23.0179 9000 0.0 -
23.1458 9050 0.0 -
23.2737 9100 0.0 -
23.4015 9150 0.0 -
23.5294 9200 0.0 -
23.6573 9250 0.0 -
23.7852 9300 0.0 -
23.9130 9350 0.0 -
24.0409 9400 0.0 -
24.1688 9450 0.0 -
24.2967 9500 0.0 -
24.4246 9550 0.0 -
24.5524 9600 0.0 -
24.6803 9650 0.0 -
24.8082 9700 0.0 -
24.9361 9750 0.0 -
25.0639 9800 0.0 -
25.1918 9850 0.0 -
25.3197 9900 0.0 -
25.4476 9950 0.0 -
25.5754 10000 0.0 -
25.7033 10050 0.0 -
25.8312 10100 0.0 -
25.9591 10150 0.0 -
26.0870 10200 0.0 -
26.2148 10250 0.0 -
26.3427 10300 0.0 -
26.4706 10350 0.0 -
26.5985 10400 0.0 -
26.7263 10450 0.0 -
26.8542 10500 0.0 -
26.9821 10550 0.0 -
27.1100 10600 0.0 -
27.2379 10650 0.0 -
27.3657 10700 0.0 -
27.4936 10750 0.0 -
27.6215 10800 0.0 -
27.7494 10850 0.0 -
27.8772 10900 0.0 -
28.0051 10950 0.0 -
28.1330 11000 0.0 -
28.2609 11050 0.0 -
28.3887 11100 0.0 -
28.5166 11150 0.0 -
28.6445 11200 0.0 -
28.7724 11250 0.0 -
28.9003 11300 0.0 -
29.0281 11350 0.0 -
29.1560 11400 0.0 -
29.2839 11450 0.0 -
29.4118 11500 0.0 -
29.5396 11550 0.0 -
29.6675 11600 0.0 -
29.7954 11650 0.0 -
29.9233 11700 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0
  • Sentence Transformers: 3.3.1
  • Transformers: 4.44.2
  • PyTorch: 2.2.0a0+81ea7a4
  • Datasets: 3.2.0
  • Tokenizers: 0.19.1
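
To approximate this environment, a hedged pip command pinning the listed releases (the PyTorch entry 2.2.0a0+81ea7a4 is a pre-release container build; the nearby stable 2.2.0 release is substituted as an assumption):

pip install "setfit==1.1.0" "sentence-transformers==3.3.1" "transformers==4.44.2" "datasets==3.2.0" "tokenizers==0.19.1" "torch==2.2.0"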

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}