---
base_model: biggy-smiley/bert-base-uncased-fibe-ST
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 146yearold brand more so with the 103 billion group generating 70 of its revenue from countries outside india this move is part of mistrys plan to make the salttosoftware conglomerate one of the top 25 mostadmired corporate brands globally how i can fit into any role that im asked to take up in a team game that is something ive always strived to achieve this is another challenge im looking forward to he stresses menswear retailer indian terrain which wasnt very aggressive on advertising plans to advertise on tv and increase its ad spends from 8 to 10 this year while panasonic is increasing its ad spends from rs 75 crore last year to rs 85 crore this year to tap the market
- text: and the same could have been extended to commercial real estatethe move is expected to help realty developers to clear their unsold stock there are around 545000 unsold units across the top seven cities priced up to rs 15 crore while an additional 49290 units are priced between rs 15 crore to rs 25 crore said anuj puri chairman anarock property consultantsoffloading inventory will also help real estate developers mobilise the muchneeded liquidity in a significant way as the announcement will help accelerate the pace of sales of affordable and midpriced segmentsas of september end developers had a lockedin capital of nearly rs 3700 crore with unsold inventory to the tune of more than 450000 units at various stages of construction across the top 7 cities said ramesh nair ceo country head jll indiaaccording to tax and legal experts the relaxation for the sector by increasing the threshold to 20 for notified circle value comes with its own ridersthe changes proposed will be introduced by an amendment and it will be important to see how the changes are in fact
introduced such that the intent is well drafted and incorporated in the law said pranay bhatia partner and leader tax and regulatory services bdo indiathe governmentsponsored special window for completion of construction of affordable and midincome housing projects swamih i has so far approved 135 projects with an outlay of rs 13200 crore said finance minister nirmala sitharaman while announcing the relief measures on thursdaythis investment through the sbicap venturesmanaged lastmile fund for stressed real estate projects will result in completion of 87000 stuck houses across the country - text: the latest episode of naagin 3 begins with sumitra being upset with bela and mahir sumitra asks mahir what would happen if yuvi comes in front of belamahir asks sumitra if bela will leave him but she is sure that bela wont leave him ever bela decides to tell mahir everything she sees someone and starts speaking to him thinking it is mahirshe says that she loves him and hugs him suddenly she sees mahir standing with vish she is shocked yuvi turns this leaves bela and vish stunned yuvi says that he loves belavish wonders how yuvi could be alive mahir is furious as he sees bela and yuvi togetherhe breaks a glass and hurts himself mahir is annoyed as he recalls bela hugging yuvi mahir lashes out at bela saying that his biggest mistake was loving herbela tells mahir she needs to share something with him but mahir asks her to go awaybela tries to stop him but he walks away angrily poulomi asks yuvi where he was he says that he went with raavi but they met with an accident raavi is now in comabela tries to call mahir vish walks in and asks bela what is happening bela confesses she is in love with mahiryuvi leaves in a car and vish follows him ajithab questions vish about her real name and she says that she is an undercover police agent she sees yuvi driving deep into the jungleyuvi walks into the temple and vish watches him she realises that yuvi is actually vikrant vikrant takes 
his original form and recalls moments of how he got saved vish returns home as she is unable to see anythingvikrant pledges for revenge as he wants to put an end to the life of men who tried to kill himvikrant says that he hates humans and needs the nagmani to rule them belas mother calls aghori baba she tells him she needs to go but aghori refuses to let her - text: mumbai the state education department has asked state boardaffiliated schools and junior colleges to set up a committee of subject teachers to evaluate students of classes ix and xi in a circular the department has directed that all students be promoted and that schools not mention any other remarks on report cardsguidelines issued by the state on friday mentioned that schools and junior colleges can conduct assessments that do no require students to be physically present on the premises of the educational institutea committee of subject teachers from the school must discuss the assessment form and other criteria schools and junior colleges must give marks to students based on the assessment report cards must not have semesterwise breakup of scores but instead mention performance as decided by the school said the circularwhile schools can evaluate students they can only use scores as an indicator for student performance and all students must be promoted to the next classthis year all students of classes i to ix and xi from state boardaffiliated schools will be promoted to the next class as most schools remained shut for most part of the year due to the covid19 pandemicwhile the state is yet to make a decision regarding the board examination for classes x and xii which are slated to begin later this month the demand to postpone or cancel the exams continued to grow - text: marvels midnight suns far and away the most underrated strategy game and possible just the most underappreciated game of 2022 more broadly is free to play from now until sunday march 26 at midnight pdttheres no better time than now to 
try marvels midnight suns play the game at no cost to you this weekend on playstation 5 and xbox series xs from now until mar 26 at 1159 pt pictwittercomt0arvnlilymarch 23 2023see moremaybe people are turned off by card games or maybe theyre burnt out by marvel games after that one from a few years back that rhymes with larvels revengers but the postlaunch hype around marvels midnight suns seems to have petered out pretty quickly and thats a shame because its really quite good our 45star midnight suns review praises the intricate turnbased combat system thats a marked departure from firaxiss earlier effort xcom as well as the rich histories and colorful personalities of its cast of heroes and the surprisingly deep gameplay elements tucked away at the abbey your heroes home basemarvels midnight suns works wonders to freshen up familiar marvel characters producing lively battles from focused turnbased systems then diving into their personalities and histories to reveal their intimate concerns reads our verdict the combat missions can feel a little sidelined by the sheer wealth of resource management tasks and relationship building but all the pieces serve a purpose within the richly detailed whole marvellous pretty muchits a good time to jump into midnight suns and not just because its free for the weekend firaxis has been dropping dlc characters at a steady clip and just this week morbius aka the living vampire was added to the roster you know what means and it rhymes with its blorbin thymecheck out our list of marvels midnight suns tips if you plan on playing soon
inference: true
model-index:
- name: SetFit with biggy-smiley/bert-base-uncased-fibe-ST
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.9277428571428571
      name: Accuracy
---

# SetFit with biggy-smiley/bert-base-uncased-fibe-ST

This is a [SetFit](https://github.com/huggingface/setfit) model that can be
used for Text Classification. This SetFit model uses [biggy-smiley/bert-base-uncased-fibe-ST](https://huggingface.co/biggy-smiley/bert-base-uncased-fibe-ST) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [biggy-smiley/bert-base-uncased-fibe-ST](https://huggingface.co/biggy-smiley/bert-base-uncased-fibe-ST)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 26 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------------------------------------------|:---------|
| news and politics | |
| healthy living | |
| music and audio | |
| business and finance | |
| pets | |
| shopping | |
| personal finance | |
| food and drinks | |
| technology and computing | |
| books and literature | |
| academic interests | |
| family and relationships | |
| hobbies and interests | |
| travel | |
| television | |
| movies | |
| automotives | |
| real estate | |
| video gaming | |
| sports | |
| home and garden | |
| health | |
| careers | |
| arts and culture | |
| pharmaceuticals, conditions, and symptoms | |
| style and fashion | |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.9277   |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("pilotj/setfit-bert-uncased-fibe")
# Run inference
preds = model("146yearold brand more so with the 103 billion group generating 70 of its revenue from countries outside india this move is part of mistrys plan to make the salttosoftware conglomerate one of the top 25 mostadmired corporate brands globally how i can fit into any role that im asked to take up in a team game that is something ive always strived to achieve this is another challenge im looking forward to he stresses menswear retailer indian terrain which wasnt very aggressive on advertising plans to advertise on tv and increase its ad spends from 8 to 10 this year while panasonic is increasing its ad spends from rs 75 crore last year to rs 85 crore this year to tap the market")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max |
|:-------------|:----|:---------|:----|
| Word count   | 96  | 309.1020 | 500 |

| Label | Training Sample Count |
|:------------------------------------------|:----------------------|
| academic interests | 3 |
| arts and culture | 3 |
| automotives | 3 |
| books and literature | 3 |
| business and finance | 7 |
| careers | 2 |
| family and relationships | 5 |
| food and drinks | 3 |
| health | 3 |
| healthy living | 3 |
| hobbies and interests | 3 |
| home and garden | 3 |
| movies | 5 |
| music and audio | 3 |
| news and politics | 3 |
| personal finance | 7 |
| pets | 3 |
| pharmaceuticals, conditions, and symptoms | 2 |
| real estate | 7 |
| shopping | 3 |
| sports | 3 |
| style and fashion | 3 |
| technology and computing | 6 |
| television | 3 |
| travel | 6 |
| video gaming | 3 |

### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (5, 5)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False

### Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0017 | 1    | 0.0564        | -               |
| 0.0871 | 50   | 0.0843        | -               |
| 0.1742 | 100  | 0.0729        | -               |
| 0.2613 | 150  | 0.0352        | -               |
| 0.3484 | 200  | 0.0242        | -               |
| 0.4355 | 250  | 0.0135        | -               |
| 0.5226 | 300  | 0.006         | -               |
| 0.6098 | 350  | 0.003         | -               |
| 0.6969 | 400  | 0.0013        | -               |
| 0.7840 | 450  | 0.0009        | -               |
| 0.8711 | 500  | 0.0008        | -               |
| 0.9582 | 550  | 0.0006        | -               |
| 1.0453 | 600  | 0.0006        | -               |

### Framework Versions
- Python: 3.10.14
- SetFit: 1.1.0
- Sentence Transformers: 3.1.1
- Transformers: 4.44.2
- PyTorch: 2.4.0
- Datasets: 3.0.0
- Tokenizers: 0.19.1

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```
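The two-phase few-shot recipe described under Model Details can be illustrated with a small, self-contained sketch. This is not SetFit's internal code: `embed()` is a toy stand-in for the fine-tuned Sentence Transformer body, `contrastive_pairs()` only materialises the kind of (text, text, cosine-target) triples that `CosineSimilarityLoss` trains on, and the texts and labels are invented for illustration.

```python
# A minimal sketch of the two-phase SetFit recipe, under the assumptions above.
from itertools import combinations

import numpy as np
from sklearn.linear_model import LogisticRegression


def contrastive_pairs(texts, labels):
    """Phase 1 data: same-label pairs get cosine target 1.0, others 0.0."""
    return [
        (t1, t2, 1.0 if l1 == l2 else 0.0)
        for (t1, l1), (t2, l2) in combinations(list(zip(texts, labels)), 2)
    ]


def embed(texts):
    """Toy stand-in for the embedding body (the real model emits 768-d vectors)."""
    return np.array([[len(t) / 100.0, t.count(" ") / 10.0] for t in texts])


train_texts = [
    "rs 15 crore unsold flats in top cities",
    "turn-based combat missions and hero relationship building at the abbey base",
]
train_labels = ["real estate", "video gaming"]

# Phase 1 would fine-tune the embedding body on these triples with
# CosineSimilarityLoss; here we only generate the pairs.
pairs = contrastive_pairs(train_texts, train_labels)

# Phase 2: fit the LogisticRegression head on features from the embedding body.
head = LogisticRegression().fit(embed(train_texts), train_labels)
preds = head.predict(embed(train_texts))
```

Because the two phases are decoupled, the head stays a plain scikit-learn estimator, which is why the card lists a `LogisticRegression` instance as the classification head.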