---
base_model: sentence-transformers/paraphrase-mpnet-base-v2
datasets:
- stanfordnlp/imdb
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: What does the " Executive producer " do in a movie . If I remember correctly it's the person who raised the financial backing to make the movie . You might notice in a great number of movies starring Sean Connery that he is also the executive producer which meant Connery himself raised the money since he is a major player . Unfortunately it should also be pointed out that a great number of movies " starring Sean Connery were solely made because he managed to raise the money since he's a major Hollywood player , it's usually an indication that when the credits read that the executive producer and the star of the movie are one and the same the movie itself is nothing more than a star vehicle with the story/screenplay not being up to scratch

PROTOCOL follows the saga of one Sunny Davis a kooky bimboesque cocktail waitress who saves a visiting dignitary and as a reward gets made a top diplomat . Likely ? As things progress Ms Davis ( Who has problems being able to string two sentences together ) finds herself in more outlandish and less likely situations . When I say that PROTOCOL stars Goldie Hawn who is also the film's executive producer do you understand what I'm saying about the story/screenplay not being up to scratch ? Exactly
- text: I've seen all four of the movies in this series. Each one strays further and further from the books. This is the worst one yet. My problem is that it does not follow the book it is titled after in any way! The directors and producers should have named it any thing other than "Love's Abiding Joy." The only thing about this movie that remotely resembles the book are the names of some of the characters (Willie, Missie, Henry, Clark, Scottie and Cookie). The names/ages/genders of the children are wrong. The entire story line is no where in the book.

I find it a great disservice to Janette Oke, her books and her fans to produce a movie under her title that is not correct in any way. The music is too loud. The actors are not convincing - they lack emotions.

If you want a good family movie, this might do. It is clean. Don't watch it, though, if you are hoping for a condensed version of the book. I hope that this will be the last movie from this series, but I doubt it. If there are more movies made, I wish Michael Landon, Jr and others would stick closer to the original plot and story lines. The books are excellent and, if closely followed, would make excellent movies!
- text: 'THE ZOMBIE CHRONICLES

Aspect ratio: 1.33:1 (Nu-View 3-D)

Sound format: Mono

Whilst searching for a (literal) ghost town in the middle of nowhere, a young reporter (Emmy Smith) picks up a grizzled hitchhiker (Joseph Haggerty) who tells her two stories involving flesh-eating zombies reputed to haunt the area.

An ABSOLUTE waste of time, hobbled from the outset by Haggerty''s painfully amateurish performance in a key role. Worse still, the two stories which make up the bulk of the running time are utterly routine, made worse by indifferent performances and lackluster direction by Brad Sykes, previously responsible for the likes of CAMP BLOOD (1999). This isn''t a ''fun'' movie in the sense that Ed Wood''s movies are ''fun'' (he, at least, believed in what he was doing and was sincere in his efforts, despite a lack of talent); Sykes'' home-made movies are, in fact, aggravating, boring and almost completely devoid of any redeeming virtue, and most viewers will feel justifiably angry and cheated by such unimaginative, badly-conceived junk. The 3-D format is utterly wasted here.'
- text: If only to avoid making this type of film in the future. This film is interesting as an experiment but tells no cogent story.

One might feel virtuous for sitting thru it because it touches on so many IMPORTANT issues but it does so without any discernable motive. The viewer comes away with no new perspectives (unless one comes up with one while one's mind wanders, as it will invariably do during this pointless film).

One might better spend one's time staring out a window at a tree growing.

- text: Sexo Cannibal, or Devil Hunter as it's more commonly known amongst English speaking audiences, starts with actress & model Laura Crawford (Ursula Buchfellner as Ursula Fellner) checking out locations for her new film along with her assistant Jane (Gisela Hahn). After a long days work Laura is relaxing in the bath of her room when two very dubious character's named Chris (Werner Pochath) & Thomas (Antonio Mayans) burst in & kidnap her having been helped by the treacherous Jane. Laura's agent gets on the blower to rent-a-hero Peter Weston (Al Cliver) who is informed of the situation, the kidnappers have Laura on an isolated island & are demanding a 6 million ransom. Peter is told that he will be paid 200,000 to get her back safely & a further 10% of the 6 million if he brings that back as well, faster than a rat up a drain pipe Peter & his Vietnam Vet buddy helicopter pilot Jack are on the island & deciding on how to save Laura. So, the kidnappers have Laura & Peter has the 6 million but neither want to hand them over that much. Just to complicate things further this particular isolated island is home to a primitive tribe (hell, in all the generations they've lived there they've only managed to build one straw hut, now that's primitive) who worship some cannibal monster dude (Burt Altman) with bulging eyes as a God with human sacrifices & this cannibal has a liking for young, white female flesh & intestines...

This Spanish, French & German co-production was co-written & directed by the prolific Jesus Franco who also gets the credit for the music as well. Sexo Cannibal has gained a certain amount of notoriety here in the UK as it was placed on the 'Video Nasties' list in the early 80's under it's alternate Devil Hunter title & therefore officially classed as obscene & banned, having said that I have no idea why as it is one bad film & even Franco, who isn't afraid to be associated with a turkey, decides he wants to hide under the pseudonym of Clifford Brown. I'd imagine even the most die-hard Franco fan would have a hard time defending this thing. The script by Franco, erm sorry I mean Clifford Brown & Julian Esteban as Julius Valery who was obviously another one less than impressed with the finished product & wanted his named removed, is awful. It's as simple & straight forward as that. For a start the film is so boring it's untrue, the kidnap plot is one of the dullest I've ever seen without the slightest bit of tension or excitement involved & the horror side of things don't improve as we get a big black guy with stupid looking over-sized bloodshot eyes plus two tame cannibal scenes. As a horror film Sexo Cannibal fails & as an action adventure it has no more success, this is one to avoid.

Director Franco shows his usual incompetence throughout, a decapitated head is achieved by an actor lying on the ground with large leaves placed around the bottom of his neck to try & give the impression it's not attached to anything! The cannibal scenes are poor, the action is lame & it has endless scenes of people randomly walking around the jungle getting from 'A' to 'B' & not really doing anything when they get there either. It becomes incredibly dull & tedious to watch after about 10 minutes & don't forget this thing goes on for 94 minutes in it's uncut state. I also must mention the hilarious scene when Al Cliver is supposed to be climbing a cliff, this is achieved by Franco turning his camera on it's side & having Cliver crawl along the floor! Just look at the way his coat hangs & the way he never grabs onto to anything as he just pulls himself along! The gore isn't that great & as far as Euro cannibal films go this is very tame, there are some gross close ups of the cannibals mouth as it chews bits of meat, a man is impaled on spikes, there's some blood & a handful of intestines. There's a fair bit of nudity in Sexo Cannibal & an unpleasant rape scene.

Sexo Cannibal must have had a low budget & I mean low. This is a shoddy poorly made film with awful special effects & rock bottom production values. The only decent thing about it is the jungle setting which at least looks authentic. The music sucks & sound effects become annoying as there is lots of heavy breathing whenever the cannibal is on screen. The acting sucks, the whole thing was obviously dubbed anyway but no one in this thing can act.

Sexo Cannibal is a terrible film that commits the fatal mistake of being as boring as hell. The only good things I can say is that it has a certain sleazy atmosphere to it & those close ups of the cannibal chewing meat are pretty gross. Anyone looking for a decent cinematic experience should give Sexo Cannibal as wide a berth as possible, one to avoid.
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: stanfordnlp/imdb
      type: stanfordnlp/imdb
      split: test
    metrics:
    - type: accuracy
      value: 0.8242
      name: Accuracy
---

# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model trained on the [stanfordnlp/imdb](https://huggingface.co/datasets/stanfordnlp/imdb) dataset that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes
- **Training Dataset:** [stanfordnlp/imdb](https://huggingface.co/datasets/stanfordnlp/imdb)

### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| 0     |          |
| 1     |          |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.8242   |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("setfit_model_id")
# Run inference
preds = model("If only to avoid making this type of film in the future. This film is interesting as an experiment but tells no cogent story.

One might feel virtuous for sitting thru it because it touches on so many IMPORTANT issues but it does so without any discernable motive. The viewer comes away with no new perspectives (unless one comes up with one while one's mind wanders, as it will invariably do during this pointless film).

One might better spend one's time staring out a window at a tree growing.

")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max |
|:-------------|:----|:---------|:----|
| Word count   | 48  | 244.4571 | 888 |

| Label | Training Sample Count |
|:------|:----------------------|
| 1     | 7                     |
| 0     | 63                    |

### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True

### Training Results
| Epoch   | Step    | Training Loss | Validation Loss |
|:-------:|:-------:|:-------------:|:---------------:|
| 0.0039  | 1       | 0.2493        | -               |
| 0.1953  | 50      | 0.0016        | -               |
| 0.3906  | 100     | 0.0003        | -               |
| 0.5859  | 150     | 0.003         | -               |
| 0.7812  | 200     | 0.0014        | -               |
| 0.9766  | 250     | 0.0002        | -               |
| **1.0** | **256** | **-**         | **0.4699**      |

* The bold row denotes the saved checkpoint.

### Framework Versions
- Python: 3.8.19
- SetFit: 1.0.3
- Sentence Transformers: 3.0.1
- Transformers: 4.39.0
- PyTorch: 2.4.0
- Datasets: 2.20.0
- Tokenizers: 0.15.2

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```