---
widget:
- text: "I believe I will get into UW. I will get into UW."
---

This is an off-the-shelf roberta-large model finetuned on WANLI, the Worker-AI Collaborative NLI dataset ([Liu et al., 2022](https://arxiv.org/abs/2201.05955)). It outperforms the `roberta-large-mnli` model on seven out-of-domain test sets, including by 11% on HANS and 9% on Adversarial NLI.

### How to use

```
import torch
from transformers import RobertaTokenizer, RobertaForSequenceClassification

model = RobertaForSequenceClassification.from_pretrained('alisawuffles/roberta-large-wanli')
tokenizer = RobertaTokenizer.from_pretrained('alisawuffles/roberta-large-wanli')

# Encode the premise/hypothesis pair as a single input
x = tokenizer("I believe I will get into UW.", "I will get into UW.", return_tensors='pt', max_length=128, truncation=True)
logits = model(**x).logits
probs = logits.softmax(dim=1).squeeze(0)
label_id = torch.argmax(probs).item()
prediction = model.config.id2label[label_id]
```

A lighter-weight `pipeline` sketch appears at the end of this card.

### Citation

```
@misc{liu-etal-2022-wanli,
    title = "WANLI: Worker and AI Collaboration for Natural Language Inference Dataset Creation",
    author = "Liu, Alisa and Swayamdipta, Swabha and Smith, Noah A. and Choi, Yejin",
    month = jan,
    year = "2022",
    url = "https://arxiv.org/abs/2201.05955",
}
```
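### Pipeline usage

For quick experiments, the same checkpoint can also be loaded through the `pipeline` API. This is a minimal sketch, not part of the original card, and it assumes a recent `transformers` release whose text-classification pipeline accepts `text`/`text_pair` dictionaries for sentence pairs.

```
from transformers import pipeline

# Sketch: wrap the same checkpoint in a text-classification pipeline.
classifier = pipeline("text-classification", model="alisawuffles/roberta-large-wanli")

# Premise/hypothesis pairs are passed as a dict; this input form
# requires a reasonably recent transformers version.
result = classifier({"text": "I believe I will get into UW.",
                     "text_pair": "I will get into UW."})
print(result)  # top label and its score, e.g. {'label': ..., 'score': ...}
```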