Fairseq-dense 13B - Nerys

Model Description

Fairseq-dense 13B-Nerys is a finetune of Fairseq's 13B dense model, the dense counterpart to the Mixture-of-Experts (MoE) models released in Artetxe et al. (2021).

Training data

The training data contains around 2500 ebooks in various genres (the "Pike" dataset), a CYOA dataset called "CYS", and 50 Asian "Light Novels" (the "Manga-v1" dataset). Most of the dataset has been prepended with the following text: [Genre: <genre1>, <genre2>]. Because these tags prefix most of the training examples, a prompt that starts with a tag in the same format can nudge generation toward a genre, as sketched below.
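
A minimal sketch of genre-tagged prompting (the genre names and prompt text here are illustrative, not taken from the dataset):

>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/fairseq-dense-13B-Nerys')
>>> # Prepend a tag matching the training format to steer the output
>>> generator("[Genre: horror, mystery]\nThe lighthouse keeper had not been seen in three days.", do_sample=True, min_length=50)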

Revision changes:

  • Removed all headers
  • Removed all whitespace
  • Some reshuffling of data

How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/fairseq-dense-13B-Nerys')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
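
If you need more control over tokenization and decoding, the model can also be loaded through the generic Auto classes. A minimal sketch under assumed defaults (CPU, full precision; a 13B-parameter model needs roughly 26 GB of memory even in half precision, so adjust dtype and device to your hardware):

>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained('KoboldAI/fairseq-dense-13B-Nerys')
>>> model = AutoModelForCausalLM.from_pretrained('KoboldAI/fairseq-dense-13B-Nerys')
>>> inputs = tokenizer("Welcome Captain Janeway, I apologize for the delay.", return_tensors='pt')
>>> outputs = model.generate(**inputs, do_sample=True, min_length=50, max_length=100)
>>> print(tokenizer.decode(outputs[0], skip_special_tokens=True))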

Limitations and Biases

Based on known problems with NLP technology, potentially relevant factors include biases relating to gender, profession, race, and religion.

BibTeX entry and citation info

Artetxe et al. (2021): Efficient Large Scale Language Modeling with Mixtures of Experts (arXiv:2112.10684)

@misc{artetxe2021efficient,
  title={Efficient Large Scale Language Modeling with Mixtures of Experts},
  author={Artetxe, Mikel and Bhosale, Shruti and Goyal, Naman and Mihaylov, Todor and Ott, Myle and Shleifer, Sam and Lin, Xi Victoria and Du, Jingfei and Iyer, Srinivasan and Pasunuru, Ramakanth and Anantharaman, Giri and Li, Xian and Chen, Shuohui and Akin, Halil and Baines, Mandeep and Martin, Louis and Zhou, Xing and Koura, Punit Singh and O'Horo, Brian and Wang, Jeff and Zettlemoyer, Luke and Diab, Mona and Kozareva, Zornitsa and Stoyanov, Ves},
  year={2021},
  eprint={2112.10684},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}