# miniDolly
---
license: apache-2.0
---

## Summary

An instruction-following large language model based on pythia-70m and trained on Databricks' ~15k-record instruction dataset, which covers the capability domains from the InstructGPT paper: brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.

This model is an experiment in using a small base model (pythia-70m) to build a model similar to Databricks' Dolly model.
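
Below is a minimal usage sketch, not part of the original card. It assumes the model is published on the Hugging Face Hub as `marouni/miniDolly`, exposes the standard `transformers` causal-LM interface of its pythia-70m base, and responds to a Dolly-style instruction prompt; adjust the repo id and prompt format as needed.

```python
# Hypothetical usage example; repo id and prompt template are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "marouni/miniDolly"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Dolly-style instruction prompt (assumed format for this model).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what a large language model is.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Given the 70m-parameter base, output quality is expected to be limited compared to the full-size Dolly models.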