---
license: apache-2.0
---
# Summary

An instruction-following large language model based on [pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) and trained on [Databricks' databricks-dolly-15k instruction dataset](https://huggingface.co/datasets/databricks/databricks-dolly-15k),
which covers the capability domains from the InstructGPT paper: brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.

This model is an experiment in using a small base model ([pythia-70m](https://huggingface.co/EleutherAI/pythia-70m)) to build a model similar to Databricks' [dolly model](https://huggingface.co/databricks/dolly-v2-12b).