---
model-index:
  - name: distilgpt2
    results:
      - task:
          type: text-generation
        dataset:
          name: Wikitext
          type: wikitext
        metrics:
          - type: perplexity (BASELINE)
            value: 52.96194118952879
          - type: perplexity (BASIC)
            value: 56.66175009356436
---

This is a d-Matrix functional reference of the DISTILGPT2 model. The reference provides the following functional configurations:

| Configuration | Explanation |
| --- | --- |
| **BASELINE** | a reference functionally equivalent to the original model |
| **BASIC** | all linear algebraic operands quantized to MXINT8-64, and all other operations transformed to approximated kernel simulations |
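The "MXINT8-64" in the BASIC row refers to a microscaling integer format: blocks of elements share one scale and each element is stored as a signed 8-bit integer (the block size of 64 is inferred from the name). As an illustration only, here is a toy numerical sketch of such blockwise quantization; the function name, the power-of-two scale choice, and the details are assumptions for demonstration, not d-Matrix's actual kernel:

```python
import numpy as np

def toy_mxint8(x, block_size=64):
    """Toy block quantizer: each block of 64 values shares one power-of-two
    scale, elements are rounded to signed 8-bit integers, then dequantized.
    Illustrative sketch only; not the d-Matrix implementation."""
    x = np.asarray(x, dtype=np.float64)
    out = np.zeros_like(x)
    for start in range(0, x.size, block_size):
        chunk = x[start:start + block_size]
        max_abs = np.max(np.abs(chunk))
        if max_abs == 0.0:
            continue  # all-zero block stays zero
        # pick a power-of-two scale so the largest magnitude fits in [-127, 127]
        scale = 2.0 ** np.ceil(np.log2(max_abs / 127.0))
        q = np.clip(np.round(chunk / scale), -127, 127)
        out[start:start + block_size] = q * scale
    return out

weights = np.linspace(-1.0, 1.0, 128)
approx = toy_mxint8(weights)
print(np.max(np.abs(weights - approx)))  # small blockwise rounding error
```

The shared-scale design is what makes such formats cheap in hardware: within a block, only 8-bit integer arithmetic is needed, with the scale applied once per block.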

## Usage

Install d-Matrix Dmx_Compressor first.

```sh
pip install dmx_compressor
```

The following is an example of evaluating the model with the EleutherAI lm-evaluation-harness.

```sh
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```
```python
from dmx.compressor.modeling import DmxModel
import lm_eval

model_args = "pretrained='d-matrix/distilgpt2',trust_remote_code=True"

lm = lm_eval.api.registry.get_model("hf").create_from_arg_string(model_args, {"batch_size": 1})

# Transform the model with DMX
lm._model = DmxModel.from_torch(lm._model).to_basic_model()  # Using BASIC configuration

task = "wikitext"  # assign the desired task here
eval_results = lm_eval.evaluate(lm, lm_eval.tasks.get_task_dict([task]))
```
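`lm_eval.evaluate` returns a nested dictionary whose `"results"` entry maps task names to metric values. The exact metric key names vary across harness versions, so the shape below (including `"word_perplexity,none"`) is an assumption illustrated on a mock result rather than a live run:

```python
# Mock of the nested dict shape lm_eval.evaluate typically returns; the key
# names ("results", "word_perplexity,none") are assumptions about the harness
# version, not guaranteed by this card.
eval_results = {"results": {"wikitext": {"word_perplexity,none": 56.66}}}

task = "wikitext"
ppl = eval_results["results"][task]["word_perplexity,none"]
print(f"{task} word perplexity: {ppl}")
```

With the BASIC configuration on Wikitext, the value read out this way should be in the neighborhood of the perplexity reported in the metadata above.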