---
model-index:
  - name: gemma-2b
    results:
      - task:
          type: text-generation
        dataset:
          name: Wikitext
          type: wikitext
        metrics:
          - type: perplexity (BASELINE)
            value: 42.85221449187819
          - type: perplexity (BASIC)
            value: 207.45720773419006
---

This is a d-Matrix functional reference of the GEMMA-2B model. The reference provides the following functional configurations:

| Configuration | Explanation |
| ------------- | ----------- |
| BASELINE | a reference functionally equivalent to the original model |
| BASIC | all linear algebraic operands quantized to MXINT8-64, and all other operations transformed to approximated kernel simulations |
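
For a quick sanity check outside the evaluation harness, the sketch below (not part of the original card) loads the reference with `transformers`, switches it from the BASELINE to the BASIC configuration, and runs a single forward pass. It assumes the repo ships a tokenizer, and reuses the `d-matrix/gemma-2b` path, `DmxModel.from_torch`, and `to_basic_model()` from the usage example further down.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from dmx.compressor.modeling import DmxModel

# As loaded from the hub, the model behaves as the BASELINE configuration
# (functionally equivalent to the original Gemma-2B).
tokenizer = AutoTokenizer.from_pretrained("d-matrix/gemma-2b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("d-matrix/gemma-2b", trust_remote_code=True)

# Wrapping with DmxModel and calling to_basic_model() applies the BASIC
# configuration: MXINT8-64 linear-algebra operands plus approximated kernels.
model = DmxModel.from_torch(model).to_basic_model()

# Single forward pass as a smoke test; the prompt text is illustrative.
inputs = tokenizer("d-Matrix is", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)
```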

### Usage

Install d-Matrix Dmx_Compressor first.

```sh
pip install dmx_compressor
```
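
As a quick check that the installation succeeded (illustrative, not from the original card), the import used in the evaluation example below should work:

```python
# The DmxModel entry point used in the evaluation example below.
from dmx.compressor.modeling import DmxModel
print(DmxModel)
```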

The following is an example of loading the model and evaluating it with the EleutherAI lm-evaluation-harness.

```sh
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```

```python
from dmx.compressor.modeling import DmxModel
import lm_eval

model_args = "pretrained='d-matrix/gemma-2b',trust_remote_code=True"

# Build an lm-evaluation-harness wrapper around the Hugging Face model
lm = lm_eval.api.registry.get_model("hf").create_from_arg_string(model_args, {"batch_size": 1})

# Transform the model with DMX, using the BASIC configuration
lm._model = DmxModel.from_torch(lm._model).to_basic_model()

# Assign the desired task, e.g. "wikitext"
task = "wikitext"
eval_results = lm_eval.evaluate(lm, lm_eval.tasks.get_task_dict([task]))
```
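
To inspect the scores, the returned dictionary can be printed directly. This is a sketch reusing the `eval_results` and `task` names from the example above; the harness nests per-task metrics (for `wikitext`, perplexity-style metrics) under a `"results"` key.

```python
# Illustrative follow-up (not part of the original card): print the
# per-task metric dictionary produced by lm_eval.evaluate.
print(eval_results["results"][task])
```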