---
license: apache-2.0
datasets:
  - lambada
language:
  - en
library_name: transformers
pipeline_tag: text-generation
tags:
  - text-generation-inference
  - causal-lm
  - int8
  - ONNX
  - PostTrainingStatic
  - Intel® Neural Compressor
  - neural-compressor
---

## Model Details

GPT-J 6B is a transformer model trained using Ben Wang's Mesh Transformer JAX. "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.

This INT8 ONNX model is generated by Intel® Neural Compressor using post-training static quantization, and the FP32 model is from this repo.
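
Post-training static quantization calibrates activation ranges on a small sample of data (here, the lambada dataset) before converting weights and activations to INT8. The snippet below is a minimal, hypothetical sketch of that workflow using the Neural Compressor 2.x API; the file paths, calibration data, and dataloader are illustrative assumptions and do not reproduce the exact recipe used to build this model.

```python
import numpy as np
from neural_compressor import PostTrainingQuantConfig, quantization

class CalibDataloader:
    """Tiny calibration dataloader: yields (model_inputs, label) pairs."""
    def __init__(self, samples):
        self.batch_size = 1          # attribute expected by Neural Compressor
        self.samples = samples

    def __iter__(self):
        for ids in self.samples:
            feed = {"input_ids": ids, "attention_mask": np.ones_like(ids)}
            yield feed, 0            # the label is unused during calibration

# Illustrative calibration inputs; the real model was calibrated on lambada text.
samples = [np.random.randint(0, 50400, size=(1, 32), dtype=np.int64)]

config = PostTrainingQuantConfig(approach="static")
q_model = quantization.fit(
    model="gpt-j-6B-fp32.onnx",      # hypothetical path to the FP32 ONNX export
    conf=config,
    calib_dataloader=CalibDataloader(samples),
)
q_model.save("./gpt-j-6B-int8-static")  # writes the quantized ONNX model
```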

## How to use

Download the model and script by cloning the repository:

```shell
git clone https://huggingface.co/Intel/gpt-j-6B-int8-static
```

Then you can run inference with the model using the `evaluation.ipynb` script in the repository.
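
For orientation, here is a minimal sketch of a single forward pass with the INT8 ONNX model via ONNX Runtime. The file name `model.onnx`, the input names `input_ids`/`attention_mask`, and the assumption that logits are the first output are not verified against this repository; `evaluation.ipynb` is the authoritative reference for the exact input signature.

```python
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

# The tokenizer comes from the original FP32 GPT-J release.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
session = ort.InferenceSession("gpt-j-6B-int8-static/model.onnx")  # assumed file name

inputs = tokenizer("Hello, my name is", return_tensors="np")
logits = session.run(
    None,
    {
        "input_ids": inputs["input_ids"].astype(np.int64),
        "attention_mask": inputs["attention_mask"].astype(np.int64),
    },
)[0]

# Greedy pick of the next token from the last position's logits.
next_token = int(np.argmax(logits[0, -1]))
print(tokenizer.decode([next_token]))
```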