# Optimum Neuron Containers
We provide pre-built Optimum Neuron containers for Amazon SageMaker. These containers come with all of the Hugging Face libraries and dependencies pre-installed, so you can start using them right away. We offer containers for training and inference, as well as optimized text generation containers based on Text Generation Inference (TGI). The table below is kept up to date and only lists the latest version of each container; you can find older versions in the Deep Learning Container Release Notes.
We recommend using the `sagemaker` Python SDK to retrieve the image URI for the container you want to use. Here is a code snippet that retrieves the latest Text Generation Inference container image URI:
```python
from sagemaker.huggingface import get_huggingface_llm_image_uri

# retrieve the latest TGI Neuron (huggingface-neuronx) image URI
llm_image = get_huggingface_llm_image_uri(
    "huggingface-neuronx"
)

print(f"llm image uri: {llm_image}")
```
## Available Optimum Neuron Containers
| Type | Optimum Version | Image URI |
|---|---|---|
| Training | 0.0.25 | `763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-training-neuronx:2.1.2-transformers4.43.2-neuronx-py310-sdk2.20.0-ubuntu20.04` |
| Inference | 0.0.25 | `763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-inference-neuronx:2.1.2-transformers4.43.2-neuronx-py310-sdk2.20.0-ubuntu20.04` |
| Text Generation Inference | 0.0.25 | `763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-tgi-inference:2.1.2-optimum0.0.25-neuronx-py310-ubuntu22.04` |
If you use these URIs directly, please replace `763104351884` with the correct AWS account ID and `us-west-2` with the AWS region you are working in.
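The training and inference image URIs from the table can likewise be passed directly to the SageMaker SDK. Below is a minimal sketch of launching a training job with the training container; the entry point script, source directory, instance type, and hyperparameters are hypothetical placeholders you would adapt to your own project.

```python
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# training image URI from the table above; adjust account ID and region as needed
training_image = (
    "763104351884.dkr.ecr.us-west-2.amazonaws.com/huggingface-pytorch-training-neuronx:"
    "2.1.2-transformers4.43.2-neuronx-py310-sdk2.20.0-ubuntu20.04"
)

# placeholder entry point and hyperparameters -- adapt to your project
estimator = HuggingFace(
    entry_point="train.py",            # your training script
    source_dir="scripts",              # local directory containing train.py
    instance_type="ml.trn1.2xlarge",   # Trainium instance
    instance_count=1,
    role=role,
    image_uri=training_image,          # Optimum Neuron training container
    hyperparameters={"epochs": 1, "model_name_or_path": "bert-base-uncased"},
)

estimator.fit()
```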