text-embeddings-inference documentation

Using TEI locally with Metal


You can install text-embeddings-inference locally to run it on your own Mac with Metal support. Here are the step-by-step instructions for installation:

Step 1: Install Rust

Install Rust on your machine by running the following command in your terminal, then follow the on-screen instructions:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
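Before moving on, you can confirm that the toolchain installed correctly (this assumes the installer's changes to your PATH have taken effect, e.g. after restarting your shell or sourcing `$HOME/.cargo/env`):

```shell
# Verify that the Rust toolchain is available on PATH
rustc --version
cargo --version
```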

Step 2: Install with Metal support

Run the following from the root of a local clone of the text-embeddings-inference repository (the --path router flag points at the router crate inside the repository):

cargo install --path router -F metal

Step 3: Launch Text Embeddings Inference

Once the installation has completed successfully, you can launch Text Embeddings Inference with Metal support using the following command:

model=BAAI/bge-large-en-v1.5
revision=refs/pr/5

text-embeddings-router --model-id $model --revision $revision --port 8080
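With the router running, you can send a test request to its /embed endpoint to check that everything works (the port matches the --port 8080 flag above):

```shell
# Request an embedding for a single input string
curl 127.0.0.1:8080/embed \
    -X POST \
    -d '{"inputs":"What is Deep Learning?"}' \
    -H 'Content-Type: application/json'
```

The response is a JSON array containing one embedding vector per input.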

Now you are ready to use text-embeddings-inference locally on your machine.
