It's the 2nd of December, and here's your Cyber Monday present!
We're cutting prices on Hugging Face Inference Endpoints and Spaces!
Our friends at Google Cloud are treating us to a 40% price cut on GCP NVIDIA A100 GPUs for the next 3 months, and we have further reductions of 20% to 50% across all instances.
If you use Google Kubernetes Engine to host your ML workloads, I think this series of videos is a great way to kickstart your journey of deploying LLMs, in less than 10 minutes! Thank you @wietse-venema-demo!
I'd like to share a bit more about the Deep Learning Containers (DLCs) we built with Google Cloud to transform the way you build AI with open models on the platform!
With pre-configured, optimized environments for PyTorch Training (GPU) and Inference (CPU/GPU), Text Generation Inference (GPU), and Text Embeddings Inference (CPU/GPU), the Hugging Face DLCs offer:
- Optimized performance on Google Cloud's infrastructure, with TGI, TEI, and PyTorch acceleration.
- Hassle-free environment setup: no more dependency issues.
- Seamless updates to the latest stable versions.
- Streamlined workflows, reducing dev and maintenance overhead.
- Robust Google Cloud security features.
- Fine-tuned for optimal performance, integrated with GKE and Vertex AI.
- Community examples for easy experimentation and implementation.
- Coming soon: TPU support for PyTorch Training/Inference and Text Generation Inference!
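To give a feel for how little glue is needed, here's a minimal sketch of a GKE Deployment serving an open model with the TGI DLC. The image URI, project, model ID, and resource values below are illustrative placeholders, not the published paths; check the Hugging Face DLC documentation for the exact image to use:

```yaml
# Illustrative sketch: serving an open model with the Hugging Face TGI DLC on GKE.
# Image URI, model ID, and resources are placeholders, not official values.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tgi-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: tgi-server
  template:
    metadata:
      labels:
        app: tgi-server
    spec:
      containers:
        - name: tgi
          # Placeholder: substitute the actual Hugging Face TGI DLC image path
          image: us-docker.pkg.dev/YOUR_PROJECT/huggingface-tgi:latest
          env:
            - name: MODEL_ID  # assumed env var; TGI also accepts --model-id
              value: "HuggingFaceH4/zephyr-7b-beta"  # example open model
          ports:
            - containerPort: 8080
          resources:
            limits:
              nvidia.com/gpu: 1  # e.g. one A100
```

Apply it with `kubectl apply -f` on a GPU-enabled GKE cluster and expose the port with a Service; the same container can also be registered on Vertex AI instead of managing Kubernetes objects yourself.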