---
title: README
emoji: π
colorFrom: indigo
colorTo: blue
sdk: static
pinned: false
---
# Foundation Model Stack
Foundation Model Stack (FMS) is a collection of components developed out of IBM Research for the development, inference, training, and tuning of foundation models, leveraging PyTorch-native components.
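As a quick illustration, FMS models are instantiated and used like ordinary PyTorch modules. The following is a minimal sketch, assuming a `get_model` entry point and a `llama`/`7b` architecture/variant pair; see the foundation-model-stack repository for the exact API and supported model names.

```python
import torch
from fms.models import get_model

# Instantiate an architecture/variant known to FMS (names assumed here;
# consult the foundation-model-stack repository for supported values).
model = get_model(architecture="llama", variant="7b", device_type="cuda")
model.eval()

# FMS models are plain torch.nn.Module instances, so standard PyTorch
# inference patterns apply (vocab size and forward signature assumed).
input_ids = torch.randint(0, 32000, (1, 16), device="cuda")
with torch.no_grad():
    logits = model(input_ids)
```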
## Optimizations

In FMS, we aim to bring the latest pre-training, inference, and fine-tuning optimizations to all of our models. These optimizations include, but are not limited to:

- fully compilable models with no graph breaks (see the `torch.compile` sketch after this list)
- full tensor-parallel support for all applicable modules developed in FMS
- training scripts leveraging FSDP (see the wrapping sketch after this list)
- state-of-the-art lightweight speculators for improving inference performance
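Because the models compile without graph breaks, they can be passed to `torch.compile` with `fullgraph=True`, which raises an error if any break is encountered. A minimal sketch, reusing the hypothetical `model` and `input_ids` from the example above:

```python
import torch

# fullgraph=True asserts the model traces as a single graph;
# compilation fails loudly if a graph break occurs.
compiled_model = torch.compile(model, fullgraph=True)

# Subsequent calls run the compiled graph.
with torch.no_grad():
    logits = compiled_model(input_ids)
```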
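Similarly, since FMS models are standard `torch.nn.Module`s, they can be wrapped with PyTorch FSDP for sharded training. A minimal sketch, assuming a launcher such as `torchrun` has set the usual distributed environment variables (the fms-fsdp repository contains the complete training scripts):

```python
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Join the process group set up by the launcher.
dist.init_process_group(backend="nccl")

# Shard parameters, gradients, and optimizer state across ranks;
# use_orig_params=True keeps the original parameter objects, which
# also plays well with torch.compile.
fsdp_model = FSDP(model, use_orig_params=True)
```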
## Usage

FMS is currently deployed in the [Text Generation Inference Server](https://github.com/IBM/text-generation-inference).
## Repositories

- [foundation-model-stack](https://github.com/foundation-model-stack/foundation-model-stack): Main repository on which all FMS models are based
- [fms-extras](https://github.com/foundation-model-stack/fms-extras): New features staged to be integrated into foundation-model-stack
- [fms-fsdp](https://github.com/foundation-model-stack/fms-fsdp): Pre-training examples using FSDP-wrapped foundation models
- [fms-hf-tuning](https://github.com/foundation-model-stack/fms-hf-tuning): Basic tuning scripts for FMS models leveraging SFTTrainer