---
title: README
emoji: πŸš€
colorFrom: indigo
colorTo: blue
sdk: static
pinned: false
---

# Foundation Model Stack

Foundation Model Stack (FMS) is a collection of components, developed out of IBM Research, for the development, inference, training, and tuning of foundation models using PyTorch-native components.
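
As a quick illustration, models are obtained through the stack's model registry. This is a minimal sketch based on the `get_model` API described in the foundation-model-stack repository; the `"llama"`/`"7b"` identifiers are assumptions and may differ across versions.

```python
# A minimal sketch of obtaining a model through FMS. The `get_model` call
# follows the foundation-model-stack README; the "llama"/"7b" identifiers
# are assumptions and may differ across versions.
from fms.models import get_model

# Initializes a LLaMA-architecture model; pass model_path=... to load weights.
model = get_model(architecture="llama", variant="7b")
```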

## Optimizations

In FMS, we aim to bring the latest pre-training, inference, and fine-tuning optimizations to all of our models. These optimizations include, but are not limited to:

- fully compilable models with no graph breaks (see the sketch after this list)
- full tensor-parallel support for all applicable modules developed in FMS
- training scripts leveraging FSDP
- state-of-the-art lightweight speculators for improving inference performance
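
For example, "no graph breaks" can be checked directly with `torch.compile(..., fullgraph=True)`, which raises an error if the forward pass cannot be captured in a single graph. The module below is a stand-in for illustration, not an FMS model:

```python
# Sketch: verifying full-graph compilation with a stand-in module.
# torch.compile(fullgraph=True) errors out if any graph break occurs,
# which is the property FMS models aim to satisfy.
import torch

model = torch.nn.Sequential(          # stand-in module, not an FMS model
    torch.nn.Linear(64, 256),
    torch.nn.GELU(),
    torch.nn.Linear(256, 64),
)
compiled = torch.compile(model, fullgraph=True)
out = compiled(torch.randn(8, 64))    # compiles and runs with no graph breaks
```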

## Usage

FMS is currently deployed in the [Text Generation Inference Server](https://github.com/IBM/text-generation-inference).

## Repositories

- [foundation-model-stack](https://github.com/foundation-model-stack/foundation-model-stack): Main repository on which all FMS models are based
- [fms-extras](https://github.com/foundation-model-stack/fms-extras): New features staged for integration into foundation-model-stack
- [fms-fsdp](https://github.com/foundation-model-stack/fms-fsdp): Pre-training examples using FSDP-wrapped foundation models (a minimal sketch follows this list)
- [fms-hf-tuning](https://github.com/foundation-model-stack/fms-hf-tuning): Basic tuning scripts for FMS models leveraging SFTTrainer
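
As a rough sketch of what "FSDP-wrapped" means in fms-fsdp, the snippet below wraps a stand-in module with PyTorch's `FullyShardedDataParallel`. It assumes launch via `torchrun` on NVIDIA GPUs and is not the actual fms-fsdp training script:

```python
# Sketch: wrapping a stand-in module with FSDP, as fms-fsdp does for
# foundation models. Assumes launch via `torchrun --nproc-per-node=N`
# and NVIDIA GPUs; the model here is a placeholder, not an FMS model.
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

model = torch.nn.Linear(1024, 1024).cuda()
model = FSDP(model)  # shards parameters, gradients, and optimizer state across ranks

# Standard training step; FSDP all-gathers shards as needed in forward/backward.
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(torch.randn(8, 1024, device="cuda")).pow(2).mean()
loss.backward()
opt.step()
```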