---
license: apache-2.0
---

Mamba Bit!

Mamba with vocab size 2 bites again! This time we bite at TinyStories. I didn't preprocess the stories at all: during training, the model took a random character offset, converted the text to a bit string, and fed it to Mamba. This time I didn't forget about residual connections or about norms, and as a result the model could be trained in BF16.
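For intuition, here is a minimal sketch of that encoding, assuming UTF-8 bytes unpacked MSB-first into 0/1 tokens (the helpers `text_to_bits`/`bits_to_text` are illustrative names, not the repo's actual functions):

```python
import numpy as np

def text_to_bits(text: str) -> np.ndarray:
    """Encode text as a flat 0/1 token sequence: vocab size 2, 8 tokens per byte."""
    data = np.frombuffer(text.encode("utf-8"), dtype=np.uint8)
    return np.unpackbits(data)  # MSB-first by default

def bits_to_text(bits: np.ndarray) -> str:
    """Decode 0/1 tokens back to text, dropping any trailing partial byte."""
    bits = np.asarray(bits, dtype=np.uint8)[: len(bits) // 8 * 8]
    return np.packbits(bits).tobytes().decode("utf-8", errors="replace")

# A training sample can then start at a random character offset, e.g.:
# offset = random.randrange(len(story)); x = text_to_bits(story[offset:offset + chunk])
```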

Training code included.

Example of running the model from the CLI:

```
$ python mambabit.py "Run, kitten, run"

Run, kitten, running and jumping. She saw a big tree and thought it would be fun to share the tree. So, she went to the tree and started to climb the tree. She saw a big tree and thought it would be fun to share the tree. So, she went to the tree and saw a big red ball.
```
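Under the hood, generation at vocab size 2 is plain autoregressive sampling, one bit at a time. Here is a minimal sketch, assuming a model that maps a (1, T) tensor of 0/1 tokens to (1, T, 2) logits; this interface is a hypothetical stand-in, not necessarily what `mambabit.py` exposes:

```python
import numpy as np
import torch

@torch.no_grad()
def generate(model, prompt: str, n_bits: int = 2048) -> str:
    # Hypothetical interface: model(bits) -> logits of shape (1, T, 2).
    bits = torch.tensor(
        np.unpackbits(np.frombuffer(prompt.encode("utf-8"), dtype=np.uint8)),
        dtype=torch.long,
    )[None]                                             # (1, T)
    for _ in range(n_bits):
        logits = model(bits)[:, -1]                     # logits for the next bit
        nxt = torch.multinomial(logits.softmax(-1), 1)  # sample 0 or 1
        bits = torch.cat([bits, nxt], dim=1)
    out = bits[0].to(torch.uint8).numpy()
    return np.packbits(out).tobytes().decode("utf-8", errors="replace")
```

This naive loop re-runs the model on the whole prefix each step; a real Mamba implementation would carry the recurrent state forward instead.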