
BEE-spoke-data/beecoder-220M-python

This is BEE-spoke-data/smol_llama-220M-GQA fine-tuned for Python code generation on:

  • a filtered version of stack-smol-XL
  • a deduplicated version of the 'algebraic stack' subset of proof-pile-2
  • a cleaned and deduplicated pypi dataset

Both this model and the base model were trained with a context length of 2048 tokens.

Examples

An example script for inference testing is available here.
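For reference, a minimal inference sketch using the `transformers` library; the prompt and generation settings below are illustrative assumptions, not taken from the card:

```python
# Minimal inference sketch for beecoder-220M-python.
# Assumes `transformers` and `torch` are installed; settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BEE-spoke-data/beecoder-220M-python"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A docstring-completion style prompt, one of the use cases the card mentions.
prompt = 'def fibonacci(n: int) -> int:\n    """'
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=False,  # greedy decoding for reproducibility
    pad_token_id=tokenizer.eos_token_id,
)
completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(completion)
```

Greedy decoding keeps the output deterministic; for more varied completions, sampling with a low temperature is a common alternative.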

At 220M parameters the model has clear limitations, but it appears decent at single-line or docstring completion, and can also serve as a draft model for speculative decoding in those settings.
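The speculative-decoding use can be sketched with `transformers` assisted generation, where this model acts as the draft (assistant) model. Assisted generation requires the target model's tokenizer to be compatible with the assistant's; to keep this sketch self-contained it uses the base model as the target, though in practice the target would be a larger model:

```python
# Sketch of speculative (assisted) generation with beecoder-220M as the draft.
# The base model stands in as the target here only so the example is
# self-contained; a real target would be a larger, tokenizer-compatible model.
from transformers import AutoModelForCausalLM, AutoTokenizer

assistant = AutoModelForCausalLM.from_pretrained(
    "BEE-spoke-data/beecoder-220M-python"
)
target = AutoModelForCausalLM.from_pretrained(
    "BEE-spoke-data/smol_llama-220M-GQA"
)
tokenizer = AutoTokenizer.from_pretrained("BEE-spoke-data/smol_llama-220M-GQA")

inputs = tokenizer("def quicksort(arr):", return_tensors="pt")
out = target.generate(
    **inputs,
    assistant_model=assistant,  # draft tokens proposed by the small coder model
    max_new_tokens=32,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

With a genuinely larger target model, the draft model proposes several tokens per step and the target verifies them, which can speed up generation without changing the target's output distribution.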

[Screenshot: example generation, run on CPU on a laptop.]


Model size: 218M params (Safetensors)
Tensor type: BF16
