# Model Card for CodeT5Summarization

## Model Details

### Model Description
- Model type: CodeT5+
- Number of parameters: 220M
- Programming Language: Python
- Fine-tuned from model: CodeT5+
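
A minimal usage sketch for summarizing a Python snippet with this model, assuming the standard Hugging Face `transformers` seq2seq API; the checkpoint name below is a placeholder and should be replaced with the actual repository path:

```python
# Sketch: load a CodeT5+-style seq2seq checkpoint and generate a summary for a code snippet.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "your-namespace/CodeT5Summarization"  # placeholder; use the real checkpoint path
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

code = (
    "def add(a, b):\n"
    "    return a + b\n"
)

# Tokenize the source code and generate a natural-language summary.
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```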
### Model Sources
- Repository: GitHub Repo
- Paper: Georgy Andryushchenko, Vladimir V. Ivanov, Vladimir Makharev, Elizaveta Tukhtina, Aidar Valeev. "Leveraging Large Language Models in Code Question Answering: Baselines and Issues."