# Model Card for MediNote-7B-v1.0
MediNote is a suite of open-source medical Large Language Models (LLMs) fine-tuned from the Meditron foundation model for clinical note generation. MediNote-7B is a 7-billion-parameter model trained to generate clinical notes from doctor-patient conversations.
## Model Details
- Developed by: Antoine Bonnet and Paul Boulenger
- Model type: Causal decoder-only transformer language model
- Language(s): English only
- Model License: LLAMA 2 COMMUNITY LICENSE AGREEMENT
- Code License: MIT
- Fine-tuned from model: Meditron-7B-v1.0
- Context length: 2K tokens (see the loading sketch after this list)
- Input: Patient-doctor conversation transcripts (text)
- Output: Clinical notes (text)
- Repository: EPFL-IC-Make-Team/ClinicalNotes
- Trainer: epflLLM/Megatron-LLM
- Report: MediNote: Automatic Clinical Notes
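
The details above map directly onto how the model is loaded and prompted. Below is a minimal loading sketch using Hugging Face `transformers`; the checkpoint path is a placeholder rather than a published hub id, and the fp16 choice is only a convenient assumption for a 7B-parameter model. The key constraint to respect is the 2K-token context window listed above.

```python
# Minimal loading sketch: causal decoder-only LM with a 2K-token context window.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/medinote-7b"  # placeholder: replace with the actual checkpoint or hub id

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.float16,  # a 7B-parameter model fits on a single modern GPU in fp16
    device_map="auto",
)

# Input is a patient-doctor transcript; the prompt plus the generated note
# must stay within the 2048-token context window.
transcript = "Doctor: What brings you in today?\nPatient: I've had a headache for three days."
inputs = tokenizer(transcript, return_tensors="pt", truncation=True, max_length=2048)
print(inputs["input_ids"].shape)  # (1, n_tokens) with n_tokens <= 2048
```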
## Uses

### Direct Use
This model can be used to generate clinical notes from patient-doctor conversation transcripts, which is useful for experimentation and for understanding its capabilities. It should not be used directly for production or for work that may impact people.
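
As a concrete illustration, here is a hedged generation sketch built on the standard `transformers` text-generation pipeline. The checkpoint path and the instruction-style prompt wrapper are assumptions for illustration only; the exact prompt template used during fine-tuning is not reproduced in this card.

```python
# Illustrative example: generate a clinical note from a transcript with the transformers pipeline.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="path/to/medinote-7b",  # placeholder: replace with the actual MediNote-7B checkpoint
    device_map="auto",
)

transcript = (
    "Doctor: What brings you in today?\n"
    "Patient: I've had a persistent cough and low-grade fever for about a week.\n"
    "Doctor: Any shortness of breath or chest pain?\n"
    "Patient: Some tightness when I climb stairs, no sharp pain."
)

# Assumed instruction-style wrapper; the fine-tuning prompt format may differ.
prompt = (
    "Summarize the following patient-doctor conversation as a clinical note:\n\n"
    f"{transcript}\n\nClinical note:\n"
)

note = generator(prompt, max_new_tokens=512, do_sample=False, return_full_text=False)
print(note[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) is used here because note generation favors faithfulness to the transcript over diversity; sampling parameters can be adjusted for experimentation.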
### Out-of-Scope Use
This model is not yet robust enough for use in a real clinical setting. We do not recommend using this model for natural language generation in a production environment.