# MedPaxTral-2x7b
---
license: apache-2.0
language:
  - en
library_name: transformers
pipeline_tag: text-generation
tags:
  - medical
---

A medical Mixture-of-Experts (MoE) model built by merging three leading models in the medical domain: BioMistral, Meditron, and MedAlpaca. The merge was performed with the MergeKit library, a tool for combining the strengths of multiple models into a single LLM.
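MergeKit builds MoE models from a declarative YAML configuration that names a base model and the expert models, with routing prompts per expert. The sketch below illustrates the general shape of a `mergekit-moe` config; the expert repo IDs, gate mode, and prompts are illustrative assumptions, not the exact recipe used for this model:

```yaml
# Illustrative mergekit-moe sketch -- NOT the exact config behind MedPaxTral-2x7b.
base_model: BioMistral/BioMistral-7B
gate_mode: hidden            # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts:
      - "clinical reasoning and medical guidelines"
  - source_model: medalpaca/medalpaca-7b
    positive_prompts:
      - "medical question answering"
```

A config like this would be turned into a merged checkpoint with the `mergekit-moe` command-line tool, e.g. `mergekit-moe config.yml ./merged-model`.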