---
license: mit
language:
- pt
pipeline_tag: fill-mask
tags:
- medialbertina-ptpt
- deberta
- portuguese
- european portuguese
- medical
- clinical
- healthcare
- encoder
widget:
- text: "Febre e tosse são sintomas comuns de [MASK]"
  example_title: "Example 1"
- text: "Diabetes [MASK] tipo II"
  example_title: "Example 2"
- text: "Utente tolera dieta [MASK] / Nivel de glicémia bom."
  example_title: "Example 3"
- text: "Doente com administração de [MASK] com tramal."
  example_title: "Example 4"
- text: "Colocada sonda de gases por apresentar [MASK] timpanizado"
  example_title: "Example 5"
- text: "Conectada em PRVC com necessidade de aumentar [MASK] para 70%"
  example_title: "Example 6"
- text: "Medicado com [MASK] em dias alternados."
  example_title: "Example 7"
- text: "Realizado teste de [MASK] ao paciente"
  example_title: "Example 8"
- text: "Sintomas apontam para COVID [MASK]."
  example_title: "Example 9"
- text: "Durante internamento fez [MASK] fresco congelado 3x dia"
  example_title: "Example 10"
- text: "Pupilas iso [MASK]."
  example_title: "Example 11"
- text: "Cardiopatia [MASK] - causa provável: HAS"
  example_title: "Example 12"
- text: "O paciente encontra-se [MASK] estável."
  example_title: "Example 13"
- text: "Traumatismo [MASK] após acidente de viação."
  example_title: "Example 14"
- text: "Analgesia com morfina em perfusão (15 [MASK]/kg/h)"
  example_title: "Example 15"
---

# MediAlbertina

The first publicly available medical language models trained on real European Portuguese data.

MediAlbertina is a family of DeBERTaV2-based encoders from the BERT family, produced by continuing the pre-training of [PORTULAN's Albertina](https://huggingface.co/PORTULAN) models on Electronic Medical Records shared by Portugal's largest public hospital. Like its predecessors, MediAlbertina models are distributed under the [MIT license](https://huggingface.co/portugueseNLP/medialbertina_pt-pt_900m/blob/main/LICENSE).
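Masked language modeling, the objective used for this continued pre-training, hides a fraction of input tokens and trains the encoder to reconstruct them from context. The sketch below illustrates the standard BERT-style masking scheme on a toy tokenized sentence; the 15% / 80-10-10 ratios are the original BERT defaults, assumed here for illustration only, and `mask_tokens` is a hypothetical helper, not part of this model's training code.

```python
import random

# BERT-style dynamic masking: select ~15% of token positions; of those,
# 80% become [MASK], 10% become a random vocabulary token, and 10% are
# left unchanged. The model is trained to predict the original token at
# every selected position. (Illustrative sketch; MediAlbertina's exact
# pre-training settings may differ.)
def mask_tokens(tokens, vocab, mask_token="[MASK]", mask_prob=0.15, rng=None):
    rng = rng or random.Random()
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # this position is scored by the MLM loss
            roll = rng.random()
            if roll < 0.8:
                masked.append(mask_token)
            elif roll < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)  # position not scored by the MLM loss
            masked.append(tok)
    return masked, labels

tokens = "febre e tosse são sintomas comuns de gripe".split()
masked, labels = mask_tokens(tokens, vocab=tokens, rng=random.Random(1))
print(masked)
```

In practice this corruption is applied on the fly by the training collator, so each epoch sees a different masking of the same EMR sentences.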
# Model Description

MediAlbertina PT-PT 900M was created through domain adaptation of [Albertina PT-PT 900M](https://huggingface.co/PORTULAN/albertina-900m-portuguese-ptpt-encoder) on real European Portuguese EMRs using masked language modeling. It was evaluated by fine-tuning on the Information Extraction (IE) tasks of Named Entity Recognition (NER) and Assertion Status (AStatus), using more than 10k manually annotated entities from the following classes: Diagnosis, Symptom, Vital Sign, Result, Medical Procedure, Medication, Dosage, and Progress.

In both tasks, MediAlbertina outperformed its predecessor, demonstrating the effectiveness of this domain adaptation and its potential for medical AI in Portugal.

| Model | NER single-model | NER multi-models | Assertion Status |
|-------------------------|:----------------:|:----------------:|:----------------:|
| | F1-score | F1-score | F1-score |
| albertina-900m-portuguese-ptpt-encoder | 0.813 | 0.811 | 0.687 |
| **medialbertina_pt-pt_900m** | **0.832** | **0.848** | **0.755** |

## Data

MediAlbertina PT-PT 900M was trained on more than 15M sentences and 300M tokens from 2.6M fully anonymized and unique Electronic Medical Records (EMRs) from Portugal's largest public hospital. The data was acquired under the framework of the [FCT project DSAIPA/AI/0122/2020 AIMHealth-Mobile Applications Based on Artificial Intelligence](https://ciencia.iscte-iul.pt/projects/aplicacoes-moveis-baseadas-em-inteligencia-artificial-para-resposta-de-saude-publica/1567).

## How to use

```python
from transformers import pipeline

unmasker = pipeline('fill-mask', model='portugueseNLP/medialbertina_pt-pt_900m')
unmasker("Analgesia com morfina em perfusão (15 [MASK]/kg/h)")
```

## Citation

MediAlbertina is developed by a joint team from [ISCTE-IUL](https://www.iscte-iul.pt/), Portugal, and [Select Data](https://selectdata.com/), CA, USA.
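As a pointer for interpreting the NER evaluation above: NER systems are usually scored with entity-level F1, which compares predicted and gold entity spans by exact match. The sketch below assumes the common IOB2 tagging scheme and uses illustrative helper names and toy tags; it is not the paper's evaluation code, and the annotation scheme actually used is not stated here.

```python
def iob2_spans(tags):
    """Extract (label, start, end) entity spans from an IOB2 tag sequence."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (
                tag.startswith("I-") and tag[2:] != label):
            if label is not None:
                spans.append((label, start, i))
            start, label = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return spans

def entity_f1(gold, pred):
    """Micro F1 over exact-match entity spans."""
    g, p = set(iob2_spans(gold)), set(iob2_spans(pred))
    if not g or not p:
        return 0.0
    tp = len(g & p)
    precision, recall = tp / len(p), tp / len(g)
    return 2 * precision * recall / (precision + recall) if tp else 0.0

gold = ["B-Diagnosis", "I-Diagnosis", "O", "B-Symptom"]
pred = ["B-Diagnosis", "I-Diagnosis", "O", "O"]
print(entity_f1(gold, pred))  # 1 of 2 gold entities matched: precision 1.0, recall 0.5, F1 = 2/3
```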
For a fully detailed description, see the respective publication:

```latex
@article{medialbertina_ptpt,
  title={MediAlbertina: An European Portuguese medical language model},
  author={Miguel Nunes and João Boné and João Ferreira and Pedro Chaves and Luís Elvas},
  year={2024},
  journal={Computers in Biology and Medicine},
  volume={182},
  url={https://doi.org/10.1016/j.compbiomed.2024.109233}
}
```

Please use the above canonical reference when using or citing this [model](https://www.sciencedirect.com/science/article/pii/S0010482524013180?via%3Dihub).

## Acknowledgements

This work was financially supported by Project Blockchain.PT – Decentralize Portugal with Blockchain Agenda (Project no. 51), WP2, Call no. 02/C05-i01.01/2022, funded by the Portuguese Recovery and Resilience Plan (PRR), the Portuguese Republic, and the European Union (EU) under the framework of the Next Generation EU Program.