hossboll committed on
Commit f0a1eb6
1 Parent(s): 5a18ca8

Update README.md

Files changed (1)
  1. README.md +12 -13
README.md CHANGED
@@ -5,6 +5,13 @@ tags:
 model-index:
 - name: clinical-t5
   results: []
+datasets:
+- AGBonnet/augmented-clinical-notes
+language:
+- en
+metrics:
+- rouge
+pipeline_tag: summarization
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -12,21 +19,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 # clinical-t5
 
-This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset.
-
-## Model description
-
-More information needed
+This is a fine-tuned version of Google's T5-small, a checkpoint with 60 million parameters, for clinical note summarization.
+It was fine-tuned on the [augmented-clinical-notes](https://huggingface.co/datasets/AGBonnet/augmented-clinical-notes) dataset, available on the Hugging Face Hub.
 
 ## Intended uses & limitations
 
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+The model was created for learning purposes. Although it was briefly evaluated in
+[this](https://github.com/hossboll/clinical_nlp/blob/main/clinical_t5_finetuned.ipynb) notebook, it should be further refined.
 
 ### Training hyperparameters
 
@@ -44,4 +43,4 @@ The following hyperparameters were used during training:
 - Transformers 4.30.0
 - Pytorch 2.2.1+cu121
 - Datasets 2.19.1
-- Tokenizers 0.13.3
+- Tokenizers 0.13.3
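The updated card tags the model for summarization and lists ROUGE as its evaluation metric. As a rough illustration of what that metric measures, here is a minimal ROUGE-1 F1 sketch over whitespace unigrams; this is not the implementation used in the evaluation notebook (real packages such as `rouge-score` add tokenization rules and stemming), and the function name and example texts are illustrative only:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Minimal ROUGE-1 F1: clipped unigram overlap between candidate and reference.

    Sketch only: lowercases and splits on whitespace, with none of the
    tokenization or stemming that production ROUGE implementations apply.
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # each shared unigram counted at most min(cand, ref) times
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

In practice one would generate summaries with the checkpoint (e.g. via the Transformers summarization pipeline) and score them against reference summaries from the evaluation split of the dataset.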