julien-c (HF staff) committed
Commit
68d3374
1 Parent(s): 8494ddd

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/jcblaise/distilbert-tagalog-base-cased/README.md

Files changed (1)
  1. README.md +63 -0
README.md ADDED
---
language: tl
tags:
- distilbert
- bert
- tagalog
- filipino
license: gpl-3.0
inference: false
---

# DistilBERT Tagalog Base Cased
Tagalog version of DistilBERT, distilled from [`bert-tagalog-base-cased`](https://huggingface.co/jcblaise/bert-tagalog-base-cased). This model is part of a larger research project. We open-source the model to allow greater usage within the Filipino NLP community.

## Usage
The model can be loaded and used in both PyTorch and TensorFlow through the Hugging Face Transformers package.

```python
from transformers import TFAutoModel, AutoModel, AutoTokenizer

# TensorFlow
model = TFAutoModel.from_pretrained('jcblaise/distilbert-tagalog-base-cased', from_pt=True)
tokenizer = AutoTokenizer.from_pretrained('jcblaise/distilbert-tagalog-base-cased', do_lower_case=False)

# PyTorch
model = AutoModel.from_pretrained('jcblaise/distilbert-tagalog-base-cased')
tokenizer = AutoTokenizer.from_pretrained('jcblaise/distilbert-tagalog-base-cased', do_lower_case=False)
```
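Once loaded, the tokenizer and model can be used together to obtain contextual embeddings for Tagalog text. A minimal PyTorch sketch (the example sentence is illustrative):

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = 'jcblaise/distilbert-tagalog-base-cased'
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
model = AutoModel.from_pretrained(model_name)

# Tokenize a Tagalog sentence and run it through the encoder
inputs = tokenizer("Magandang araw sa inyong lahat!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size)
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

The per-token embeddings can then be pooled (e.g. mean-pooled or via the first token) to produce sentence-level representations.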
Fine-tuning scripts and other utilities we use for our projects can be found in our centralized repository at https://github.com/jcblaisecruz02/Filipino-Text-Benchmarks
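For downstream classification tasks, a base checkpoint like this one is typically extended with a task head before fine-tuning. A hedged sketch using the standard Transformers sequence-classification head (an illustrative assumption, not the authors' actual fine-tuning setup; their scripts live in the repository above):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = 'jcblaise/distilbert-tagalog-base-cased'
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)

# num_labels is task-specific; 2 here models an illustrative binary task
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# The classification head is randomly initialized and must be fine-tuned on
# labeled data before use (e.g. with the Trainer API or a custom training loop).
print(model.config.num_labels)
```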

## Citations
All model details and training setups can be found in our papers. If you use our model or find it useful in your projects, please cite our work:

```
@inproceedings{localization2020cruz,
    title={{Localization of Fake News Detection via Multitask Transfer Learning}},
    author={Cruz, Jan Christian Blaise and Tan, Julianne Agatha and Cheng, Charibeth},
    booktitle={Proceedings of The 12th Language Resources and Evaluation Conference},
    pages={2589--2597},
    year={2020},
    url={https://www.aclweb.org/anthology/2020.lrec-1.315}
}

@article{cruz2020establishing,
    title={Establishing Baselines for Text Classification in Low-Resource Languages},
    author={Cruz, Jan Christian Blaise and Cheng, Charibeth},
    journal={arXiv preprint arXiv:2005.02068},
    year={2020}
}

@article{cruz2019evaluating,
    title={Evaluating Language Model Finetuning Techniques for Low-resource Languages},
    author={Cruz, Jan Christian Blaise and Cheng, Charibeth},
    journal={arXiv preprint arXiv:1907.00409},
    year={2019}
}
```

## Data and Other Resources
Data used to train this model, as well as other benchmark datasets in Filipino, can be found on my website at https://blaisecruz.com

## Contact
If you have questions, concerns, or if you just want to chat about NLP and low-resource languages in general, you may reach me through my work email at jan_christian_cruz@dlsu.edu.ph