ccore commited on
Commit
e791892
1 Parent(s): f86447e

Update README.md

Browse files
Files changed (1) hide show
  1. README.md +52 -33
README.md CHANGED
@@ -1,59 +1,78 @@
1
  ---
2
  license: other
3
- base_model: facebook/opt-1.3B
4
  tags:
5
  - generated_from_trainer
 
 
 
 
6
  metrics:
7
  - accuracy
8
- model-index:
9
- - name: mini3
10
- results: []
 
 
 
 
 
 
 
11
  ---
12
 
13
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
14
  should probably proofread and complete it, then remove this comment. -->
15
 
16
- # mini3
17
 
18
- This model is a fine-tuned version of [facebook/opt-1.3B](https://huggingface.co/facebook/opt-1.3B) on an unknown dataset.
19
- It achieves the following results on the evaluation set:
20
- - Loss: 4.5592
21
- - Accuracy: 0.4112
22
 
23
- ## Model description
24
 
25
- More information needed
26
 
27
- ## Intended uses & limitations
28
 
29
- More information needed
 
30
 
31
- ## Training and evaluation data
32
 
33
- More information needed
34
 
35
- ## Training procedure
 
36
 
37
- ### Training hyperparameters
38
 
39
- The following hyperparameters were used during training:
40
- - learning_rate: 0.0001
41
- - train_batch_size: 1
42
- - eval_batch_size: 8
43
- - seed: 42
44
- - gradient_accumulation_steps: 32
45
- - total_train_batch_size: 32
46
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
47
- - lr_scheduler_type: constant
48
- - num_epochs: 35.0
49
 
50
- ### Training results
51
 
 
 
52
 
 
 
53
 
54
- ### Framework versions
 
55
 
56
- - Transformers 4.34.0.dev0
57
- - Pytorch 2.0.1+cu117
58
- - Datasets 2.14.5
59
- - Tokenizers 0.14.0
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
  ---
2
  license: other
3
+ base_model: facebook/opt-1.3b
4
  tags:
5
  - generated_from_trainer
6
+ - qa
7
+ - open data
8
+ - opt
9
+ - opt-1.3b
10
  metrics:
11
  - accuracy
12
+ widget:
13
+ - text: |-
14
+ # [PAPER]
15
+ Pope John Paul II (Latin: Ioannes Paulus II; Italian: Giovanni Paolo II; Polish: Jan Paweł II; born Karol Józef Wojtyła [ˈkarɔl ˈjuzɛv vɔjˈtɨwa];[b] 18 May 1920 – 2 April 2005) was head of the Catholic Church and sovereign of the Vatican City State from 1978 until his death in 2005. He was later canonised as Pope Saint John Paul II. In his youth, Wojtyła dabbled in stage acting. He graduated with excellent grades from an all-boys high school in Wadowice, Poland, shortly before the start of World War II in 1938. During the war, to avoid being kidnapped and sent off to a German slave labor camp, he signed up for work in harsh conditions in a quarry. Wojtyła eventually took up acting and developed a love for the profession and participated at a local theater. The linguistically skilled Wojtyła wanted to study Polish at university. Encouraged by a conversation with Adam Stefan Sapieha, he decided to study theology and become a priest. Eventually, Wojtyła rose to the position of Archbishop of Kraków and then a cardinal, both positions held by his mentor. Wojtyła was elected pope on the third day of the second papal conclave of 1978 (becoming one of the youngest popes in history), which was called after John Paul I, who had been elected in the first papal conclave of 1978 earlier in August to succeed Pope Paul VI, died after 33 days. Wojtyła adopted the name of his predecessor in tribute to him.[20] John Paul II was the first non-Italian pope since Adrian VI in the 16th century, as well as the third-longest-serving pope in history after Pius IX and St. Peter. John Paul II attempted to improve the Catholic Church's relations with Judaism, Islam, and the Eastern Orthodox Church in the spirit of ecumenism, holding atheism as the greatest threat. He maintained the Church's previous positions on such matters as abortion, artificial contraception, the ordination of women, and a celibate clergy, and although he supported the reforms of the Second Vatican Council, he was seen as generally conservative in their interpretation.[21][22] He put emphasis on family and identity, while questioning consumerism, hedonism and the pursuit of wealth. He was one of the most travelled world leaders in history, visiting 129 countries during his pontificate. As part of his special emphasis on the universal call to holiness, he beatified 1,344,[23] and also canonised 483 people, more than the combined tally of his predecessors during the preceding five centuries. By the time of his death, he had named most of the College of Cardinals, consecrated or co-consecrated many of the world's bishops, and ordained many priests.[24] He has been credited with fighting against dictatorships for democracy and with helping to end Communist rule in his native Poland and the rest of Europe.[25] Under John Paul II, the Catholic Church greatly expanded its influence in Africa and Latin America, and retained its influence in Europe and the rest of the world.
16
+
17
+ ## [UNDERSTANDING]
18
+ This section presents a brief account
19
+ datasets:
20
+ - ccore/open_data_understanding
21
+ pipeline_tag: text-generation
22
  ---
23
 
24
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
25
  should probably proofread and complete it, then remove this comment. -->
26
 
27
+ # OPT_1.3b_open_data_understanding
28
 
29
+ ## Description
 
 
 
30
 
31
+ This model has been trained to understand and respond to any content inserted after the `[PAPER]` tag. It uses advanced language modeling techniques to understand the context, structure, and underlying goals of the input text.
32
 
33
+ ## How to use
34
 
35
+ To interact with this template, place your text after the `[PAPER]` tag. The model will process the text and respond accordingly. For example:
36
 
37
+ [PAPER]
38
+ Your text here...
39
 
 
40
 
41
+ ## Example
42
 
43
+ [PAPER]
44
+ We present a scalable method to build a high-quality instruction-following language model...
45
 
 
46
 
47
+ The model will understand and respond to your text according to its context and content.
 
 
 
 
 
 
 
 
 
48
 
49
+ ## Comprehension Sections
50
 
51
+ ### [UNDERSTANDING]
52
+ This section provides a detailed analysis and decomposition of the inserted text, facilitating the understanding of the content.
53
 
54
+ ### [QUESTIONS AND ANSWERS]
55
+ This section addresses questions and answers that could arise based on the text provided.
56
 
57
+ ### [OBJECTION AND REPLY]
58
+ This section addresses any objections and responses that could arise from analysis of the text.
59
 
60
+ ## Common questions
61
+
62
+ - **What can this model do?**
63
+ - This model can understand and respond to any text placed after the `[PAPER]` tag.
64
+
65
+ - **Is a specific format necessary?**
66
+ - No, the model is quite flexible regarding the text format.
67
+
68
+ - **How does this model perform?**
69
+ - The model outperforms other LLaMa-based models on the Alpaca leaderboard, demonstrating a highly effective alignment.
70
+
71
+ ## Warnings
72
+
73
+ - This model was trained on a diverse corpus, but may still have bias or limitations.
74
+ - Continuous validation of the model and its output is essential.
75
+
76
+ ## Contact and Support
77
+
78
+ For more information, visit [Hugging Face](https://huggingface.co/).