Xianjun committed
Commit 7ebde84
1 Parent(s): e6ef2d4

Update README.md

Files changed (1): README.md (+8, -3)
README.md CHANGED
@@ -18,7 +18,7 @@ This model is optimized for plant science by continued pretraining on over 1.5 m
 - **Demo [optional]:** [More Information Needed]
 
 ## How to Get Started with the Model
-
+```python
 from transformers import LlamaTokenizer, LlamaForCausalLM
 import torch
 
@@ -30,13 +30,18 @@ batch = tokenizer(instruction, return_tensors="pt", add_special_tokens=False).to
 with torch.no_grad():
     output = model.generate(**batch, max_new_tokens=512, temperature=0.7, do_sample=True)
 response = tokenizer.decode(output[0], skip_special_tokens=True)
+```
 
-## Citation [@inproceedings{Yang2024PLLaMaAO,
+## Citation
+If you find PLLaMa useful in your research, please cite the following paper:
 
+```latex
+@inproceedings{Yang2024PLLaMaAO,
   title={PLLaMa: An Open-source Large Language Model for Plant Science},
   author={Xianjun Yang and Junfeng Gao and Wenxin Xue and Erik Alexandersson},
   year={2024},
   url={https://api.semanticscholar.org/CorpusID:266741610}
-}]
+}
+```
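Assembled from the context lines above, a self-contained version of the updated getting-started snippet might look as follows. The checkpoint id, the `from_pretrained` loading calls, and the example instruction fall outside this diff's context, so they are assumptions here; substitute the PLLaMa checkpoint and prompt you actually use.

```python
# Minimal sketch of the README snippet. Only the imports, the tokenizer call,
# the generate() call, and the decode step appear in this commit; the checkpoint
# id, the loading calls, and the instruction string below are assumptions.
from transformers import LlamaTokenizer, LlamaForCausalLM
import torch

model_id = "Xianjun/PLLaMa-7b-instruct"  # assumed repo id; use your checkpoint

tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

instruction = "Explain how nitrogen fixation works in legumes."  # example prompt
batch = tokenizer(instruction, return_tensors="pt", add_special_tokens=False).to(model.device)

# Sample up to 512 new tokens without tracking gradients.
with torch.no_grad():
    output = model.generate(**batch, max_new_tokens=512, temperature=0.7, do_sample=True)
response = tokenizer.decode(output[0], skip_special_tokens=True)
print(response)
```

Setting `add_special_tokens=False` mirrors the README's tokenizer call, and `do_sample=True` with `temperature=0.7` makes generation stochastic, so responses will vary across runs.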