HamidRezaAttar committed
Commit 597193d
1 Parent(s): d3eae96

Update README.md

Files changed (1): README.md (+10 -7)
README.md CHANGED
@@ -9,9 +9,9 @@ widget:
   - text: "Our plush and luxurious Emmett modular sofa brings custom comfort to your living space."
 ---
 
-## HomeGPT2
+## GPT2-Home
 
-This model is fine-tuned using GPT-2 on amazon products metadata.
+This model is fine-tuned using GPT-2 on Amazon home-product metadata.
 It can generate descriptions for your **home** products from a text prompt.
 
 ### Model description
@@ -19,10 +19,13 @@ It can generate descriptions for your **home** products from a text prompt
 
 [GPT-2](https://openai.com/blog/better-language-models/) is a large [transformer](https://arxiv.org/abs/1706.03762)-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
 
+### Live Demo
+To try the model with custom generation settings, please visit the [Demo](https://huggingface.co/spaces/HamidRezaAttar/gpt2-home).
+
 ### How to use
-For best experience and clean outputs, please use the notebook mentioned in my [GitHub](https://github.com/HamidRezaAttar/gpt-2-home-product-description-generation)
+For the best experience and clean outputs, use the Live Demo above or the notebook in my [GitHub](https://github.com/HamidRezaAttar/GPT2-Home).
 
-Also, you can use this model directly with a pipeline for text generation.
+You can also use this model directly with a pipeline for text generation.
 ```python
 >>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
 >>> tokenizer = AutoTokenizer.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
@@ -33,12 +36,12 @@ Also, you can use this model directly with a pipeline for text generation
 
 ### Citation info
 ```bibtex
-@misc{HomeGPT2,
+@misc{GPT2-Home,
 author = {HamidReza Fatollah Zadeh Attar},
-title = {HomeGPT2 the English product description generator},
+title = {GPT2-Home: the English home product description generator},
 year = {2021},
 publisher = {GitHub},
 journal = {GitHub repository},
-howpublished = {\url{https://github.com/HamidRezaAttar/gpt-2-home-product-description-generation}},
+howpublished = {\url{https://github.com/HamidRezaAttar/GPT2-Home}},
 }
 ```
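For reference, the usage example in the README is cut off by the hunk boundary above. Here is a minimal sketch of the complete pipeline call, assuming the truncated example follows the standard `transformers` text-generation pattern; only the import and tokenizer lines are visible in the diff, so everything after them is a reconstruction rather than the author's exact code, and the prompt is borrowed from the model card's widget text.

```python
# Minimal sketch completing the truncated "How to use" example.
# Assumption: the README's example follows the standard transformers
# text-generation pipeline pattern; lines after the tokenizer are not
# shown in the diff and are reconstructed here.
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model_id = "HamidRezaAttar/gpt2-product-description-generator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Wrap the model and tokenizer in a text-generation pipeline.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Prompt borrowed from the model card's widget example.
prompt = "Our plush and luxurious Emmett modular sofa"
outputs = generator(prompt, max_length=100, num_return_sequences=1)
print(outputs[0]["generated_text"])
```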