Commit 92e3bd8 by PeterBrendan (parent: 27f3bf2)

Update README.md

Files changed (1): README.md (+3 −3)
README.md CHANGED
@@ -12,7 +12,7 @@ widget:
 
  **Model name:** pbjs_gpt2
 
- **Model description:** This is a fine-tuned version of the GPT-2 model trained on a dataset of 350+ publisher domains' Prebid config files. The model is designed to provide insights into how other publishers configure their Prebid settings. Given a Prebid config setting, like ***bidderTimeout***, it can generate sample Prebid configuration settings based on the collected data. The model aims to help publishers get an idea of how different publishers configure their Prebid settings.
+ **Model description:** This is a fine-tuned version of the GPT-2 model trained on a dataset of 1100+ publisher domains' Prebid config files. The model is designed to provide insights into how other publishers configure their Prebid settings. Given a Prebid config setting, like ***bidderTimeout***, it can generate sample Prebid configuration settings based on the collected data. The model aims to help publishers get an idea of how different publishers configure their Prebid settings.
 
  **Intended uses:** This model is intended to assist publishers in understanding and exploring how other publishers configure their Prebid settings. It can serve as a reference to gain insights into common configurations, best practices, and different approaches used by publishers across various domains.
 
@@ -20,9 +20,9 @@ widget:
 
  **How to use:** To use this model, provide a Prebid config setting, for example ***bidderSequence***. The model will generate a sample Prebid configuration related to that input based on the collected data.
 
- **Training data:** This model was trained on a dataset consisting of 350+ publisher domains' Prebid config files. The dataset was collected from a variety of publishers and represents a wide range of Prebid settings used in the industry.
+ **Training data:** This model was trained on a dataset consisting of 1100+ publisher domains' Prebid config files. The dataset was collected from a variety of publishers and represents a wide range of Prebid settings used in the industry.
 
- **Training procedure:** The model was fine-tuned using the GPT-2 base model with the aforementioned dataset. The training loss was 0.45788808012896803.
+ **Training procedure:** The model was fine-tuned using the GPT-2 base model with the aforementioned dataset. The training loss was 0.43277667846199475.
 
  **Evaluation results:** The evaluation of this model focuses on its ability to generate coherent and valid Prebid configuration settings based on the provided Prebid config setting. Human evaluators reviewed the generated configurations for relevance and accuracy.
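
The "How to use" step described in the README can be sketched with the Hugging Face `transformers` library. This is a minimal sketch, not the author's published snippet: the Hub model id `PeterBrendan/pbjs_gpt2` is an assumption inferred from the committer and model names, so verify it before relying on it.

```python
# Minimal usage sketch for the pbjs_gpt2 model described in the README.
# Assumption: the model is hosted on the Hugging Face Hub under the id
# "PeterBrendan/pbjs_gpt2" (inferred, not confirmed by the source).

def build_prompt(setting: str) -> str:
    # Per the README, the prompt is simply a Prebid config setting name,
    # e.g. "bidderSequence" or "bidderTimeout".
    return setting.strip()

def sample_prebid_config(setting: str,
                         model_id: str = "PeterBrendan/pbjs_gpt2",
                         max_new_tokens: int = 128) -> str:
    # Import inside the function so this module still loads in
    # environments where transformers is not installed.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id)
    result = generator(build_prompt(setting),
                       max_new_tokens=max_new_tokens,
                       do_sample=True)
    # The pipeline returns a list of dicts with a "generated_text" key.
    return result[0]["generated_text"]
```

Calling `sample_prebid_config("bidderSequence")` downloads the model on first use and returns one sampled configuration; per the README, outputs are examples drawn from the collected data, not recommended settings.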