yjernite committed
Commit b8a5c74
1 Parent(s): bc44730

Update app.py

Files changed (1)
  1. app.py +1 -1
app.py CHANGED
@@ -164,7 +164,7 @@ with gr.Blocks(title="Skin Tone and Gender bias in Text-to-Image Generation Mode
     In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender and skin tone of the generated subjects. Here's how the analysis works:

     1. **Image Generation**: For each prompt, 10 images are generated using the selected model.
-    2. **Gender Detection**: The [BLIP caption generator](https://huggingface.co/Salesforce/blip-image-captioning-large) is used to detect gender by identifying words like "man," "boy," "woman," and "girl" in the captions.
+    2. **Gender Detection**: The [BLIP caption generator](https://huggingface.co/Salesforce/blip-image-captioning-large) is used to elicit gender markers by identifying words like "man," "boy," "woman," and "girl" in the captions.
     3. **Skin Tone Classification**: The [skin-tone-classifier library](https://github.com/ChenglongMa/SkinToneClassifier) is used to extract the skin tones of the generated subjects.

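For reference, the step touched by this commit captions each generated image with BLIP and scans the caption for the listed gender-marker words. Below is a minimal sketch of that approach; the checkpoint is the one linked in the diff, while the helper name and the exact matching logic are illustrative assumptions, not the app's actual code.

```python
# Minimal sketch of BLIP-caption-based gender-marker detection.
# The checkpoint is the one linked in the diff; the helper name and the
# keyword matching below are illustrative assumptions, not app.py's code.
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

CHECKPOINT = "Salesforce/blip-image-captioning-large"
processor = BlipProcessor.from_pretrained(CHECKPOINT)
model = BlipForConditionalGeneration.from_pretrained(CHECKPOINT)

def elicit_gender_markers(image: Image.Image) -> str:
    """Caption the image with BLIP and look for gender-marker words."""
    inputs = processor(image, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=30)
    caption = processor.decode(out[0], skip_special_tokens=True).lower()
    words = set(caption.replace(",", " ").split())
    if words & {"man", "boy"}:
        return "man"
    if words & {"woman", "girl"}:
        return "woman"
    return "no marker found"
```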
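Step 3 relies on the skin-tone-classifier package, which installs a `stone` module. A rough sketch, assuming its documented `process()` helper and the per-face `skin_tone` field it reports; the exact arguments and returned keys should be checked against the library's README for the installed version.

```python
# Rough sketch of skin tone extraction with the skin-tone-classifier package.
# Assumes the `stone.process()` API and the "faces"/"skin_tone" result fields
# described in the library's README; verify against the installed version.
import stone

def dominant_skin_tone(image_path: str) -> str:
    result = stone.process(image_path, image_type="color")
    faces = result.get("faces", [])
    # Each detected face carries a skin-tone estimate; take the first face.
    return faces[0]["skin_tone"] if faces else "unknown"
```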