Avijit Ghosh committed
Commit 7e38241
1 Parent(s): ebac435

Reorder models

Files changed (1): app.py (+20 -21)
app.py CHANGED

@@ -144,6 +144,26 @@ def generate_images_plots(prompt, model_name):
 
 with gr.Blocks(title="Skin Tone and Gender bias in Text to Image Models") as demo:
     gr.Markdown("# Skin Tone and Gender bias in Text to Image Models")
+    gr.Markdown('''
+    In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender and skin tone of the generated subjects. Here's how the analysis works:
+
+    1. **Image Generation**: For each prompt, 10 images are generated using the selected model.
+    2. **Gender Detection**: The BLIP caption generator is used to detect gender by identifying words like "man," "boy," "woman," and "girl" in the captions.
+    3. **Skin Tone Classification**: The skin-tone-classifier library is used to extract the skin tones of the generated subjects.
+
+
+    #### Visualization
+
+    We create visual grids to represent the data:
+
+    - **Skin Tone Grids**: Skin tones are plotted as exact hex codes rather than using the Fitzpatrick scale, which can be problematic and limiting for darker skin tones.
+    - **Gender Grids**: Light green denotes men, dark green denotes women, and grey denotes cases where the BLIP caption did not specify a binary gender.
+
+    ---
+
+    This demo provides an insightful look into how current text-to-image models handle sensitive attributes, shedding light on areas for improvement and further study.
+    [Here is an article](https://medium.com/@evijit/analysis-of-ai-generated-images-of-indian-people-for-colorism-and-sexism-b80ff946759f) showing how this space can be used to perform such analyses, using colorism and sexism in India as an example.
+    ''')
     model_dropdown = gr.Dropdown(
         label="Choose a model",
         choices=[
@@ -171,25 +191,4 @@ with gr.Blocks(title="Skin Tone and Gender bias in Text to Image Models") as dem
     genplot = gr.Plot(label="Gender")
     btn.click(generate_images_plots, inputs=[prompt, model_dropdown], outputs=[gallery, skinplot, genplot])
 
-    gr.Markdown('''
-    In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender and skin tone of the generated subjects. Here's how the analysis works:
-
-    1. **Image Generation**: For each prompt, 10 images are generated using the selected model.
-    2. **Gender Detection**: The BLIP caption generator is used to detect gender by identifying words like "man," "boy," "woman," and "girl" in the captions.
-    3. **Skin Tone Classification**: The skin-tone-classifier library is used to extract the skin tones of the generated subjects.
-
-
-    #### Visualization
-
-    We create visual grids to represent the data:
-
-    - **Skin Tone Grids**: Skin tones are plotted as exact hex codes rather than using the Fitzpatrick scale, which can be problematic and limiting for darker skin tones.
-    - **Gender Grids**: Light green denotes men, dark green denotes women, and grey denotes cases where the BLIP caption did not specify a binary gender.
-
-    ---
-
-    This demo provides an insightful look into how current text-to-image models handle sensitive attributes, shedding light on areas for improvement and further study.
-    [Here is an article](https://medium.com/@evijit/analysis-of-ai-generated-images-of-indian-people-for-colorism-and-sexism-b80ff946759f) showing how this space can be used to perform such analyses, using colorism and sexism in India as an example.
-    ''')
-
 demo.launch(debug=True)
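The relocated Markdown block above describes a three-step pipeline: image generation, caption-based gender detection, and skin tone extraction. Since this diff shows app.py only in part, the sketches below (one per step) are minimal reconstructions under stated assumptions, not the Space's actual code.

For step 1, a plausible shape for the generation helper using diffusers; the checkpoint name, `NUM_IMAGES` constant, and `generate_images` function are all illustrative (the real model list lives in the `model_dropdown` choices, and the demo text only says 10 images per prompt):

```python
# Hypothetical sketch of the generation step; generate_images_plots itself
# is only partially visible in the diff, so every name here is illustrative.
import torch
from diffusers import AutoPipelineForText2Image

NUM_IMAGES = 10  # the demo description says 10 images per prompt

# Example checkpoint only; the Space picks models from model_dropdown.
pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16
).to("cuda")

def generate_images(prompt: str, n: int = NUM_IMAGES):
    # A single batched call returns n PIL images for the prompt.
    return pipe(prompt, num_images_per_prompt=n).images
```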
 
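For step 2, the description says gender is inferred by spotting gendered words in BLIP captions. A minimal sketch, assuming the `transformers` BLIP captioning model; the checkpoint, keyword sets, and return labels are assumptions beyond the four words the diff names:

```python
# Sketch of caption-based gender detection (step 2); the diff only names
# "the BLIP caption generator" and the words "man," "boy," "woman," "girl".
import re
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

MALE_WORDS = {"man", "boy"}
FEMALE_WORDS = {"woman", "girl"}

def detect_gender(image: Image.Image) -> str:
    # Caption the image, then look for gendered keywords in the caption.
    inputs = processor(image, return_tensors="pt")
    caption = processor.decode(model.generate(**inputs)[0], skip_special_tokens=True)
    words = set(re.findall(r"[a-z]+", caption.lower()))
    if words & MALE_WORDS:
        return "man"
    if words & FEMALE_WORDS:
        return "woman"
    return "unspecified"  # rendered grey in the gender grid
```

One design wrinkle this sketch sidesteps: a caption containing words from both sets resolves in favor of the male set simply because it is checked first.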
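For step 3, the skin-tone-classifier library (imported as `stone`) reports per-face dominant colors, which fits the "exact hex codes rather than the Fitzpatrick scale" framing. The call and result fields below follow that library's documented examples as best I can reconstruct them, so treat them as assumptions:

```python
# Sketch of skin tone extraction (step 3) with the skin-tone-classifier
# package, imported as `stone`. The field names ("faces", "dominant_colors",
# "color") mirror its documented output and are assumptions here.
import stone

def dominant_skin_hexes(image_path: str) -> list[str]:
    result = stone.process(image_path, image_type="color", return_report_image=False)
    hexes: list[str] = []
    for face in result.get("faces", []):
        # Keep the raw hex codes instead of bucketing into a Fitzpatrick type.
        hexes.extend(c["color"] for c in face.get("dominant_colors", []))
    return hexes
```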
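Finally, both visualizations are grids of colored cells: raw skin tone hexes in one, a three-color gender palette in the other. A minimal matplotlib sketch of a figure that a `gr.Plot` output can display; the layout and the specific gender hex values are illustrative, since the diff names only "light green," "dark green," and "grey":

```python
# Illustrative grid plot backing either the skin tone or the gender view.
import matplotlib.pyplot as plt

# Assumed palette for the gender grid; the Space's exact colors are not shown.
GENDER_COLORS = {"man": "#90EE90", "woman": "#006400", "unspecified": "#808080"}

def plot_color_grid(hex_colors: list[str], cols: int = 5, title: str = ""):
    rows = -(-len(hex_colors) // cols)  # ceiling division
    fig, ax = plt.subplots(figsize=(cols, rows))
    for i, color in enumerate(hex_colors):
        r, c = divmod(i, cols)
        # Draw each value as a 1x1 swatch, first image in the top-left cell.
        ax.add_patch(plt.Rectangle((c, rows - 1 - r), 1, 1, color=color))
    ax.set_xlim(0, cols)
    ax.set_ylim(0, rows)
    ax.set_aspect("equal")
    ax.axis("off")
    ax.set_title(title)
    return fig
```

Wired together, a `generate_images_plots(prompt, model_name)` along the lines of the diff's `btn.click` call would generate the images, label each with `detect_gender` and `dominant_skin_hexes`, and return the gallery plus the two grid figures.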