Spaces: Runtime error

Update app.py

app.py CHANGED
@@ -207,7 +207,7 @@ with tab1:
     # Displaying the image in the left column
     with col1:
         image = Image.open('./data/intro_image.jpg')
-        st.image(image, caption='Green
+        st.image(image, caption='Aya Green Data City')

     # Displaying the text above on the right
     with col2:
@@ -313,7 +313,120 @@ with tab1:

 with tab2:

-    st.header("
+    st.header("Becoming a Global Citizen")
+
+    # Creating columns for the layout
+    col1, col2 = st.columns([1, 2])
+
+    # Displaying the image in the left column
+    with col1:
+        image = Image.open('./data/intro_image.jpg')
+        st.image(image, caption='Aya Green Data City')
+
+    # Displaying the text above on the right
+    with col2:
+
+        st.markdown(query)
+
+    # Displaying the audio player below the text
+    voice_option = st.selectbox(
+        'Choose a voice:',
+        ['alloy', 'echo', 'fable', 'onyx', 'nova', 'shimmer']
+    )
+
+
+    if st.button('Convert to Speech'):
+        if query:
+            try:
+                response = oai_client.audio.speech.create(
+                    model="tts-1",
+                    voice=voice_option,
+                    input=query,
+                )
+
+                # Stream or save the response as needed
+                # For demonstration, let's assume we save then provide a link for downloading
+                audio_file_path = "output.mp3"
+                response.stream_to_file(audio_file_path)
+
+                # Display audio file to download
+                st.audio(audio_file_path, format='audio/mp3')
+                st.success("Conversion successful!")
+
+
+            except Exception as e:
+                st.error(f"An error occurred: {e}")
+        else:
+            st.error("Please enter some text to convert.")
+
+
+
+
+
+
+    st.header("Custom GPT Engineering Tools")
+    st.link_button("Custom GPT Green Data City Creation Tool (Population 10,000 to 50,000)", "https://chat.openai.com/g/g-4bPJUaHS8-create-a-green-data-village")
+
+    if st.button('Show/Hide Explanation of "Custom GPT Create Green Data City"'):
+        # Toggle visibility
+        st.session_state.show_instructions = not st.session_state.get('show_instructions', False)
+
+    # Check if the instructions should be shown
+    if st.session_state.get('show_instructions', False):
+        st.write("""
+On clicking "Create Data Village" create a Green Data Village following the 5 Steps below. Output a JSON file similar to the Example by completing the five Steps.
+
+To generate the provided JSON code, I would instruct a custom GPT to create a detailed description of a hypothetical smart city layout, named "Green Smart Village", starting with a population of 10,000 designed to grow to 50,000. This layout should include a grid size of 21x21, a list of buildings and roads, each with specific attributes:
+
+**Step 1:** General Instructions:
+Generate a smart city layout for "Green Smart Village" with a 21x21 grid. Include a population of 10,000 designed to grow to 50,000.
+
+**Step 2:** Buildings:
+For each building, specify its coordinates on the grid, type (e.g., residential, commercial, healthcare facility), size (in terms of the grid), color, and equipped sensors (e.g., smart meters, water flow sensors).
+Types of buildings should vary and include residential, commercial, community facilities, school, healthcare facility, green space, utility infrastructure, emergency services, cultural facilities, recreational facilities, innovation center, elderly care home, childcare centers, places of worship, event spaces, guest housing, pet care facilities, public sanitation facilities, environmental monitoring stations, disaster preparedness center, outdoor community spaces, typical road, and typical road crossing.
+
+**Step 3:** Assign each building unique sensors based on its type, ensuring a mix of technology like smart meters, occupancy sensors, smart lighting systems, and environmental monitoring sensors.
+
+**Step 4:** Roads:
+Detail the roads' start and end coordinates, color, and sensors installed.
+Ensure roads connect significant areas of the city, providing access to all buildings. Equip roads with sensors for traffic flow, smart streetlights, and pollution monitoring. MAKE SURE ALL BUILDINGS HAVE ACCESS TO A ROAD.
+
+This test scenario would evaluate the model's ability to creatively assemble a smart city plan with diverse infrastructure and technology implementations, reflecting real-world urban planning challenges and the integration of smart technologies for sustainable and efficient city management.
+
+Example:
+{
+  "city": "City Name",
+  "population": "Population Size",
+  "size": {
+  "rows": "Number of Rows",
+  "columns": "Number of Columns"
+  },
+  "buildings": [
+  {
+  "coords": ["X", "Y"],
+  "type": "Building Type",
+  "size": "Building Size",
+  "color": "Building Color",
+  "sensors": ["Sensor Types"]
+  }
+  ],
+  "roads": [
+  {
+  "start": ["X Start", "Y Start"],
+  "end": ["X End", "Y End"],
+  "color": "Road Color",
+  "sensors": ["Sensor Types"]
+  }
+  ]
+}
+
+**Step 5:** Finally create a Dalle image FOR EACH BUILDING in the JSON file depicting what a user will experience there in this green open data city including sensors. LABEL EACH IMAGE.
+
+
+        """)
+
+
+

 with tab3:

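The Show/Hide button in the added tab2 code uses a common Streamlit idiom: flip a boolean flag in `st.session_state` on each button click, then render conditionally on the flag. A minimal sketch of that toggle logic, with a plain dict standing in for `st.session_state` (the `toggle` helper and `session_state` dict here are illustrative, not part of the app):

```python
# Sketch of the show/hide idiom from the added tab2 code.
# A plain dict stands in for st.session_state, which behaves like a
# mapping that persists across Streamlit script reruns.
session_state = {}

def toggle(state: dict, key: str = "show_instructions") -> bool:
    # Equivalent of:
    #   st.session_state.show_instructions = \
    #       not st.session_state.get('show_instructions', False)
    state[key] = not state.get(key, False)
    return state[key]

toggle(session_state)  # first click: flag goes False -> True
toggle(session_state)  # second click: flag goes True -> False
```

Because each button press triggers a full script rerun, the flag must live in `st.session_state` (not a local variable) to survive between clicks.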
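The prompt embedded in the `st.write` block asks the custom GPT to emit JSON with a fixed set of keys. A minimal sketch of a validator for that shape, which could be useful when consuming the GPT's output; the key names come from the prompt's Example, while `check_layout` and the sample `layout` dict are hypothetical:

```python
# Sketch: check that a generated city layout matches the JSON shape
# requested by the custom-GPT prompt (keys taken from its Example).
REQUIRED_TOP = {"city", "population", "size", "buildings", "roads"}

def check_layout(data: dict) -> bool:
    if not REQUIRED_TOP <= data.keys():
        return False
    for b in data["buildings"]:
        if not {"coords", "type", "size", "color", "sensors"} <= b.keys():
            return False
    for r in data["roads"]:
        if not {"start", "end", "color", "sensors"} <= r.keys():
            return False
    return True

# Hypothetical minimal layout following the prompt's Example schema.
layout = {
    "city": "Green Smart Village",
    "population": "10000",
    "size": {"rows": 21, "columns": 21},
    "buildings": [{"coords": [3, 4], "type": "residential",
                   "size": "2x2", "color": "green",
                   "sensors": ["smart meter"]}],
    "roads": [{"start": [0, 4], "end": [20, 4], "color": "grey",
               "sensors": ["traffic flow"]}],
}
print(check_layout(layout))  # prints True
```

A check like this catches the most common failure mode of schema-by-example prompting: the model dropping or renaming a required key.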