import os
import streamlit as st
import json
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import matplotlib.animation as animation
import time
from PIL import Image
from streamlit_image_comparison import image_comparison
import numpy as np
import re
import cohere
#import chromadb
from textwrap import dedent
import google.generativeai as genai
api_key = os.environ["OPENAI_API_KEY"]
from openai import OpenAI
# Initialize the OpenAI client (used for the text-to-speech conversions in the tabs below)
oai_client = OpenAI(api_key=api_key)
# Assuming chromadb and TruLens are correctly installed and configured
#from chromadb.utils.embedding_functions import
# Google Langchain
from langchain_google_genai import GoogleGenerativeAI
#Crew imports
from crewai import Agent, Task, Crew, Process
# Retrieve API Key from Environment Variable
GOOGLE_AI_STUDIO = os.environ.get('GOOGLE_API_KEY')
# Ensure the API key is available
if not GOOGLE_AI_STUDIO:
    raise ValueError("API key not found. Please set the GOOGLE_API_KEY environment variable.")
# Set gemini_llm
gemini_llm = GoogleGenerativeAI(model="gemini-pro", google_api_key=GOOGLE_AI_STUDIO)
# CrewAI ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Tool import
from crewai.tools.gemini_tools import GeminiSearchTools
def crewai_process_gemini(research_topic):
# Define your agents with roles and goals
GeminiAgent = Agent(
role='Story Writer',
goal='To create a story from bullet points.',
backstory="""You are an expert writer that understands how to make the average extraordinary on paper """,
verbose=True,
allow_delegation=False,
llm = gemini_llm,
tools=[
GeminiSearchTools.gemini_search
]
)
# Create tasks for your agents
task1 = Task(
description=f"""From {research_topic} create your story by writing at least one sentence about each bullet point
and make sure you have a transitional statement between scenes . BE VERBOSE.""",
agent=GeminiAgent
)
# Instantiate your crew with a sequential process
crew = Crew(
agents=[GeminiAgent],
tasks=[task1],
verbose=2,
process=Process.sequential
)
# Get your crew to work!
result = crew.kickoff()
return result
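# Example usage (hypothetical bullet points; kickoff() runs the single agent and task sequentially
# and returns the generated story as text):
#   story = crewai_process_gemini("- Elian arrives in Aya\n- He meets Dr. Lior\n- They tour the institute")
#   st.write(story)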
st.set_page_config(layout="wide")
# Animation Code +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# HIN Number +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from SPARQLWrapper import SPARQLWrapper, JSON
from streamlit_agraph import agraph, TripleStore, Node, Edge, Config
# Function to load JSON data
def load_data(filename):
with open(filename, 'r') as file:
data = json.load(file)
return data
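# load_data is used below to read grid.json, which roughly follows the schema shown in the
# "Create Green Data City" explanation:
#   {"city": ..., "population": ..., "size": {"rows": R, "columns": C},
#    "buildings": [{"coords": [r, c], "type": ..., "size": ..., "color": ..., "sensors": [...]}],
#    "roads": [{"start": [x1, y1], "end": [x2, y2], "color": ..., "sensors": [...]}]}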
# Dictionary for color codes
color_codes = {
"residential": "#ADD8E6",
"commercial": "#90EE90",
"community_facilities": "#FFFF00",
"school": "#FFFF00",
"healthcare_facility": "#FFFF00",
"green_space": "#90EE90",
"utility_infrastructure": "#90EE90",
"emergency_services": "#FF0000",
"cultural_facilities": "#D8BFD8",
"recreational_facilities": "#D8BFD8",
"innovation_center": "#90EE90",
"elderly_care_home": "#FFFF00",
"childcare_centers": "#FFFF00",
"places_of_worship": "#D8BFD8",
"event_spaces": "#D8BFD8",
"guest_housing": "#FFA500",
"pet_care_facilities": "#FFA500",
"public_sanitation_facilities": "#A0A0A0",
"environmental_monitoring_stations": "#90EE90",
"disaster_preparedness_center": "#A0A0A0",
"outdoor_community_spaces": "#90EE90",
# Add other types with their corresponding colors
}
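# Building types missing from color_codes fall back to white (#FFFFFF) via color_codes.get(b_type, '#FFFFFF') in draw_grid.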
#text +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
query = """ ***Introduction***
On his first day at Quantum Data Institute in Green Open Data City Aya, Elian marveled at the city’s harmonious blend of technology and nature.
He was guided to his mentor, Dr. Maya Lior, a pioneer in urban data ecosystems, and their discussion quickly centered on Aya's innovative design.
Dr. Lior explained how data analytics and green technologies were intricately woven into the city's infrastructure, and how they used
a Custom GPT called Green Data City to create the design.
To interact with the Custom GPT Green Data City design tool, click the button below. Additionally, to see how it was built,
toggle the Explanation of Custom GPT "Create Green Data City" button.
"""
query2 = """ ***Global Citizen***
Elian and Dr. Maya Lior journey to the Cultural Center, a beacon of sustainability and technological integration.
Equipped with cutting-edge environmental monitoring sensors, occupancy detectors, and smart lighting systems,
the center is a hub for innovation in resource management and climate action. There, they were greeted by Mohammad,
a dedicated environmental scientist who, despite the language barrier, shared their passion for creating a sustainable future.
Utilizing the Cohere translator, they engaged in a profound dialogue, seamlessly bridging the gap between languages.
Their conversation, rich with ideas and insights on global citizenship and collaborative efforts to tackle climate change
and resource scarcity, underscored the imperative of unity and innovation in facing the challenges of our time.
This meeting, a melting pot of cultures and disciplines, symbolized the global commitment required to sustain our planet.
While using the Cohere translator, Elian wonders how best to use it efficiently. He studies a Custom GPT called
Conversation Analyzer. It translates a small portion of the message you're sending, so you can be confident that the
essence of what you are saying gets through, and it aids in learning the language. Its mantra is "language is not taught but caught."
To try out the Custom GPT Conversation Analyzer, click the button below. Additionally, to see how it was built, toggle the
Explanation of Custom GPT "Conversation Analyzer" button.
"""
query3 = """ ***Incentive Program***
Elian and Mohammad transition from their meeting with Dr. Lior to explore the Innovation Center, a nexus of high-speed internet,
energy monitoring, and smart security. Mohammad showcases a digital map titled "Create a Green Data City," accessible to all for
enhancing sustainability through a citizen-incentivized program. This map allows users to select locations, revealing graphs of
active sensors and collected data, ensuring transparency and promoting an informed, engaged community. This feature not only
cultivates trust but also encourages participation in optimizing the city's sensor network, addressing the exponential challenge
of data management. Through this collaborative venture, the city embodies a sustainable future, marrying technology with collective
action and environmental stewardship in a single, cohesive narrative.
"I use it all the time," Mohammad says. "I even bought my breakfast this morning with the free meal incentive."
"""
# Function to draw the grid with optional highlighting
def draw_grid(data, highlight_coords=None):
fig, ax = plt.subplots(figsize=(12, 12))
nrows, ncols = data['size']['rows'], data['size']['columns']
ax.set_xlim(0, ncols)
ax.set_ylim(0, nrows)
ax.set_xticks(range(ncols+1))
ax.set_yticks(range(nrows+1))
ax.grid(True)
    # Draw roads, falling back to a default grey when a road entry has no color
    road_color = "#606060"  # Light grey; change to "#505050" for dark grey
    for road in data.get('roads', []):  # Roads are optional in the data
        start, end = road['start'], road['end']
        # Determine if the road is vertical or horizontal based on start and end coordinates
        if start[0] == end[0]:  # Vertical road
            for y in range(min(start[1], end[1]), max(start[1], end[1]) + 1):
                ax.add_patch(plt.Rectangle((start[0], nrows-y-1), 1, 1, color=road.get('color', road_color)))
        else:  # Horizontal road
            for x in range(min(start[0], end[0]), max(start[0], end[0]) + 1):
                ax.add_patch(plt.Rectangle((x, nrows-start[1]-1), 1, 1, color=road.get('color', road_color)))
# Draw buildings
for building in data['buildings']:
coords = building['coords']
b_type = building['type']
size = building['size']
color = color_codes.get(b_type, '#FFFFFF') # Default color is white if not specified
if highlight_coords and (coords[0], coords[1]) == tuple(highlight_coords):
highlighted_color = "#FFD700" # Gold for highlighting
ax.add_patch(plt.Rectangle((coords[1], nrows-coords[0]-size), size, size, color=highlighted_color, edgecolor='black', linewidth=2))
else:
ax.add_patch(plt.Rectangle((coords[1], nrows-coords[0]-size), size, size, color=color, edgecolor='black', linewidth=1))
ax.text(coords[1]+0.5*size, nrows-coords[0]-0.5*size, b_type, ha='center', va='center', fontsize=8, color='black')
ax.set_xlabel('Columns')
ax.set_ylabel('Rows')
ax.set_title('Village Layout with Color Coding')
return fig
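# Example (assumes grid.json exists, as used in the Incentive Program tab below):
#   data = load_data('grid.json')
#   fig = draw_grid(data, highlight_coords=data['buildings'][0]['coords'])
#   st.pyplot(fig)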
# Title ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Tabs +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Create the main app with three tabs
tab1, tab2, tab3, tab4 = st.tabs(["Introduction","Global Citizen", "Incentive Program", "Control Room"])
with tab1:
st.header("A day in the Life of Aya Green Data City")
# Creating columns for the layout
col1, col2 = st.columns([1, 2])
# Displaying the image in the left column
with col1:
image = Image.open('./data/intro_image.jpg')
st.image(image, caption='Aya Green Data City')
# Displaying the text above on the right
with col2:
st.markdown(query)
# Displaying the audio player below the text
voice_option = st.selectbox(
'Choose a voice:',
['alloy', 'echo', 'fable', 'onyx', 'nova', 'shimmer'], key='key1'
)
if st.button('Convert to Speech', key='key3'):
if query:
try:
response = oai_client.audio.speech.create(
model="tts-1",
voice=voice_option,
input=query,
)
# Stream or save the response as needed
# For demonstration, let's assume we save then provide a link for downloading
audio_file_path = "output.mp3"
response.stream_to_file(audio_file_path)
# Display audio file to download
st.audio(audio_file_path, format='audio/mp3')
st.success("Conversion successful!")
except Exception as e:
st.error(f"An error occurred: {e}")
else:
st.error("Please enter some text to convert.")
st.header("Custom GPT Engineering Tools")
st.link_button("Custom GPT Green Data City Creation Tool (Population 10,000 to 50,000)", "https://chat.openai.com/g/g-4bPJUaHS8-create-a-green-data-village")
if st.button('Show/Hide Explanation of "Custom GPT Create Green Data City"'):
# Toggle visibility
        st.session_state.show_city_instructions = not st.session_state.get('show_city_instructions', False)
# Check if the instructions should be shown
    if st.session_state.get('show_city_instructions', False):
st.write("""
On clicking "Create Data Village" create a Green Data Village following the 5 Steps below. Output a JSON file similar to the Example by completing the five Steps.
To generate the provided JSON code, I would instruct a custom GPT to create a detailed description of a hypothetical smart city layout, named "Green Smart Village", starting with a population of 10,000 designed to grow to 50,000. This layout should include a grid size of 21x21, a list of buildings and roads, each with specific attributes:
**Step 1:** General Instructions:
Generate a smart city layout for "Green Smart Village" with a 21x21 grid. Include a population of 10,000 designed to grow to 50,000.
**Step 2:** Buildings:
For each building, specify its coordinates on the grid, type (e.g., residential, commercial, healthcare facility), size (in terms of the grid), color, and equipped sensors (e.g., smart meters, water flow sensors).
Types of buildings should vary and include residential, commercial, community facilities, school, healthcare facility, green space, utility infrastructure, emergency services, cultural facilities, recreational facilities, innovation center, elderly care home, childcare centers, places of worship, event spaces, guest housing, pet care facilities, public sanitation facilities, environmental monitoring stations, disaster preparedness center, outdoor community spaces, typical road, and typical road crossing.
**Step 3:** Assign each building unique sensors based on its type, ensuring a mix of technology like smart meters, occupancy sensors, smart lighting systems, and environmental monitoring sensors.
**Step 4:** Roads:
Detail the roads' start and end coordinates, color, and sensors installed.
Ensure roads connect significant areas of the city, providing access to all buildings. Equip roads with sensors for traffic flow, smart streetlights, and pollution monitoring. MAKE SURE ALL BUILDINGS HAVE ACCESS TO A ROAD.
This test scenario would evaluate the model's ability to creatively assemble a smart city plan with diverse infrastructure and technology implementations, reflecting real-world urban planning challenges and the integration of smart technologies for sustainable and efficient city management.
Example:
{
"city": "City Name",
"population": "Population Size",
"size": {
"rows": "Number of Rows",
"columns": "Number of Columns"
},
"buildings": [
{
"coords": ["X", "Y"],
"type": "Building Type",
"size": "Building Size",
"color": "Building Color",
"sensors": ["Sensor Types"]
}
],
"roads": [
{
"start": ["X Start", "Y Start"],
"end": ["X End", "Y End"],
"color": "Road Color",
"sensors": ["Sensor Types"]
}
]
}
**Step 5:** Finally, create a DALL·E image FOR EACH BUILDING in the JSON file depicting what a user will experience there in this green open data city, including sensors. LABEL EACH IMAGE.
""")
with tab2:
st.header("Becoming a Global Citizen")
# Creating columns for the layout
col1, col2 = st.columns([1, 2])
# Displaying the image in the left column
with col1:
image = Image.open('./data/global_image.jpg')
st.image(image, caption='Cultural Center Cohere Translator')
# Displaying the text above on the right
with col2:
st.markdown(query2)
# Displaying the audio player below the text
voice_option2 = st.selectbox(
'Choose a voice:',
['alloy', 'echo', 'fable', 'onyx', 'nova', 'shimmer'],key='key2'
)
if st.button('Convert to Speech', key='key4'):
if query2:
try:
response = oai_client.audio.speech.create(
model="tts-1",
voice=voice_option2,
input=query2,
)
# Stream or save the response as needed
# For demonstration, let's assume we save then provide a link for downloading
audio_file_path = "output.mp3"
response.stream_to_file(audio_file_path)
# Display audio file to download
st.audio(audio_file_path, format='audio/mp3')
st.success("Conversion successful!")
except Exception as e:
st.error(f"An error occurred: {e}")
else:
st.error("Please enter some text to convert.")
cohere_api_key = os.environ.get('COHERE_API_KEY') # Fetch the API key from environment variable
if cohere_api_key is None:
st.error("API key not found. Please set the COHERE_API_KEY environment variable.")
st.stop()
# Get API Key Here - https://dashboard.cohere.com/api-keys
co = cohere.Client(cohere_api_key) # Use the fetched API key
def generate_text(prompt, model='c4ai-aya', max_tokens=300, temperature=0.4):
response = co.generate(
model=model,
prompt=prompt,
max_tokens=max_tokens,
temperature=temperature,
k=0,
stop_sequences=[],
return_likelihoods='NONE')
return response.generations[0].text
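    # Example (assumes COHERE_API_KEY is set and the key has access to the c4ai-aya model):
    #   st.write(generate_text("Translate the following English text to Arabic: Hello, friend. ONLY TRANSLATE DON'T ADD ANY ADDITIONAL DETAILS"))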
# Streamlit interface
st.title("Cohere Translator")
lang_id = {
"Afrikaans": "af",
"Amharic": "am",
"Arabic": "ar",
"Asturian": "ast",
"Azerbaijani": "az",
"Bashkir": "ba",
"Belarusian": "be",
"Bulgarian": "bg",
"Bengali": "bn",
"Breton": "br",
"Bosnian": "bs",
"Catalan": "ca",
"Cebuano": "ceb",
"Czech": "cs",
"Welsh": "cy",
"Danish": "da",
"German": "de",
"Greeek": "el",
"English": "en",
"Spanish": "es",
"Estonian": "et",
"Persian": "fa",
"Fulah": "ff",
"Finnish": "fi",
"French": "fr",
"Western Frisian": "fy",
"Irish": "ga",
"Gaelic": "gd",
"Galician": "gl",
"Gujarati": "gu",
"Hausa": "ha",
"Hebrew": "he",
"Hindi": "hi",
"Croatian": "hr",
"Haitian": "ht",
"Hungarian": "hu",
"Armenian": "hy",
"Indonesian": "id",
"Igbo": "ig",
"Iloko": "ilo",
"Icelandic": "is",
"Italian": "it",
"Japanese": "ja",
"Javanese": "jv",
"Georgian": "ka",
"Kazakh": "kk",
"Central Khmer": "km",
"Kannada": "kn",
"Korean": "ko",
"Luxembourgish": "lb",
"Ganda": "lg",
"Lingala": "ln",
"Lao": "lo",
"Lithuanian": "lt",
"Latvian": "lv",
"Malagasy": "mg",
"Macedonian": "mk",
"Malayalam": "ml",
"Mongolian": "mn",
"Marathi": "mr",
"Malay": "ms",
"Burmese": "my",
"Nepali": "ne",
"Dutch": "nl",
"Norwegian": "no",
"Northern Sotho": "ns",
"Occitan": "oc",
"Oriya": "or",
"Panjabi": "pa",
"Polish": "pl",
"Pushto": "ps",
"Portuguese": "pt",
"Romanian": "ro",
"Russian": "ru",
"Sindhi": "sd",
"Sinhala": "si",
"Slovak": "sk",
"Slovenian": "sl",
"Somali": "so",
"Albanian": "sq",
"Serbian": "sr",
"Swati": "ss",
"Sundanese": "su",
"Swedish": "sv",
"Swahili": "sw",
"Tamil": "ta",
"Thai": "th",
"Tagalog": "tl",
"Tswana": "tn",
"Turkish": "tr",
"Ukrainian": "uk",
"Urdu": "ur",
"Uzbek": "uz",
"Vietnamese": "vi",
"Wolof": "wo",
"Xhosa": "xh",
"Yiddish": "yi",
"Yoruba": "yo",
"Chinese": "zh",
"Zulu": "zu",
}
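    # Note: the translation prompt below uses the selected language *names*; the ISO codes above are kept
    # for reference and for looking up the default English/Arabic selections.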
# Text input
user_input = st.text_area("Enter your text", " Hi Mohammed, it is nice to meet you. Let's discuss how to be better friends and tackle the world's issues such as global warming and climate change together. There are already so many technology solutions in this grand city that can be applied to the world. I am glad to be working on this problem with you.")
"""مرحباً Mohammed، سررت بلقائك. دعونا نتحدث عن كيفية أن نكون أصدقاء أفضل ونقوم بمعالجة قضايا العالم مثل الاحتباس الحراري وتغير المناخ معاً. هناك العديد من الحلول التكنولوجية في هذه المدينة الكبيرة التي يمكن تطبيقها على العالم. أنا سعيد للعمل على هذه المشكلة معك.
"""
# Language selection - for demonstration purposes only
# In a real translation scenario, you'd use actual language codes and a translation model
# source_lang = st.selectbox(label="Source language", options=list(lang_id.keys()))
# target_lang = st.selectbox(label="Target language", options=list(lang_id.keys()))
# Language selection with default values
source_lang = st.selectbox(label="Source language", options=list(lang_id.keys()), index=list(lang_id.values()).index('en')) # Default to English
target_lang = st.selectbox(label="Target language", options=list(lang_id.keys()), index=list(lang_id.values()).index('ar')) # Default to Arabic
# Button to generate text
if st.button("Translate"):
prompt = f"Translate the following {source_lang} text to {target_lang}: " + user_input + " ONLY TRANSLATE DON'T ADD ANY ADDITIONAL DETAILS"
# Generate text
output = generate_text(prompt)
st.text_area("Generated Text", output, height=300)
st.header("Custom GPT Engineering Tools")
st.link_button("Conversation Analyzer", "https://chat.openai.com/g/g-XARuyBgpL-conversation-analyzer")
if st.button('Show/Hide Explanation of "Conversation Analyzer"'):
# Toggle visibility
        st.session_state.show_analyzer_instructions = not st.session_state.get('show_analyzer_instructions', False)
# Check if the instructions should be shown
    if st.session_state.get('show_analyzer_instructions', False):
st.write("""
Upon click "Input Your Conversation" complete the following 8 steps
1. Input Acquisition: Ask the user to input the text they would like analyzed.
2. Key Word Identification: Analyze the text and advise the user on the number of words they would need in order to ensure the purpose of the text is conveyed. This involves processing the text using natural language processing (NLP) techniques to detect words that are crucial to understanding the essence of the conversation. FRIST give the number of words needed and SECOND the words in a bulleted list
3. Ask the user if they would like to use your number of words or reduce to a smaller optimized list designed to convey the most accurate amount of information possible given the reduced set.
4. Ask the user what language they would like to translate the input into.
5. For the newly optimized list of words give the translated words FIRST and the original "language of the input" SECOND . Don't give the definition of the word.
6. Show the translated input and highlight the keywords by bolding them.
7. Give a distinct 100x100 image of each keyword. Try to put them in a single image so they can be cropped out when needed.
8 Allow the user to provide feedback on the analysis and the outputs, allowing for additional or reduction of words.
9. Give the final translation with highlighted words and provide an efficiency score. Number of words chosen versus suggested words x 100
""")
# tab3 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
with tab3:
st.header("City Layout & Sensor Graph")
# Divide the page into three columns
col1, col2 = st.columns(2)
with col1:
data = load_data('grid.json') # Ensure this path is correct
# Dropdown for selecting a building
building_options = [f"{bld['type']} at ({bld['coords'][0]}, {bld['coords'][1]})" for bld in data['buildings']]
selected_building = st.selectbox("Select a building to highlight:", options=building_options)
selected_index = building_options.index(selected_building)
selected_building_coords = data['buildings'][selected_index]['coords']
# Draw the grid with the selected building highlighted
fig = draw_grid(data, highlight_coords=selected_building_coords)
st.pyplot(fig)
# Assuming sensors are defined in your data, display them
sensors = data['buildings'][selected_index].get('sensors', [])
st.write(f"Sensors in selected building: {', '.join(sensors)}")
with col2:
st.header("Sensor Graph")
if sensors: # Check if there are sensors to display
graph_store = TripleStore()
building_name = f"{data['buildings'][selected_index]['type']} ({selected_building_coords[0]}, {selected_building_coords[1]})"
# Iterate through each sensor and create a triple linking it to the building
for sensor in sensors:
sensor_id = f"Sensor: {sensor}" # Label for sensor nodes
# Correctly add the triple without named arguments
graph_store.add_triple(building_name, "has_sensor", sensor_id)
# Configuration for the graph visualization
agraph_config = Config(height=500, width=500, nodeHighlightBehavior=True, highlightColor="#F7A7A6", directed=True, collapsible=True)
# Display the graph
agraph(nodes=graph_store.getNodes(), edges=graph_store.getEdges(), config=agraph_config)
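            # Each (building, "has_sensor", sensor) triple added above is rendered by agraph as two labelled
            # nodes joined by a directed edge.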
st.markdown("<hr/>", unsafe_allow_html=True)
col3, col4 = st.columns(2)
with col3:
st.markdown(query3)
# Displaying the audio player below the text
voice_option3 = st.selectbox(
'Choose a voice:',
['alloy', 'echo', 'fable', 'onyx', 'nova', 'shimmer'],key='key5'
)
if st.button('Convert to Speech', key='key6'):
            if query3:
try:
response = oai_client.audio.speech.create(
model="tts-1",
voice=voice_option3,
input=query3,
)
# Stream or save the response as needed
# For demonstration, let's assume we save then provide a link for downloading
audio_file_path = "output.mp3"
response.stream_to_file(audio_file_path)
# Display audio file to download
st.audio(audio_file_path, format='audio/mp3')
st.success("Conversion successful!")
except Exception as e:
st.error(f"An error occurred: {e}")
else:
st.error("Please enter some text to convert.")
with col4:
image = Image.open('./data/incentive_image.jpg')
        st.image(image, caption='Sensor Incentive Program')
with tab4:
st.header("Control Room")
st.write("Synthetic data should be used to drive control room")
"""
Smart meters
Water flow sensors
Temperature and humidity sensors
Occupancy sensors
HVAC control systems
Smart lighting
Security cameras
Indoor air quality sensors
Smart lighting systems
Energy consumption monitors
Patient monitoring systems
Environmental monitoring sensors
Energy management systems
Soil moisture sensors
Smart irrigation systems
Leak detection sensors
Grid monitoring sensors
GPS tracking for vehicles
Smart building sensors
Dispatch management systems
High-speed internet connectivity
Energy consumption monitoring
Smart security systems
Environmental control systems
Security systems
Smart HVAC systems
Smart locks
Water usage monitoring
Smart inventory management systems
Waste level sensors
Fleet management systems for sanitation vehicles
Air quality sensors
Weather stations
Pollution monitors
Early warning systems
Communication networks
Adaptive lighting systems
Traffic flow sensors
Smart streetlights
Residential Building - Light Blue, with smart meters, water flow sensors, and temperature and humidity sensors.
Commercial Building - Green, equipped with occupancy sensors, smart meters, and HVAC control systems.
Community Facilities - Yellow, featuring smart lighting, security cameras, and occupancy sensors.
School - Yellow, with indoor air quality sensors, smart lighting systems, and energy consumption monitors.
Healthcare Facility - Yellow, having patient monitoring systems, environmental monitoring sensors, and energy management systems.
Green Space - Dark Green, with soil moisture sensors, smart irrigation systems, and environmental monitoring sensors.
Utility Infrastructure - Dark Green, including smart meters, leak detection sensors, and grid monitoring sensors.
Emergency Services - Red, with GPS tracking for vehicles, smart building sensors, and dispatch management systems.
Cultural Facilities - Purple, equipped with environmental monitoring sensors, occupancy sensors, and smart lighting.
Recreational Facilities - Purple, featuring air quality sensors, smart equipment maintenance sensors, and energy management systems.
Innovation Center - Green, with high-speed internet connectivity, energy consumption monitoring, and smart security systems.
Elderly Care Home - Yellow, including patient monitoring sensors, environmental control systems, and security systems.
Childcare Centers - Yellow, with indoor air quality sensors, security cameras, and occupancy sensors.
Places of Worship - Purple, featuring smart lighting, energy consumption monitoring, and security cameras.
Event Spaces - Purple, with smart HVAC systems, occupancy sensors, and smart lighting.
Guest Housing - Orange, including smart locks, energy management systems, and water usage monitoring.
Pet Care Facilities - Orange, equipped with environmental monitoring sensors, security systems, and smart inventory management systems.
Public Sanitation Facilities - Grey, with waste level sensors, fleet management systems for sanitation vehicles, and air quality sensors.
Environmental Monitoring Stations - Dark Green, featuring air quality sensors, weather stations, and pollution monitors.
Disaster Preparedness Center - Grey, including early warning systems, communication networks, and environmental sensors.
Outdoor Community Spaces - Dark Green, with environmental sensors, smart irrigation systems, and adaptive lighting systems.
Typical Road - Dark Grey, equipped with traffic flow sensors, smart streetlights, and pollution monitoring sensors.
Typical Road Crossing - Dark Grey, featuring traffic flow sensors, smart streetlights, and pollution monitoring sensors.
"""