MatthewMec committed on
Commit 1decd71 · 1 Parent(s): 84e41fe
Files changed (7)
  1. .gitignore +129 -0
  2. Dockerfile +6 -0
  3. README.md +17 -13
  4. challenge.md +98 -0
  5. docker-compose.yaml +12 -0
  6. requirements.txt +8 -0
  7. streamlit_app.py +241 -0
.gitignore ADDED
@@ -0,0 +1,129 @@
+ # Byte-compiled / optimized / DLL files
+ __pycache__/
+ *.py[cod]
+ *$py.class
+
+ # C extensions
+ *.so
+
+ # Distribution / packaging
+ .Python
+ build/
+ develop-eggs/
+ dist/
+ downloads/
+ eggs/
+ .eggs/
+ lib/
+ lib64/
+ parts/
+ sdist/
+ var/
+ wheels/
+ pip-wheel-metadata/
+ share/python-wheels/
+ *.egg-info/
+ .installed.cfg
+ *.egg
+ MANIFEST
+
+ # PyInstaller
+ # Usually these files are written by a python script from a template
+ # before PyInstaller builds the exe, so as to inject date/other infos into it.
+ *.manifest
+ *.spec
+
+ # Installer logs
+ pip-log.txt
+ pip-delete-this-directory.txt
+
+ # Unit test / coverage reports
+ htmlcov/
+ .tox/
+ .nox/
+ .coverage
+ .coverage.*
+ .cache
+ nosetests.xml
+ coverage.xml
+ *.cover
+ *.py,cover
+ .hypothesis/
+ .pytest_cache/
+
+ # Translations
+ *.mo
+ *.pot
+
+ # Django stuff:
+ *.log
+ local_settings.py
+ db.sqlite3
+ db.sqlite3-journal
+
+ # Flask stuff:
+ instance/
+ .webassets-cache
+
+ # Scrapy stuff:
+ .scrapy
+
+ # Sphinx documentation
+ docs/_build/
+
+ # PyBuilder
+ target/
+
+ # Jupyter Notebook
+ .ipynb_checkpoints
+
+ # IPython
+ profile_default/
+ ipython_config.py
+
+ # pyenv
+ .python-version
+
+ # pipenv
+ # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
+ # However, in case of collaboration, if having platform-specific dependencies or dependencies
+ # having no cross-platform support, pipenv may install dependencies that don't work, or not
+ # install all needed dependencies.
+ #Pipfile.lock
+
+ # PEP 582; used by e.g. github.com/David-OConnor/pyflow
+ __pypackages__/
+
+ # Celery stuff
+ celerybeat-schedule
+ celerybeat.pid
+
+ # SageMath parsed files
+ *.sage.py
+
+ # Environments
+ .env
+ .venv
+ env/
+ venv/
+ ENV/
+ env.bak/
+ venv.bak/
+
+ # Spyder project settings
+ .spyderproject
+ .spyproject
+
+ # Rope project settings
+ .ropeproject
+
+ # mkdocs documentation
+ /site
+
+ # mypy
+ .mypy_cache/
+ .dmypy.json
+ dmypy.json
+
+ # Pyre type checker
+ .pyre/
Dockerfile ADDED
@@ -0,0 +1,6 @@
+ FROM python:3.9-slim
+ EXPOSE 8501
+ WORKDIR /app
+ COPY . .
+ RUN pip3 install -r requirements.txt
+ ENTRYPOINT ["streamlit", "run", "streamlit_app.py", "--server.port=8501", "--server.address=0.0.0.0"]
README.md CHANGED
@@ -1,13 +1,17 @@
- ---
- title: Weather App
- emoji: 👀
- colorFrom: gray
- colorTo: yellow
- sdk: streamlit
- sdk_version: 1.36.0
- app_file: app.py
- pinned: false
- license: apache-2.0
- ---
-
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # Challenge General Information
+
+ You can read the details of the challenge at [challenge.md](challenge.md).
+
+ ## Key Items
+
+ - __Due Date:__ 7/23/2024
+ - __Work Rules:__ You cannot work with others. You can ask any question you want in our general channel. The teacher and TA are the only ones who can answer questions. __You cannot use code from other students' apps.__
+ - __Product:__ A Streamlit app that runs within Docker, builds from your repo, and is published on Hugging Face in our DS460 org.
+ - __GitHub Process:__ Each student will fork the challenge repository and create their app. They will submit a link to the Hugging Face app in Canvas.
+ - __Canvas Process:__ Each student will upload a `.pdf` or `.html` file with their results as described in [challenge.md](challenge.md).
+
+ ## Notes & References
+
+ - [Fork a repo](https://docs.github.com/en/get-started/quickstart/fork-a-repo)
+ - [Creating a pull request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request)
challenge.md ADDED
@@ -0,0 +1,98 @@
+ # API ↦ DATA ↦ DASHBOARD
+
+ We use APIs to digest streaming data, and this challenge will require you to figure out an API and then build the data from that API into your dashboard. Our current challenge is the [Open-Meteo API](https://open-meteo.com/). We learned about [Streamlit](https://streamlit.io/) and [Docker](https://www.docker.com/) for application development. We will build an app that uses the API to display weather data for decision-making.
+
+ __NOTE!__
+
+ They provide Python code examples with the API. You will need to tweak yours to use Polars and not to use caching. See these two chunks for examples to prompt your coding (note that the second one is not a complete code snippet).
+
+ ```python
+ openmeteo = openmeteo_requests.Client()
+ ```
+
+ ```python
+ start = datetime.fromtimestamp(hourly.Time(), timezone.utc)
+ end = datetime.fromtimestamp(hourly.TimeEnd(), timezone.utc)
+ freq = timedelta(seconds = hourly.Interval())
+
+ df = pl.select(
+     date = pl.datetime_range(start, end, freq, closed = "left"),
+     temperature_2m = hourly_temperature_2m)
+ ```
+
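+ As one way to see how the second chunk fits together, here is a minimal runnable sketch of the Polars portion. The epoch timestamp and temperature values are hypothetical stand-ins for what `hourly.Time()`, `hourly.TimeEnd()`, `hourly.Interval()`, and `hourly.Variables(0).ValuesAsNumpy()` would return from a real API response:
+
+ ```python
+ from datetime import datetime, timedelta, timezone
+
+ import numpy as np
+ import polars as pl
+
+ # Hypothetical stand-ins for the Open-Meteo response accessors
+ start = datetime.fromtimestamp(1720915200, timezone.utc)            # hourly.Time()
+ end = datetime.fromtimestamp(1720915200 + 4 * 3600, timezone.utc)   # hourly.TimeEnd()
+ freq = timedelta(seconds=3600)                                      # hourly.Interval()
+ hourly_temperature_2m = np.array([18.0, 19.5, 21.0, 20.2])          # ValuesAsNumpy()
+
+ # closed="left" drops the end timestamp, so the row count matches the values array
+ df = pl.select(
+     date=pl.datetime_range(start, end, freq, closed="left"),
+     temperature_2m=hourly_temperature_2m,
+ )
+ print(df.shape)
+ ```
+
+ The same pattern extends to any other hourly variable you request from the API.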
+ ## Coding Challenge
+
+ ### Driving needs
+
+ _Each of the items below must be addressed by your app._
+
+ 1. Allow the user to pull 15 days of [historical forecast data](https://open-meteo.com/en/docs/historical-forecast-api#start_date=2024-07-02) and [historical data](https://open-meteo.com/en/docs/historical-weather-api) for the three BYU locations (Idaho, Hawaii, Provo) at an hourly resolution.
+     - Make sure that the date and time are understandable by the user.
+     - Use display units for the United States of America.
+ 2. Display summary tables of the forecasts and actual weather for all three cities that let the user compare:
+     - The three cities to each other.
+     - The forecast to the actual weather at each city.
+ 3. Provide multiple visualizations to facilitate these comparisons.
+     - Daily highs over the month for each location.
+     - A visualization that uses boxplots to show each city's varied hourly temperature readings.
+     - A novel visualization of your creation (no bar charts or pie charts).
+ 4. Build KPIs into your dashboard.
+     - Provide KPIs that show the highest value for the selected variable in that period and the respective city.
+     - Provide KPIs that show the lowest value for the selected variable in that period and the respective city.
+ 5. Include the following user elements in your dashboard.
+     - Allow the user to pick the 15 days they want to compare within the limits of the API.
+     - Allow the user to pick the weather variables of interest from at least ten different options of the API.
+     - Ask the user for their time zone choice (Hawaii or Mountain). Display all data in that time zone.
+     - Add two additional user inputs of your choice (for example: changing the graph, additional API inputs, or different summaries for your table).
+
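+ The unit-conversion and KPI items above can be sketched in Polars. This is one possible shape, not the required implementation; the column names and sample readings are hypothetical:
+
+ ```python
+ import polars as pl
+
+ # Hypothetical hourly readings in °C for two cities
+ df = pl.DataFrame({
+     "location": ["Provo, Utah", "Provo, Utah", "Laie, Hawaii", "Laie, Hawaii"],
+     "temperature_2m": [30.0, 35.0, 25.0, 27.0],
+ })
+
+ # US display units: convert Celsius to Fahrenheit
+ df = df.with_columns((pl.col("temperature_2m") * 9 / 5 + 32).alias("temp_f"))
+
+ # KPI inputs: highest and lowest reading per city over the selected period
+ kpis = df.group_by("location").agg(
+     pl.col("temp_f").max().alias("high"),
+     pl.col("temp_f").min().alias("low"),
+ )
+ print(kpis.sort("location"))
+ ```
+
+ The `high` and `low` columns are what a `st.metric` KPI would display for the user's selected city and variable.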
+ ### Data Science Dashboard
+
+ We will use Streamlit as our prototype dashboard tool, but we need to embed that Streamlit app into a Docker container.
+
+ Within this repository you can simply run `docker compose up` to leverage the `docker-compose.yaml`, with your local folder synced to the container folder where the Streamlit app is running.
+
+ Additionally, you can use `docker build -t streamlit .` to build the image from the `Dockerfile` and then use `docker run -p 8501:8501 -v "$(pwd):/app:rw" streamlit` to start the container with the appropriate port and volume settings.
+
+ ### Repo Structure
+
+ Your repo should be built so that I can clone it and run the Docker command (`docker compose up`) as described in your `readme.md`, allowing me to see your app in my web browser without installing Streamlit on my computer.
+
+ 1. Fork this repo to your private space.
+ 2. Add me to your private repo in your space (`hathawayj`).
+ 3. Build your app and Docker container.
+ 4. Update your `readme.md` with details about your app and how to start it.
+ 5. Include a screenshot of your working app in your repository.
+ 6. Build your app in Hugging Face using the [ds460 template](https://huggingface.co/spaces/ds460/_template_streamlit_docker).
+     - Place the app within our ds460 org.
+     - Change the title of the app to your name and change the color.
+ 7. Submit the link to your repo and Hugging Face app to me in Canvas within your vocabulary/lingo challenge.
+
+ ## Vocabulary/Lingo Challenge
+
+ _Within a `.md` file in your repository and as a submitted `.pdf` or `.html` on Canvas, address the following items:_
+
+ 1. A link to your repo that you have shared with me and a screenshot of your app.
+ 2. Explain the added value of using Databricks in your data science process (using text, diagrams, or tables).
+ 3. Compare and contrast PySpark to either Pandas or the Tidyverse (using text, diagrams, or tables).
+ 4. Explain Docker to somebody intelligent but not a tech person (using text, diagrams, or tables).
+
+ Your answers should be clear, detailed, and no longer than is needed. _Imagine you are responding to a client or as an interview candidate._
+
+ - _Clear:_ Clean sentences in a nicely laid out format.
+ - _Detailed:_ You touch on all the critical points of the concept. Speak at a reasonable level.
+ - _Brevity:_ Don't ramble. Get to the point, and don't repeat yourself.
+
+ ## Submission
+
+ Submit your work on Canvas. It can be a link to your repository or a PDF/HTML file with your vocabulary challenge and a link to your repo included.
+
+ ## References
+
+ - [Streamlit Dashboard](https://streamlit.io/)
+ - [Docker](https://www.docker.com/)
+ - [Dockerfile cheat sheet](https://kapeli.com/cheat_sheets/Dockerfile.docset/Contents/Resources/Documents/index)
+ - [Streamlit deploy in Docker](https://docs.streamlit.io/knowledge-base/tutorials/deploy/docker)
+ - [Streamlit and Docker](https://maelfabien.github.io/project/Streamlit/#)
+ - [open-meteo/python-requests: Open-Meteo Python Library using `requests`](https://github.com/open-meteo/python-requests?tab=readme-ov-file#polars)
+ - [openmeteo-requests · PyPI](https://pypi.org/project/openmeteo-requests/)
docker-compose.yaml ADDED
@@ -0,0 +1,12 @@
+ services:
+   streamlit:
+     build:
+       dockerfile: Dockerfile
+       context: .
+     container_name: streamlit-app
+     cpus: 0.5
+     mem_limit: 1024m
+     ports:
+       - "8501:8501"
+     volumes:
+       - ".:/app:rw"
requirements.txt ADDED
@@ -0,0 +1,8 @@
+ pandas
+ polars
+ streamlit
+ plotly
+ pytz
+ openmeteo_requests
+ datetime
+ streamlit_letsplot
streamlit_app.py ADDED
@@ -0,0 +1,241 @@
+ import openmeteo_requests
+ import polars as pl
+ import streamlit as st
+ from datetime import datetime, timedelta
+ import pytz
+ from lets_plot import *
+ from streamlit_letsplot import st_letsplot
+
+ # API client (no caching layer, per the challenge instructions)
+ openmeteo = openmeteo_requests.Client()
+
+ # Locations and empty containers
+ locations = [
+     {"name": "Rexburg, Idaho", "latitude": 43.8251, "longitude": -111.7924},
+     {"name": "Provo, Utah", "latitude": 40.2338, "longitude": -111.6585},
+     {"name": "Laie, Hawaii", "latitude": 21.6210, "longitude": -157.9753}
+ ]
+ location_names = [location["name"] for location in locations]
+ filtered_forecasts = {}
+ filtered_histories = {}
+ historical_data = []
+ forecast_data = []
+ timezones = pytz.all_timezones
+
+ def find_max(df):
+     daily_highs = df.group_by('Date').agg(pl.max("Temperature").alias('max'))
+     return daily_highs, df
+
+
+ # Date variables
+ today = datetime.today()
+ default = today - timedelta(days=14)
+
+ # Streamlit sidebar inputs
+ start_date = st.sidebar.date_input(
+     'Select start date:',
+     value=default,
+     max_value=default
+ )
+
+ selected_timezone = st.sidebar.selectbox(
+     'Choose a timezone:',
+     timezones
+ )
+ end_date = start_date + timedelta(days=14)
+
+ temperature_option = st.sidebar.selectbox(
+     'Select Temperature Type',
+     ('Highest', 'Lowest')
+ )
+
+ city_option = st.sidebar.selectbox(
+     'Select City',
+     ('Rexburg, Idaho', 'Provo, Utah', 'Laie, Hawaii')
+ )
+ # Display the selected date range
+ st.sidebar.write(f"Date Range: {start_date} to {end_date}")
+
+ # Get forecast data (forecast API feeds forecast_data)
+ for location in locations:
+     url = "https://api.open-meteo.com/v1/forecast"
+     params = {
+         "latitude": location["latitude"],
+         "longitude": location["longitude"],
+         "start_date": start_date.strftime('%Y-%m-%d'),
+         "end_date": end_date.strftime('%Y-%m-%d'),
+         "hourly": "temperature_2m"
+     }
+
+     # Fetch weather data
+     responses = openmeteo.weather_api(url, params=params)
+     response = responses[0]
+     hourly = response.Hourly()
+     hourly_temperature_2m = hourly.Variables(0).ValuesAsNumpy()
+
+     # Convert timestamps to datetimes
+     start = datetime.fromtimestamp(hourly.Time())
+     end = datetime.fromtimestamp(hourly.TimeEnd())
+     freq = timedelta(seconds=hourly.Interval())
+
+     # Create Polars DataFrame
+     hourly_data = pl.select(
+         date=pl.datetime_range(start, end, freq, closed="left"),
+         temperature_2m=hourly_temperature_2m,
+         location=[location["name"]])
+
+     hourly_dataframe = pl.DataFrame(data=hourly_data)
+
+     forecast_data.append(hourly_dataframe)
+
+ # Concatenate all forecast DataFrames
+ combined_forecast = pl.DataFrame(pl.concat(forecast_data)).explode('location')
+
+ # Get true historical data (archive API feeds historical_data)
+ for location in locations:
+     url = "https://archive-api.open-meteo.com/v1/archive"
+     params = {
+         "latitude": location["latitude"],
+         "longitude": location["longitude"],
+         "start_date": start_date.strftime('%Y-%m-%d'),
+         "end_date": end_date.strftime('%Y-%m-%d'),
+         "hourly": "temperature_2m"
+     }
+
+     # Fetch weather data
+     responses = openmeteo.weather_api(url, params=params)
+     response = responses[0]
+     hourly = response.Hourly()
+     hourly_temperature_2m = hourly.Variables(0).ValuesAsNumpy()
+
+     # Convert timestamps to datetimes
+     start = datetime.fromtimestamp(hourly.Time())
+     end = datetime.fromtimestamp(hourly.TimeEnd())
+     freq = timedelta(seconds=hourly.Interval())
+
+     # Create Polars DataFrame
+     hourly_data = pl.select(
+         date=pl.datetime_range(start, end, freq, closed="left"),
+         temperature_2m=hourly_temperature_2m,
+         location=[location["name"]])
+
+     hourly_dataframe = pl.DataFrame(data=hourly_data)
+
+     historical_data.append(hourly_dataframe)
+
+ # Concatenate all historical DataFrames
+ combined_historical = pl.DataFrame(pl.concat(historical_data)).explode('location')
+
+ for name in location_names:
+     filtered_forecasts[name] = (combined_forecast.filter(pl.col("location") == name)
+                                 .drop('location')
+                                 .rename({'date': 'Date', 'temperature_2m': 'Temperature'})
+                                 .with_columns(pl.col('Temperature') * 9 / 5 + 32))
+     filtered_histories[name] = (combined_historical.filter(pl.col("location") == name)
+                                 .drop('location')
+                                 .rename({'date': 'Date', 'temperature_2m': 'Temperature'})
+                                 .with_columns(pl.col('Temperature') * 9 / 5 + 32))
+
+
+ tab1, tab2, tab3 = st.tabs(["Data", "Visualisations", "KPIs"])
+
+ with tab1:
+     st.title("Forecasted Weather vs Actual Weather by City")
+     st.header('Forecasts')
+     st.markdown("<h2 style='text-align: center; color: white;'>Rexburg</h2>", unsafe_allow_html=True)
+
+     # Create two columns for Rexburg content
+     rexburg_col1, rexburg_col2 = st.columns(2)
+
+     # Rexburg content
+     with rexburg_col1:
+         st.write("Forecasts")
+         st.dataframe(filtered_forecasts["Rexburg, Idaho"], use_container_width=True, hide_index=True)
+
+     with rexburg_col2:
+         st.write("Historical Data")
+         st.dataframe(filtered_histories["Rexburg, Idaho"], use_container_width=True, hide_index=True)
+
+     # Provo header
+     st.markdown("<h2 style='text-align: center; color: white;'>Provo</h2>", unsafe_allow_html=True)
+
+     # Create two columns for Provo content
+     provo_col1, provo_col2 = st.columns(2)
+
+     # Provo content
+     with provo_col1:
+         st.write("Forecasts")
+         st.dataframe(filtered_forecasts["Provo, Utah"], use_container_width=True, hide_index=True)
+
+     with provo_col2:
+         st.write("Historical Data")
+         st.dataframe(filtered_histories["Provo, Utah"], use_container_width=True, hide_index=True)
+
+     # Laie header
+     st.markdown("<h2 style='text-align: center; color: white;'>Laie</h2>", unsafe_allow_html=True)
+
+     # Create two columns for Laie content
+     laie_col1, laie_col2 = st.columns(2)
+
+     # Laie content
+     with laie_col1:
+         st.write("Forecasts")
+         st.dataframe(filtered_forecasts["Laie, Hawaii"], use_container_width=True, hide_index=True)
+
+     with laie_col2:
+         st.write("Historical Data")
+         st.dataframe(filtered_histories["Laie, Hawaii"], use_container_width=True, hide_index=True)
+
+ with tab2:
+     st.header('Visualisations by City')
+
+     st.subheader("Forecasted Data vs. Historical Data")
+
+     combined_forecast = (combined_forecast
+                          .rename({'date': 'Date', 'temperature_2m': 'Temperature'})
+                          .with_columns(pl.col('Temperature') * 9 / 5 + 32)
+                          .with_columns(pl.col('Date').cast(pl.Datetime)))
+     combined_historical = (combined_historical
+                            .rename({'date': 'Date', 'temperature_2m': 'Temperature'})
+                            .with_columns(pl.col('Temperature') * 9 / 5 + 32)
+                            .with_columns(pl.col('Date').cast(pl.Datetime)))
+
+     reg_forecasted = ggplot(combined_forecast, aes(x='Date', y='Temperature', color='location')) + \
+         geom_line() + \
+         facet_wrap('location', ncol=1) + \
+         labs(title='Hourly Temperatures',
+              x='Date', y='Temperature (°F)', color='City Name') + \
+         guides(color="none") + \
+         scale_x_datetime(format='%m/%d')
+
+     reg_historical = ggplot(combined_historical, aes(x='Date', y='Temperature', color='location')) + \
+         geom_line() + \
+         facet_wrap('location', ncol=1) + \
+         labs(title='Hourly Temperatures',
+              x='Date', y='Temperature (°F)', color='City Name') + \
+         guides(color="none") + \
+         scale_x_datetime(format='%m/%d')
+
+     st.subheader('Forecasted')
+     st_letsplot(reg_forecasted)
+     st.subheader('Historical')
+     st_letsplot(reg_historical)
+     st.subheader('Boxplot of Hourly Temperature Readings')
+     box_plot = ggplot(combined_forecast, aes(x='location', y='Temperature', color='location')) + \
+         geom_jitter(alpha=.65) + \
+         geom_boxplot(alpha=.9) + \
+         labs(title='Hourly Temperature Readings',
+              x='City', y='Temperature (°F)') + \
+         guides(color="none")
+     st_letsplot(box_plot)
+     maxes = combined_historical.group_by('location').agg(pl.col('Temperature').max())
+     max_plot = ggplot(maxes, aes(x='location', y='Temperature', color='location')) + \
+         geom_bar(stat='identity') + \
+         labs(x='City Name', y='Max Temperature in Month (°F)')
+     st.subheader('Maximum Temperature for Period')
+     st_letsplot(max_plot)
+
+ with tab3:
+     if temperature_option == 'Highest':
+         highest_temp = combined_historical.filter(pl.col('location') == city_option).select(pl.max('Temperature')).item()
+         st.metric(label=f"Highest Temperature in {city_option}", value=f"{round(highest_temp, 2)} °F")
+     else:
+         # Get the lowest temperature for the selected city
+         lowest_temp = combined_historical.filter(pl.col('location') == city_option).select(pl.min('Temperature')).item()
+         st.metric(label=f"Lowest Temperature in {city_option}", value=f"{round(lowest_temp, 2)} °F")
+