Vishal1806 committed on
Commit b7959ea
1 Parent(s): 9fe613b

Upload 3 files

Browse files
Files changed (3)
  1. Dockerfile +45 -0
  2. app.py +12 -0
  3. requirements.txt +14 -0
Dockerfile ADDED
@@ -0,0 +1,45 @@
+ FROM python:3.9
+ WORKDIR /code
+ COPY ./requirements.txt /code/requirements.txt
+ RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
+ RUN useradd user
+ USER user
+ ENV HOME=/home/user \
+     PATH=/home/user/.local/bin:$PATH
+ WORKDIR $HOME/app
+ COPY --chown=user . $HOME/app
+ CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "3000"]
+
+
+ # FROM python:3.9
+ # Specifies that the base image is Python 3.9.
+
+ # WORKDIR /code
+ # Sets /code as the working directory for the following instructions; if /code does not exist, it is created.
+
+ # COPY ./requirements.txt /code/requirements.txt
+ # Copies requirements.txt from the host machine into the /code directory in the container.
+
+ # RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
+ # Installs the Python packages listed in requirements.txt, without keeping pip's download cache in the image.
+
+ # (Note: the application code is not copied into /code here; it is copied to $HOME/app further
+ # down, after switching to the non-root user, so the files get the correct ownership.)
+
+ # RUN useradd user
+ # Creates a new user named user inside the container, so the application does not have to run as root (better security practice).
+
+ # USER user
+ # Switches the user context to user for the remaining instructions and at runtime.
+
+ # ENV HOME=/home/user \ PATH=/home/user/.local/bin:$PATH
+ # Sets HOME to /home/user and prepends /home/user/.local/bin to PATH. User-level installs (e.g. via pip) place their executables in ~/.local/bin, so adding it to PATH lets those commands be run without typing the full path every time.
+
+ # WORKDIR $HOME/app
+ # Changes the working directory to $HOME/app (which resolves to /home/user/app). This is where the application files are copied and run.
+
+ # COPY --chown=user . $HOME/app
+ # Copies the application files from the local machine into $HOME/app in the container. --chown=user makes the files owned by user rather than root, preserving the non-root security context.
+
+ # CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "3000"]
+ # Starts the web application with Uvicorn, binds it to all network interfaces (0.0.0.0), and listens on port 3000.
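For local testing outside Docker, the same server that the CMD instruction launches can be started programmatically. A minimal sketch (the run.py filename is an assumption, not part of this commit):

    # run.py -- hypothetical helper, equivalent to the Dockerfile's CMD:
    #   uvicorn app:app --host 0.0.0.0 --port 3000
    import uvicorn

    if __name__ == "__main__":
        # "app:app" means: module app.py, FastAPI instance named app
        uvicorn.run("app:app", host="0.0.0.0", port=3000)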
app.py ADDED
@@ -0,0 +1,12 @@
+ from fastapi import FastAPI
+ from transformers import pipeline
+
+ app = FastAPI()
+ pipe = pipeline("text2text-generation", model="google/flan-t5-small")
+ @app.get("/")
+ def home():
+     return {"message": "Hello World"}
+ @app.get("/generate")
+ def generate(text: str):
+     output = pipe(text)
+     return {"output": output[0]["generated_text"]}
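A quick way to exercise both endpoints is the requests library, which is already pinned in requirements.txt. This is only a sketch; it assumes the container's port 3000 is published on localhost (e.g. docker run -p 3000:3000 <image>), and the client.py filename is hypothetical:

    # client.py -- hypothetical test client for the two routes in app.py
    import requests

    BASE_URL = "http://localhost:3000"  # assumes port 3000 is mapped to the host

    # Root endpoint: should return {"message": "Hello World"}
    print(requests.get(f"{BASE_URL}/").json())

    # /generate endpoint: text is passed as a query parameter,
    # matching the `text: str` argument of generate() in app.py
    resp = requests.get(f"{BASE_URL}/generate",
                        params={"text": "translate English to German: Hello"})
    print(resp.json()["output"])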
requirements.txt ADDED
@@ -0,0 +1,14 @@
+ fastapi==0.74.*
+ requests==2.27.*
+ uvicorn[standard]==0.17.*
+ torch==2.4.*
+ transformers==4.*
+ sentencepiece==0.2.*
+
+
+ # FastAPI: web framework for building APIs with Python; used to build the web server that handles API requests and responses.
+ # Requests: library for making HTTP requests in Python; used to send HTTP requests to external services or APIs.
+ # Uvicorn: ASGI server for serving FastAPI applications; serves the FastAPI app and manages asynchronous operations.
+ # SentencePiece: tokenization and detokenization library for NLP; preprocesses text data by tokenizing inputs and detokenizing outputs.
+ # Torch: deep learning framework for building and training neural networks; the backend that runs the generative model.
+ # Transformers: library of pre-trained NLP models and tools; provides the pre-trained generative model used for text generation.
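To confirm that the pinned transformers/torch/sentencepiece stack can load the model used in app.py, a short sanity check can be run locally before building the image. A sketch only; the check_pipeline.py filename is an assumption, and the model is downloaded from the Hub on first run:

    # check_pipeline.py -- hypothetical sanity check for the pinned dependencies
    from transformers import pipeline

    # Same pipeline and model as app.py
    pipe = pipeline("text2text-generation", model="google/flan-t5-small")
    result = pipe("translate English to German: Hello, how are you?")
    print(result[0]["generated_text"])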