{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"source": [
"# Gradio Example Notebook"
],
"metadata": {
"id": "-Ix16ey2erLl"
}
},
{
"cell_type": "markdown",
"source": [
" A separate notebook that demonstrates various Gradio components. If needed,use hardcoded data to show the functionality of the Gradio components.\n"
],
"metadata": {
"id": "3dxNBkEhexES"
}
},
{
"cell_type": "markdown",
"source": [
"## Code"
],
"metadata": {
"id": "YlfGrVVBe16M"
}
},
{
"cell_type": "markdown",
"source": [
"### Libraries\n",
"\n",
"Start by installing and importing necessary libraries"
],
"metadata": {
"id": "7lICoKpGe4kP"
}
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "JBJQeBn0YrwE"
},
"outputs": [],
"source": [
"!pip install gradio\n",
"!pip install wget\n",
"!pip install transformers"
]
},
{
"cell_type": "code",
"source": [
"import gradio as gr # Gradio library to create an interactive interface\n",
"from transformers import pipeline # Transformers libraries which imports pipeline to use Hugging-Face models\n",
"import pandas as pd # Pandas library for data manipulation and analysis"
],
"metadata": {
"id": "OTYay5Uge-Jw"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"### Initialize Sentiment Analyzer and Define Function\n",
"\n",
"Here the sentiment-analyzer is initialized using the Hugging-Face pipeline, Also the function that will be used by the Gradio"
],
"metadata": {
"id": "Dx7kdVxhfQ0l"
}
},
{
"cell_type": "code",
"source": [
"# Initialize the analyzer\n",
"\n",
"# Loads a pretrained model for the English language\n",
"analyzer = pipeline(\"sentiment-analysis\")"
],
"metadata": {
"collapsed": true,
"id": "RwSoWhwosP3a"
},
"execution_count": null,
"outputs": []
},
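{
"cell_type": "markdown",
"source": [
"Optional sanity check (not part of the original flow): call `analyzer` directly on a hardcoded example sentence to see the raw pipeline output that `sentiment_analysis` later converts into a DataFrame. The example sentence is arbitrary."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Quick check: raw pipeline output on a hardcoded example sentence\n",
"analyzer(\"Gradio makes building demos easy!\")  # Expected: a list with one dict holding 'label' and 'score'"
],
"metadata": {},
"execution_count": null,
"outputs": []
},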
{
"cell_type": "code",
"source": [
"# Define Function\n",
"\n",
"def sentiment_analysis(filename):\n",
" with open(filename, \"r\") as fn: # Open the file and read the sentences\n",
" sentences = fn.readlines()\n",
" result = analyzer(sentences) # Store the analyzer results for each sentence\n",
" df = pd.DataFrame(result) # Convert the results in a formatted table\n",
" df['sentences'] = sentences # add column for sentences\n",
" return df"
],
"metadata": {
"id": "dGn2a4A4fQGl"
},
"execution_count": null,
"outputs": []
},
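{
"cell_type": "markdown",
"source": [
"Optional: a minimal sketch for testing `sentiment_analysis` with hardcoded data before wiring it to the interface. It writes a few sample sentences to a text file (the name `sample.txt` is arbitrary) and runs the function on that file."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Write a few hardcoded sample sentences to a text file (file name is arbitrary)\n",
"sample_sentences = [\"I love this notebook!\", \"This is the worst example ever.\", \"The interface looks fine.\"]\n",
"with open(\"sample.txt\", \"w\") as f:\n",
"    f.write(\"\\n\".join(sample_sentences))\n",
"\n",
"sentiment_analysis(\"sample.txt\")  # Returns a DataFrame with 'label', 'score', and 'sentences' columns"
],
"metadata": {},
"execution_count": null,
"outputs": []
},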
{
"cell_type": "markdown",
"source": [
"### Build Gradio Interface\n",
"\n",
"\n",
"Here, The Gradio interface is set up using the following components:\n",
"\n",
"**A file uploader** allows user to upload a text file which contains the sentences to be analyzed\n",
"\n",
"**DataFrame Output** Displays program's output in a structured table format\n",
"\n",
"**Title and Description**: Provides clear title and description for the interface"
],
"metadata": {
"id": "IHHnKudGfFtI"
}
},
{
"cell_type": "code",
"source": [
"#Create the gradio interface\n",
"demo = gr.Interface(\n",
" fn=sentiment_analysis, # Function used by gradio\n",
" inputs=gr.File(label=\"Upload a file (txt)\"), # User inputs (file)\n",
" outputs=gr.Dataframe(label='Results'), # Program's output (DataFrame)\n",
" title='Sentiment-Analysis',\n",
" description=\"Gradio interface that allows users to upload a text file containing sentences to be analyzed, the sentences will be classified and results will be in a formatted table\"\n",
")\n",
"demo.launch(debug=True) # \"Debug=True\" Displays inerface errors"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 680
},
"id": "xE3M_HFgfN4L",
"outputId": "d647960b-41c8-4a7d-b67d-bb3055c77721"
},
"execution_count": 9,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Setting queue=True in a Colab notebook requires sharing enabled. Setting `share=True` (you can turn this off by setting `share=False` in `launch()` explicitly).\n",
"\n",
"Colab notebook detected. This cell will run indefinitely so that you can see errors and logs. To turn off, set debug=False in launch().\n",
"Running on public URL: https://934478ea31903b3a25.gradio.live\n",
"\n",
"This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)\n"
]
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.HTML object>"
],
"text/html": [
"<div><iframe src=\"https://934478ea31903b3a25.gradio.live\" width=\"100%\" height=\"500\" allow=\"autoplay; camera; microphone; clipboard-read; clipboard-write;\" frameborder=\"0\" allowfullscreen></iframe></div>"
]
},
"metadata": {}
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"Keyboard interruption in main thread... closing server.\n",
"Killing tunnel 127.0.0.1:7860 <> https://934478ea31903b3a25.gradio.live\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": []
},
"metadata": {},
"execution_count": 9
}
]
}
]
}