🚀 From Zero to Sentiment Analysis Hero: Deploying Hugging Face with FastAPI and Docker 🚀
✨ Transforming AI Magic into Everyday Business Tools ✨
🌟 Story Highlights 🌟
📌 Step-by-step guide to deploy a sentiment analysis model.
🛠️ Tools used: Hugging Face, FastAPI, Docker.
🎯 Ideal for: Business owners, marketers, developers.

🔍 Who, What, When, Where, and Why 🔍
👥 Who: Business owners, marketers, developers.
💡 What: Deploying a sentiment analysis model using Hugging Face, FastAPI, and Docker.
🕰️ When: Anytime you want to leverage AI for business insights.
🌍 Where: In any business environment or development setting.
❓ Why: To turn complex AI technology into accessible, actionable business tools.

✨ Hook: Turning Tech Magic into Business Gold ✨
Ever thought AI was too complex or expensive for your business?
Think again! We're demystifying AI deployment, making it as simple as following a recipe.
Get ready to turn AI magic into business gold.
Imagine using a Hugging Face model to analyze the sentiment of reviews. Traditionally, you'd have to build and fine-tune this model yourself, ensuring it functions correctly. But now, pre-trained Large Language Models (LLMs) make this process effortless.

With the model ready to go, our primary goal is to make it accessible to colleagues without needing them to download or implement it themselves.
We'll achieve this by creating an API endpoint, allowing users to independently call and utilize the model.
In this guide, we'll deploy a sentiment analysis model using Hugging Face, FastAPI, and Docker, demonstrating how to efficiently create a complete end-to-end solution.
Step 1: Choose Your Weapon - Hugging Face Model Selection 🎯
First up, we need to pick the right tool for the job. Hugging Face offers a treasure trove of pre-trained models that can save you heaps of time.

For sentiment analysis, here's your starter pack (transformers needs just one backend, and PyTorch is enough for this model, so there's no need to install TensorFlow as well):
pip install transformers
pip install torch
With the essentials installed, let's fire up our chosen model:
from transformers import pipeline
# Define the model
pipe = pipeline(model="distilbert/distilbert-base-uncased-finetuned-sst-2-english")
Run this quick test to see the magic:
print(pipe("This tutorial is great!"))
And voilà, you'll get something like this: [{'label': 'POSITIVE', 'score': 0.999876856803894}]. Note that the pipeline returns a list, with one result dict per input. To make it even fancier:
def generate_response(prompt: str):
    response = pipe(prompt)
    label = response[0]["label"]
    score = response[0]["score"]
    return f"The '{prompt}' input is {label} with a score of {score}"

print(generate_response("This tutorial is great!"))
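Since the raw pipeline output is a list of label/score dicts, it can also help to distill it into a small structured summary. Here is a minimal sketch; the summarize helper and its field names are our own invention, not part of transformers, and the sample result below simply mimics the shape of the pipeline's output:

```python
# Hypothetical helper: turn the pipeline's raw output (a list of
# {"label": ..., "score": ...} dicts) into a compact summary dict.
def summarize(result):
    item = result[0]  # the pipeline returns one dict per input text
    return {"sentiment": item["label"], "confidence": round(item["score"], 4)}

# A result shaped like the pipeline's output for our test sentence:
sample = [{"label": "POSITIVE", "score": 0.999876856803894}]
print(summarize(sample))  # {'sentiment': 'POSITIVE', 'confidence': 0.9999}
```

A dict like this is easier to log or return as JSON than the formatted string above.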
Step 2: Set Up Your API with FastAPI 🚀
FastAPI is the secret sauce that makes your model accessible via a web interface. Here's how to get started:

🔧 We will build our API using FastAPI, a Python framework designed for creating high-performance web APIs.
Begin by installing the FastAPI library with pip and importing it into our environment. Additionally, we'll use the pydantic library to validate and manage our input data types, and uvicorn to serve the app.
The code below establishes a functional API that our colleagues can readily use. Simply put, it sets up a web service where you can submit text and receive a sentiment analysis using the robust capabilities of the Hugging Face model through FastAPI.
Afterwards, we'll containerize our application for universal execution, ensuring it can run seamlessly across different environments rather than just on our local machines. This enhances portability and simplifies deployment. 🚀🛠️
pip install fastapi pydantic uvicorn
Now, let's create the API:
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

# Load the sentiment analysis model once, at startup
pipe = pipeline(model="distilbert/distilbert-base-uncased-finetuned-sst-2-english")

app = FastAPI()

class RequestModel(BaseModel):
    input: str

@app.post("/sentiment")
def get_response(request: RequestModel):
    prompt = request.input
    response = pipe(prompt)
    label = response[0]["label"]
    score = response[0]["score"]
    return f"The '{prompt}' input is {label} with a score of {score}"
This setup provides a simple API that takes your input and returns the sentiment analysis result. Simple, right?
Here's what happens step-by-step in the code: 👇
📥 Importing Necessary Libraries: The code starts by importing FastAPI and Pydantic, which ensures that the data we receive and send is structured correctly.
📦 Loading the Model: Then we load a pre-trained sentiment analysis model, as we have already done in the first step.
🔧 Setting Up the FastAPI Application: app = FastAPI() initializes our FastAPI app, making it ready to handle requests.
📝 Defining the Request Model: Using Pydantic, a RequestModel class is defined. This class specifies that we expect an input string, ensuring that our API only accepts data in the correct format.
🚪 Creating the Endpoint: The @app.post("/sentiment") decorator tells FastAPI that this function should be triggered when a POST request is made to the /sentiment endpoint. The get_response function takes a RequestModel object as input, which contains the text we want to analyze.
⚙️ Processing the Request: Inside the get_response function, the text from the request is extracted and passed to the model (pipe(prompt)). The model returns a response with the sentiment label (like "POSITIVE" or "NEGATIVE") and a score indicating the confidence of the prediction.
📤 Returning the Response: Finally, the function returns a formatted string that includes the input text, the sentiment label, and the confidence score, providing a clear and concise result for the user.
▶️ If we execute the code (for example with uvicorn app.main:app --reload, assuming the file is saved as app/main.py), the API will be available on our local host at http://localhost:8000, with interactive docs at http://localhost:8000/docs.
Step 3: Containerize with Docker for Ultimate Portability 🐳
Containerization simplifies application deployment by encapsulating it within a Docker container. Each Docker container runs a specific instance of a Docker image, which includes its own operating system and all essential dependencies.
For instance, you can install Python and all necessary packages within the container, ensuring the application runs consistently across different environments without requiring separate installations.

To deploy our sentiment analysis app in a Docker container, we begin by creating a Docker image. This involves writing a Dockerfile, which serves as a blueprint specifying the exact contents and configuration of the Docker image.
If Docker isn't yet installed on your system, you can easily obtain it from Docker's official website. Below is the Dockerfile we'll utilize for this project, aptly named Dockerfile in the repository:
# Use an official Python runtime as a parent image
FROM python:3.10-slim
# Set the working directory in the container
WORKDIR /sentiment
# Copy the requirements.txt file into the container
# Copy the requirements.txt file into the container
COPY requirements.txt .
# Install dependencies (before copying the app, so this layer is cached)
RUN pip install --no-cache-dir -r requirements.txt
# Copy the app directory into the container
COPY ./app ./app
# Expose port 8000
EXPOSE 8000
# Run the application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
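The Dockerfile above expects a requirements.txt file next to it. A minimal version, assuming only the libraries used in Steps 1 and 2, might look like this (exact version pins are up to you):

```
transformers
torch
fastapi
pydantic
uvicorn
```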
Build and run the Docker image with these commands:
docker build -t sentiment-app .
docker run -p 8000:8000 sentiment-app
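With the container running, you can sanity-check the endpoint from another terminal. This assumes the container from the commands above is up and port 8000 is published as shown:

```shell
# POST a JSON payload to the running container's /sentiment endpoint
curl -X POST http://localhost:8000/sentiment \
  -H "Content-Type: application/json" \
  -d '{"input": "This tutorial is great!"}'
```

You should get back the same formatted sentiment string the API returned when running locally.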
And just like that, you have a portable, powerful sentiment analysis API ready to roll!

In summary: 📝
Choosing a Model: Select and set up a suitable Hugging Face pre-trained model for sentiment analysis.
Building an API with FastAPI: Develop an API endpoint with FastAPI for straightforward interaction with the sentiment analysis model.
Containerizing with Docker: Containerize the application using Docker for flexibility and effortless deployment in diverse environments.
Wrapping It All Up 🎁
By following these steps, you've transformed complex AI tools into a user-friendly application that can drive your business forward. Whether you're a business owner, marketer, or developer, this guide has equipped you with the knowledge to deploy machine learning models efficiently and effectively.
Why It Matters and What You Should Do 🚀
Why It Matters:
📈 Streamlines the deployment of AI models.
💡 Provides actionable insights from data.
🎯 Enhances business decision-making processes.
What You Should Do:
🛠️ Implement this guide to deploy your own sentiment analysis model.
🚀 Leverage the power of Hugging Face, FastAPI, and Docker to stay ahead.
🧠 Integrate AI insights into your business strategy for better outcomes.
"Technology, like art, is a soaring exercise of the human imagination."
Now go ahead and let your imagination soar - bring AI magic to your business today! 🚀
Generative AI Tools 🧠
✍️ Jasper - AI writing assistant for marketing and content creation.
✍️ Writesonic - AI copywriting tool for generating marketing content.
✍️ Copy.ai - AI-powered writing tool for generating copy and content.
✍️ Rytr - AI writing assistant for content creation.
✍️ Sudowrite - AI writing tool for creative writing and storytelling.
About Think Ahead With AI (TAWAI) 🤖

Empower Your Journey With Generative AI.
"You're at the forefront of innovation. Dive into a world where AI isn't just a tool, but a transformative journey. Whether you're a budding entrepreneur, a seasoned professional, or a curious learner, we're here to guide you."
Founded with a vision to democratize Generative AI knowledge,
Think Ahead With AI is more than just a platform.
It's a movement.
It's a commitment.
It's a promise to bring AI within everyone's reach.
Together, we explore, innovate, and transform.
Our mission is to help marketers, coaches, professionals and business owners integrate Generative AI and use artificial intelligence to skyrocket their careers and businesses. 🚀
TAWAI Newsletter By:

Sujata Ghosh
Gen. AI Explorer
"TAWAI is your trusted partner in navigating the AI Landscape!" 🔮💪