The world is shifting from simple AI chatbots answering our queries to full-fledged systems that are capable of so much more. AI Agents can not only answer our queries but can also perform tasks we give them independently, making them much more powerful and useful.
In this tutorial, you’ll build an advanced, web-based agent that serves as your Virtual Study Planner. This AI agent will be able to understand your goals, make decisions, and act to achieve them.
This project goes beyond basic conversation. You’ll learn to build a goal-based agent with two key capabilities:
Memory: The agent will remember your entire conversation history, allowing it to provide follow-up advice and adapt its plans based on your feedback.
Tool Use: The agent will be capable of using a search tool to find relevant online resources, making it a more powerful assistant than one that relies solely on its internal knowledge.
You’ll learn to create a complete system with a simple web UI built with Flask and Tailwind CSS, providing a solid foundation for building even more complex agents in the future. So, let’s get started.
Prerequisites
Before following this tutorial, you should have:
Basic Python knowledge
Basic knowledge of web development
Python 3+ installed on your machine
VS Code or another IDE of your choice
Tools You’ll Be Using to Build this Agent
To build this study planner agent, you’ll need a few components:
Google Gemini API: This is the core AI service that provides the generative model. It allows our agent to understand natural language, reason, and generate human-like responses.
Flask: This is a lightweight web framework for Python. We’ll use it to create our web server (that is, the backend). Its primary purpose here is to handle web requests from the user’s browser, process them, and send back a response.
Tailwind CSS: This is a CSS framework for building the user interface (that is, the frontend). Instead of writing custom CSS, you use pre-defined classes like bg-blue-300, m-4, and so on, to style the page directly in your HTML.
Python-dotenv: This library helps us manage environment variables, such as loading GEMINI_API_KEY from a .env file.
DuckDuckGo Search: This library provides a simple way to perform real-time web searches. It acts as the “tool” for our AI agent. When a user asks a question that requires external information, our agent can use this tool to find relevant resources on the web and use that information to formulate a response.
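To get a feel for this tool on its own, here’s a small standalone sketch of a DuckDuckGo query with this library (the query string is just an example, and the exact result keys can vary slightly between versions):
# quick standalone check of the duckduckgo-search tool
from duckduckgo_search import DDGS

with DDGS() as ddgs:
    for result in ddgs.text("spaced repetition study technique", max_results=3):
        print(result.get("title"), "-", result.get("href"))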
Understanding AI Agents
Before jumping into the code, let’s cover the basics so you understand what an AI agent is and what it’s capable of.
What Are AI Agents? How Many Types Are There?
An AI agent is software that can autonomously perform tasks on a user’s behalf. AI agents perceive their surroundings, process information, and act to achieve the user’s goals. Unlike fixed programs, an agent can reason and adapt.
There are a few different types of agents, including:
Simple Reflex (acts on current input, like a thermostat)
Model-Based (uses an internal map, like robot vacuums)
Goal-Based (plans to reach goals, like a study planner)
Utility-Based (chooses best outcomes, like trading bots)
Learning Agents (improve over time, like recommendation systems).
How Are AI Agents Unique Compared to Other AI Tools?
AI agents use technologies like LLMs, but they’re distinct because of their autonomy and ability to act. Let’s understand these different types of AI tools in more detail:
Large Language Models (LLMs): LLMs are the brain of the operation. They’re trained on a very large dataset to understand and process user queries in natural language to generate human-like output. OpenAI’s GPT, Google’s Gemini, and Anthropic’s Claude are all examples of LLMs.
Retrieval-Augmented Generation (RAG): RAG is a process or a technique that allows LLMs to not only get their information from training data but also from external sources, like a database or document library, to answer user queries. While RAG retrieves information, it doesn’t independently decide to perform an action or plan a sequence of steps to achieve a goal.
AI Agents: As explained above, agents are systems that can perform tasks for the user, with an LLM as their core reasoning engine. An agent’s full architecture allows it to perceive its environment, plan, act, and remember past interactions so it can learn from them.
In this tutorial, you’ll use an LLM (Gemini) as the agent’s reasoning engine and DuckDuckGo Search as its web search tool. So, now let’s move on to the next step.
How to Set Up Your Environment
Before you can build your Virtual Study Planner AI agent, you’ll need to set up your development environment. Here are the steps you’ll need to follow:
1. Create a Project Directory
First, create a new project folder (this tutorial uses study-planner) and move into it:
mkdir study-planner
<span class="hljs-built_in">cd</span> study-planner
2. Create a Virtual Environment
In Python, it’s always recommended to work in a virtual environment. So, create one and activate it like this:
python -m venv venv
Now activate the virtual environment:
<span class="hljs-comment"># macOS/Linux</span>
<span class="hljs-built_in">source</span> venv/bin/activate
<span class="hljs-comment"># Windows</span>
venvScriptsactivate
3. Install Dependencies
We’ll need a few packages to build the AI study planner agent:
flask: the web server
google-generativeai: the Gemini client
python-dotenv: loads GEMINI_API_KEY from .env
requests: a useful HTTP helper (nice to have)
duckduckgo-search: real web search
You can install them with a single command:
pip install flask google-generativeai python-dotenv requests duckduckgo-search
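Optionally, you can also list them in a requirements.txt file (a common convention rather than a requirement of this project) so everything can be installed in one step:
# requirements.txt
flask
google-generativeai
python-dotenv
requests
duckduckgo-search
Then install with pip install -r requirements.txt.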
4. Get Your Gemini API Key
Go to Google AI Studio and create a new account (if you don’t have one already).
Next, get yourself a new API key by clicking Create API Key in the API Keys section.
NOTE: Once the API key is generated, save it somewhere safe. You may not be able to view the same key again.
5. Add Your Key to the .env File
Create a backend folder inside your project directory (your Python files will live there too), then create a .env file inside backend/ and add your API key:
GEMINI_API_KEY=your_api_key_here
Now you should have set up your development environment successfully. You’re ready to build the Virtual Study Planner AI agent. Let’s start!
How to Build the Real-Time Agent Logic
The core of this project is a continuous loop that accepts user input, maintains a conversation history, and sends that history to the Gemini API to generate a response. This is how we give the agent memory.
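If you’d like to see that loop in isolation before wiring it to a web UI, here’s a minimal terminal-only sketch. It uses the same google-generativeai calls as the client you’ll build below; the file name is just a suggestion:
# memory_loop.py - a stripped-down illustration of chat history as memory
import os
import google.generativeai as genai
from dotenv import load_dotenv

load_dotenv()
genai.configure(api_key=os.getenv("GEMINI_API_KEY"))

model = genai.GenerativeModel("gemini-1.5-flash")
chat = model.start_chat(history=[])  # the history in this chat session is the agent's memory

while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"quit", "exit"}:
        break
    reply = chat.send_message(user_input)  # earlier turns are sent along automatically
    print("Agent:", reply.text)
Every call to chat.send_message() includes the earlier turns stored in the chat session, which is exactly the memory behavior we want.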
Create the Gemini Client (with web search)
Create a new file at backend/gemini_client.py:
<span class="hljs-comment"># backend/gemini_client.py</span>
<span class="hljs-keyword">import</span> os
<span class="hljs-keyword">from</span> typing <span class="hljs-keyword">import</span> List, Dict
<span class="hljs-keyword">import</span> google.generativeai <span class="hljs-keyword">as</span> genai
<span class="hljs-keyword">from</span> dotenv <span class="hljs-keyword">import</span> load_dotenv
<span class="hljs-keyword">from</span> duckduckgo_search <span class="hljs-keyword">import</span> DDGS
<span class="hljs-comment"># Load environment variables</span>
load_dotenv()
<span class="hljs-comment"># function uses a query string and duckduckgo_search library to perform a web search</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">perform_web_search</span>(<span class="hljs-params">query: str, max_results: int = <span class="hljs-number">6</span></span>) -> List[Dict[str, str]]:</span>
<span class="hljs-string">"""Perform a DuckDuckGo search and return a list of results.
Each result contains: title, href, body.
"""</span>
results: List[Dict[str, str]] = []
<span class="hljs-keyword">try</span>:
<span class="hljs-keyword">with</span> DDGS() <span class="hljs-keyword">as</span> ddgs:
<span class="hljs-keyword">for</span> result <span class="hljs-keyword">in</span> ddgs.text(query, max_results=max_results):
<span class="hljs-comment"># result keys typically include: title, href, body</span>
<span class="hljs-keyword">if</span> <span class="hljs-keyword">not</span> isinstance(result, dict):
<span class="hljs-keyword">continue</span>
title = result.get(<span class="hljs-string">'title'</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">''</span>
href = result.get(<span class="hljs-string">'href'</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">''</span>
body = result.get(<span class="hljs-string">'body'</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">''</span>
<span class="hljs-keyword">if</span> title <span class="hljs-keyword">and</span> href:
results.append({
<span class="hljs-string">'title'</span>: title,
<span class="hljs-string">'href'</span>: href,
<span class="hljs-string">'body'</span>: body,
})
<span class="hljs-keyword">return</span> results
<span class="hljs-keyword">except</span> Exception <span class="hljs-keyword">as</span> e:
print(<span class="hljs-string">f"DuckDuckGo search error: <span class="hljs-subst">{e}</span>"</span>)
<span class="hljs-keyword">return</span> []
<span class="hljs-comment"># A class that manages the interaction with the Gemini API and core agent logic </span>
<span class="hljs-class"><span class="hljs-keyword">class</span> <span class="hljs-title">GeminiClient</span>:</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">__init__</span>(<span class="hljs-params">self</span>):</span>
<span class="hljs-keyword">try</span>:
genai.configure(api_key=os.getenv(<span class="hljs-string">'GEMINI_API_KEY'</span>))
self.model = genai.GenerativeModel(<span class="hljs-string">'gemini-1.5-flash'</span>)
self.chat = self.model.start_chat(history=[])
<span class="hljs-keyword">except</span> Exception <span class="hljs-keyword">as</span> e:
print(<span class="hljs-string">f"Error configuring Gemini API: <span class="hljs-subst">{e}</span>"</span>)
self.chat = <span class="hljs-literal">None</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">generate_response</span>(<span class="hljs-params">self, user_input: str</span>) -> str:</span>
<span class="hljs-string">"""Generate an AI response with optional web search when prefixed.
To trigger web search, start your message with one of:
- "search: <query>"
- "/search <query>"
Otherwise, the model responds directly using chat history.
"""</span>
<span class="hljs-keyword">if</span> <span class="hljs-keyword">not</span> self.chat:
<span class="hljs-keyword">return</span> <span class="hljs-string">"AI service is not configured correctly."</span>
<span class="hljs-keyword">try</span>:
text = user_input <span class="hljs-keyword">or</span> <span class="hljs-string">""</span>
lower = text.strip().lower()
<span class="hljs-comment"># Search trigger</span>
search_query = <span class="hljs-literal">None</span>
<span class="hljs-keyword">if</span> lower.startswith(<span class="hljs-string">"search:"</span>):
search_query = text.split(<span class="hljs-string">":"</span>, <span class="hljs-number">1</span>)[<span class="hljs-number">1</span>].strip()
<span class="hljs-keyword">elif</span> lower.startswith(<span class="hljs-string">"/search "</span>):
search_query = text.split(<span class="hljs-string">" "</span>, <span class="hljs-number">1</span>)[<span class="hljs-number">1</span>].strip()
<span class="hljs-keyword">if</span> search_query:
web_results = perform_web_search(search_query, max_results=<span class="hljs-number">6</span>)
<span class="hljs-keyword">if</span> <span class="hljs-keyword">not</span> web_results:
<span class="hljs-keyword">return</span> <span class="hljs-string">"I could not retrieve web results right now. Please try again."</span>
<span class="hljs-comment"># Build context with numbered references</span>
refs_lines = []
<span class="hljs-keyword">for</span> idx, item <span class="hljs-keyword">in</span> enumerate(web_results, start=<span class="hljs-number">1</span>):
refs_lines.append(<span class="hljs-string">f"[<span class="hljs-subst">{idx}</span>] <span class="hljs-subst">{item[<span class="hljs-string">'title'</span>]}</span> — <span class="hljs-subst">{item[<span class="hljs-string">'href'</span>]}</span>n<span class="hljs-subst">{item[<span class="hljs-string">'body'</span>]}</span>"</span>)
refs_block = <span class="hljs-string">"nn"</span>.join(refs_lines)
system_prompt = (
<span class="hljs-string">"You are an AI research assistant. Use the provided web search results to answer the user query. "</span>
<span class="hljs-string">"Synthesize concisely, cite sources inline like [1], [2] where relevant, and include a brief summary."</span>
)
composed = (
<span class="hljs-string">f"<system>n<span class="hljs-subst">{system_prompt}</span>n</system>n"</span>
<span class="hljs-string">f"<user_query>n<span class="hljs-subst">{search_query}</span>n</user_query>n"</span>
<span class="hljs-string">f"<web_results>n<span class="hljs-subst">{refs_block}</span>n</web_results>"</span>
)
response = self.chat.send_message(composed)
<span class="hljs-keyword">return</span> response.text
<span class="hljs-comment"># Default: normal chat</span>
response = self.chat.send_message(text)
<span class="hljs-keyword">return</span> response.text
<span class="hljs-keyword">except</span> Exception <span class="hljs-keyword">as</span> e:
print(<span class="hljs-string">f"Error generating response: <span class="hljs-subst">{e}</span>"</span>)
<span class="hljs-keyword">return</span> <span class="hljs-string">"I'm sorry, I encountered an error processing your request."</span>
Let’s understand what’s going on in the above code:
The perform_web_search() function: This is the agent’s tool. It opens a DuckDuckGo session, collects up to max_results results (each with a title, link, and snippet), and returns them as a list of dictionaries. If the search fails for any reason, it logs the error and returns an empty list instead of crashing.
The GeminiClient class: This class is designed to connect and talk with Google’s Gemini AI. Inside the __init__ method, it first calls genai.configure() with the API key from the environment variables, which unlocks access to Gemini’s services. Then, self.model = genai.GenerativeModel('gemini-1.5-flash') loads the specific Gemini model, and self.chat = self.model.start_chat(history=[]) starts a new conversation with no previous history. Because this chat session stays open for the lifetime of the client, the model remembers the whole conversation, which is how the agent gets its memory.
The real action happens in generate_response(). If a user’s message begins with search: or /search, it triggers a DuckDuckGo search using perform_web_search(). The results are formatted with titles, links, and snippets, and then passed to Gemini along with a short instruction to synthesize a concise answer with inline citations. (You can sanitize the incoming data later with a Python package of your choice to make it more user-friendly in the frontend.)
If no search command is used, it simply chats with Gemini using the given input. Error handling is built in, so instead of breaking, the method returns a safe, generic message.
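Before adding the web layer, you can sanity-check the client with a small throwaway script run from inside the backend/ folder (the file name and prompts are just examples):
# try_client.py - quick manual test of GeminiClient (expects GEMINI_API_KEY in backend/.env)
from gemini_client import GeminiClient

client = GeminiClient()

# Normal chat: answered from the model plus the ongoing history
print(client.generate_response("Make me a one-week plan to learn Python basics."))

# Tool use: the "search:" prefix routes the query through DuckDuckGo first
print(client.generate_response("search: best free resources to learn Python"))
Run it with python try_client.py and you should see a normal chat reply followed by a search-grounded, cited answer.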
Create the Flask Backend and Frontend
Next, we’ll set up the Flask web server to connect our agent logic to a simple web interface.
The Flask Backend
Inside the backend folder you created earlier, add a new file app.py:
<span class="hljs-comment"># backend/app.py</span>
<span class="hljs-keyword">import</span> os
<span class="hljs-keyword">from</span> flask <span class="hljs-keyword">import</span> Flask, render_template, request, jsonify
<span class="hljs-keyword">from</span> gemini_client <span class="hljs-keyword">import</span> GeminiClient
app = Flask(__name__, template_folder=<span class="hljs-string">'../templates'</span>)
client = GeminiClient()
<span class="hljs-meta">@app.route('/')</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">index</span>():</span>
<span class="hljs-keyword">return</span> render_template(<span class="hljs-string">'index.html'</span>)
<span class="hljs-meta">@app.route('/api/chat', methods=['POST'])</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">chat</span>():</span>
payload = request.get_json(silent=<span class="hljs-literal">True</span>) <span class="hljs-keyword">or</span> {}
user_message = payload.get(<span class="hljs-string">'message'</span>, <span class="hljs-string">''</span>).strip()
<span class="hljs-keyword">if</span> <span class="hljs-keyword">not</span> user_message:
<span class="hljs-keyword">return</span> jsonify({<span class="hljs-string">'error'</span>: <span class="hljs-string">'No message provided'</span>}), <span class="hljs-number">400</span>
<span class="hljs-keyword">try</span>:
response_text = client.generate_response(user_message)
<span class="hljs-keyword">return</span> jsonify({<span class="hljs-string">'response'</span>: response_text})
<span class="hljs-keyword">except</span> Exception <span class="hljs-keyword">as</span> e:
<span class="hljs-keyword">return</span> jsonify({<span class="hljs-string">'error'</span>: <span class="hljs-string">'Error generating response'</span>}), <span class="hljs-number">500</span>
<span class="hljs-keyword">if</span> __name__ == <span class="hljs-string">'__main__'</span>:
app.run(debug=<span class="hljs-literal">True</span>)
What it does:
@app.route('/'): This is the homepage. When a user navigates to the main URL (for example, http://localhost:5000), Flask runs the index() function, which simply renders the index.html file. This serves the entire user interface to the browser, which is handy when you don’t want to interact with the agent through the command line.
@app.route('/api/chat', methods=['POST']): This is the API endpoint. When the user clicks “Send” on the frontend, the JavaScript sends a POST request to this URL. The chat() function then receives the user’s message, passes it to the GeminiClient to get a response, and sends that response back to the frontend as a JSON object.
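Once the server is running (see the testing section below), you can also call this endpoint directly, without the UI, to confirm the JSON contract. For example, with curl:
curl -X POST http://localhost:5000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Make me a 3-week plan to learn Java"}'
A successful call returns a JSON body like {"response": "..."}, while an empty message comes back as {"error": "No message provided"} with a 400 status.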
The Flask Frontend
Create a new folder named templates in your project’s root directory. Inside it, create a file named index.html:
<span class="hljs-meta"><!DOCTYPE <span class="hljs-meta-keyword">html</span>></span>
<span class="hljs-tag"><<span class="hljs-name">html</span> <span class="hljs-attr">lang</span>=<span class="hljs-string">"en"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">head</span>></span>
<span class="hljs-tag"><<span class="hljs-name">meta</span> <span class="hljs-attr">charset</span>=<span class="hljs-string">"UTF-8"</span> /></span>
<span class="hljs-tag"><<span class="hljs-name">meta</span> <span class="hljs-attr">name</span>=<span class="hljs-string">"viewport"</span> <span class="hljs-attr">content</span>=<span class="hljs-string">"width=device-width, initial-scale=1.0"</span> /></span>
<span class="hljs-tag"><<span class="hljs-name">title</span>></span>AI Study Planner<span class="hljs-tag"></<span class="hljs-name">title</span>></span>
<span class="hljs-tag"><<span class="hljs-name">script</span> <span class="hljs-attr">src</span>=<span class="hljs-string">"https://cdn.tailwindcss.com"</span>></span><span class="hljs-tag"></<span class="hljs-name">script</span>></span>
<span class="hljs-tag"><<span class="hljs-name">style</span>></span><span class="css">
<span class="hljs-selector-tag">body</span> {
<span class="hljs-attribute">background-color</span>: <span class="hljs-number">#f3f4f6</span>;
}
<span class="hljs-selector-class">.chat-container</span> {
<span class="hljs-attribute">max-width</span>: <span class="hljs-number">768px</span>;
<span class="hljs-attribute">margin</span>: <span class="hljs-number">0</span> auto;
<span class="hljs-attribute">display</span>: flex;
<span class="hljs-attribute">flex-direction</span>: column;
<span class="hljs-attribute">height</span>: <span class="hljs-number">100vh</span>;
}
<span class="hljs-selector-class">.typing-indicator</span> {
<span class="hljs-attribute">display</span>: flex;
<span class="hljs-attribute">align-items</span>: center;
<span class="hljs-attribute">padding</span>: <span class="hljs-number">0.5rem</span>;
<span class="hljs-attribute">color</span>: <span class="hljs-number">#6b7280</span>;
}
<span class="hljs-selector-class">.typing-dot</span> {
<span class="hljs-attribute">width</span>: <span class="hljs-number">8px</span>;
<span class="hljs-attribute">height</span>: <span class="hljs-number">8px</span>;
<span class="hljs-attribute">margin</span>: <span class="hljs-number">0</span> <span class="hljs-number">2px</span>;
<span class="hljs-attribute">background-color</span>: <span class="hljs-number">#6b7280</span>;
<span class="hljs-attribute">border-radius</span>: <span class="hljs-number">50%</span>;
<span class="hljs-attribute">animation</span>: typing <span class="hljs-number">1s</span> infinite ease-in-out;
}
<span class="hljs-selector-class">.message-bubble</span> {
<span class="hljs-attribute">padding</span>: <span class="hljs-number">1rem</span>;
<span class="hljs-attribute">border-radius</span>: <span class="hljs-number">1.5rem</span>;
<span class="hljs-attribute">max-width</span>: <span class="hljs-number">80%</span>;
<span class="hljs-attribute">margin-bottom</span>: <span class="hljs-number">1rem</span>;
}
<span class="hljs-selector-class">.user-message</span> {
<span class="hljs-attribute">background-color</span>: <span class="hljs-number">#3b82f6</span>;
<span class="hljs-attribute">color</span>: white;
<span class="hljs-attribute">align-self</span>: flex-end;
}
<span class="hljs-selector-class">.agent-message</span> {
<span class="hljs-attribute">background-color</span>: <span class="hljs-number">#e5e7eb</span>;
<span class="hljs-attribute">color</span>: <span class="hljs-number">#374151</span>;
<span class="hljs-attribute">align-self</span>: flex-start;
}
</span><span class="hljs-tag"></<span class="hljs-name">style</span>></span>
<span class="hljs-tag"></<span class="hljs-name">head</span>></span>
<span class="hljs-tag"><<span class="hljs-name">body</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"bg-gray-100"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">div</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"chat-container"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">header</span>
<span class="hljs-attr">class</span>=<span class="hljs-string">"bg-white shadow-sm p-4 text-center font-bold text-xl text-gray-800"</span>
></span>
AI Study Planner
<span class="hljs-tag"></<span class="hljs-name">header</span>></span>
<span class="hljs-tag"><<span class="hljs-name">main</span> <span class="hljs-attr">id</span>=<span class="hljs-string">"chat-history"</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"flex-1 overflow-y-auto p-4 space-y-4"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">div</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"message-bubble agent-message"</span>></span>
Hello! I'm your AI Study Planner. What topic would you like to study
today?
<span class="hljs-tag"></<span class="hljs-name">div</span>></span>
<span class="hljs-tag"></<span class="hljs-name">main</span>></span>
<span class="hljs-tag"><<span class="hljs-name">footer</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"bg-white p-4"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">div</span> <span class="hljs-attr">class</span>=<span class="hljs-string">"flex items-center"</span>></span>
<span class="hljs-tag"><<span class="hljs-name">input</span>
<span class="hljs-attr">type</span>=<span class="hljs-string">"text"</span>
<span class="hljs-attr">id</span>=<span class="hljs-string">"user-input"</span>
<span class="hljs-attr">class</span>=<span class="hljs-string">"flex-1 p-3 border-2 border-gray-300 rounded-full focus:outline-none focus:border-blue-500"</span>
<span class="hljs-attr">placeholder</span>=<span class="hljs-string">"Type your message..."</span>
/></span>
<span class="hljs-tag"><<span class="hljs-name">button</span>
<span class="hljs-attr">id</span>=<span class="hljs-string">"send-btn"</span>
<span class="hljs-attr">class</span>=<span class="hljs-string">"ml-4 px-6 py-3 bg-blue-500 text-white rounded-full font-semibold hover:bg-blue-600 transition-colors"</span>
></span>
Send
<span class="hljs-tag"></<span class="hljs-name">button</span>></span>
<span class="hljs-tag"></<span class="hljs-name">div</span>></span>
<span class="hljs-tag"></<span class="hljs-name">footer</span>></span>
<span class="hljs-tag"></<span class="hljs-name">div</span>></span>
<span class="hljs-tag"><<span class="hljs-name">script</span>></span><span class="javascript">
<span class="hljs-keyword">const</span> chatHistory = <span class="hljs-built_in">document</span>.getElementById(<span class="hljs-string">"chat-history"</span>);
<span class="hljs-keyword">const</span> userInput = <span class="hljs-built_in">document</span>.getElementById(<span class="hljs-string">"user-input"</span>);
<span class="hljs-keyword">const</span> sendBtn = <span class="hljs-built_in">document</span>.getElementById(<span class="hljs-string">"send-btn"</span>);
<span class="hljs-function"><span class="hljs-keyword">function</span> <span class="hljs-title">addMessage</span>(<span class="hljs-params">sender, text</span>) </span>{
<span class="hljs-keyword">const</span> messageElement = <span class="hljs-built_in">document</span>.createElement(<span class="hljs-string">"div"</span>);
messageElement.classList.add(
<span class="hljs-string">"message-bubble"</span>,
sender === <span class="hljs-string">"user"</span> ? <span class="hljs-string">"user-message"</span> : <span class="hljs-string">"agent-message"</span>
);
messageElement.textContent = text;
chatHistory.appendChild(messageElement);
chatHistory.scrollTop = chatHistory.scrollHeight;
}
<span class="hljs-keyword">async</span> <span class="hljs-function"><span class="hljs-keyword">function</span> <span class="hljs-title">sendMessage</span>(<span class="hljs-params"></span>) </span>{
<span class="hljs-keyword">const</span> message = userInput.value.trim();
<span class="hljs-keyword">if</span> (message === <span class="hljs-string">""</span>) <span class="hljs-keyword">return</span>;
addMessage(<span class="hljs-string">"user"</span>, message);
userInput.value = <span class="hljs-string">""</span>;
<span class="hljs-keyword">try</span> {
<span class="hljs-keyword">const</span> response = <span class="hljs-keyword">await</span> fetch(<span class="hljs-string">"/api/chat"</span>, {
<span class="hljs-attr">method</span>: <span class="hljs-string">"POST"</span>,
<span class="hljs-attr">headers</span>: {
<span class="hljs-string">"Content-Type"</span>: <span class="hljs-string">"application/json"</span>,
},
<span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">message</span>: message }),
});
<span class="hljs-keyword">const</span> data = <span class="hljs-keyword">await</span> response.json();
<span class="hljs-keyword">if</span> (data.response) {
addMessage(<span class="hljs-string">"agent"</span>, data.response);
} <span class="hljs-keyword">else</span> <span class="hljs-keyword">if</span> (data.error) {
addMessage(<span class="hljs-string">"agent"</span>, <span class="hljs-string">`Error: <span class="hljs-subst">${data.error}</span>`</span>);
} <span class="hljs-keyword">else</span> {
addMessage(<span class="hljs-string">"agent"</span>, <span class="hljs-string">"Unexpected response from server."</span>);
}
} <span class="hljs-keyword">catch</span> (error) {
<span class="hljs-built_in">console</span>.error(<span class="hljs-string">"Error:"</span>, error);
addMessage(<span class="hljs-string">"agent"</span>, <span class="hljs-string">"Sorry, something went wrong. Please try again."</span>);
}
}
sendBtn.addEventListener(<span class="hljs-string">"click"</span>, sendMessage);
userInput.addEventListener(<span class="hljs-string">"keypress"</span>, <span class="hljs-function">(<span class="hljs-params">e</span>) =></span> {
<span class="hljs-keyword">if</span> (e.key === <span class="hljs-string">"Enter"</span>) {
sendMessage();
}
});
</span><span class="hljs-tag"></<span class="hljs-name">script</span>></span>
<span class="hljs-tag"></<span class="hljs-name">body</span>></span>
<span class="hljs-tag"></<span class="hljs-name">html</span>></span>
That’s the entire UI. It’s just one page with a text box and a send button. It contains a simple JavaScript function to handle the chat interaction. Here’s how it works:
When the user types a message and hits “Send,” it:
Takes the message from the input field.
Creates a new user-message bubble and displays it.
Uses the fetch() API to send the message to the backend’s /api/chat endpoint.
Waits for the backend’s response.
Once the response is received, creates a new agent-message bubble and displays the AI’s reply.
How to Test the AI Agent
At this point, your project structure should look like this:
study-planner/
├── backend/
│ ├── .env
│ ├── app.py
│ └── gemini_client.py
└── templates/
└── index.html
Now, navigate to the backend directory and run:
cd backend
python app.py
If everything is set up, you’ll see the Flask app start on http://127.0.0.1:5000 or http://localhost:5000.
Open that URL in your browser. That’s it, you’ve created your own AI agent!
Try asking it some normal questions, like:
“Make me a 3-week plan to learn Java programming for beginners.”
“Give me a quiz on AI agent development.”
Or you can also trigger a web search like:
search: resources for java
/search how to prepare frontend coding interviews
When you use a search prefix like the ones above, the agent fetches a handful of links and asks Gemini to synthesize them with short inline citations like [1] and [2]. It’s great for quick research summaries.
Wrapping Up
Congratulations! You now have a working study planner agent that remembers your chats and can even look things up online.
From here, you can further enhance this agent by:
Saving user histories in a database (a small sketch follows this list).
Adding authentication, handling multiple users.
Connecting calendars or task managers, and much more.
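For example, persisting each exchange could start with a small SQLite table. Here’s a rough sketch using only the standard library (the file, table, and function names are made up for illustration):
# history_store.py - minimal sketch of saving chat turns with sqlite3
import sqlite3

def init_db(path="history.db"):
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS messages (id INTEGER PRIMARY KEY, role TEXT, content TEXT)"
    )
    conn.commit()
    return conn

def save_message(conn, role, content):
    conn.execute("INSERT INTO messages (role, content) VALUES (?, ?)", (role, content))
    conn.commit()

# Inside the /api/chat route you could then call, conceptually:
# save_message(conn, "user", user_message)
# save_message(conn, "agent", response_text)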
This foundation provides a solid starting point for building even more sophisticated AI agents tailored to your specific needs.
If you found this tutorial helpful and want to discuss AI development or software development, feel free to connect with me on X/Twitter, LinkedIn, or check out my portfolio at Blog. I regularly share insights about AI, development, technical writing, and so on, and would love to see what you build with this foundation.
Happy coding!