Tips on How to Make a Chatbot Using Free Gemini API Keys
Source: Dev.to
Overview
The Gemini API provides cost‑effective access to Google’s most capable AI models. This guide outlines the architecture and implementation steps for building a chatbot while keeping your API keys secure.
Architecture
| Component | Description |
|---|---|
| Client | User enters a prompt in a React/HTML interface. |
| Server (Backend) | A Django or Node.js environment receives the prompt, attaches the secret API key, and forwards the request to the Gemini API. |
| Gemini API | Google processes the natural‑language request and returns a JSON response. |
| Display | The backend sends the response text back to the frontend to be rendered in a chat bubble. |
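The backend's job in this flow is to wrap the user's prompt in the request body the Gemini API expects and attach the key server-side. As a minimal sketch, the helper below builds that JSON body; the exact schema shown (`contents`/`parts`/`text`) is an assumption based on the public REST `generateContent` endpoint, so verify it against the current API documentation.

```python
import json

def build_gemini_payload(prompt: str) -> str:
    """Build the JSON request body the backend forwards to the Gemini API.

    The {"contents": [{"parts": [{"text": ...}]}]} shape is an assumption
    based on the REST generateContent endpoint; check the official docs.
    """
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return json.dumps(body)
```

The API key itself is sent in a header or query parameter by the backend and never appears in this payload, which is all the frontend ever sees.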
Setup
- Obtain an API key from Google AI Studio.
- Create a Python environment and install the required library:

pip install -q -U google-generativeai

- Store the API key in a .env file and add the file to .gitignore to prevent accidental exposure.
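In practice the python-dotenv package (`load_dotenv()`) is the usual way to pull the .env file into the environment. Purely for illustration, here is a minimal stdlib sketch of what that loading step does:

```python
import os

def load_env(path: str = ".env") -> None:
    """Load KEY=VALUE lines from a .env file into os.environ.

    A simplified stand-in for python-dotenv's load_dotenv(): blank lines
    and comments are skipped, and existing environment values win.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

With a .env file containing `GEMINI_API_KEY=your-key-here`, the key then becomes available via `os.environ["GEMINI_API_KEY"]`.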
Initializing the Model
# service.py or views.py
import google.generativeai as genai
import os
# Securely load your API key
genai.configure(api_key=os.environ["GEMINI_API_KEY"])
# Initialize the model (Gemini 2.5 Flash is recommended for speed)
model = genai.GenerativeModel('gemini-2.5-flash')
Creating a Chat Session
The standard prompt‑response flow is stateless. To maintain conversation history, use the .start_chat() method.
# Start a chat session with an empty history
chat = model.start_chat(history=[])
def get_chatbot_response(user_input: str) -> str:
    """Send a message to the model and return the text response."""
    response = chat.send_message(user_input, stream=False)
    return response.text
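Note that the `chat` object above lives in server memory, while a Django or Node.js backend handles each HTTP request statelessly. One common workaround is to persist each user's turns yourself and rebuild the session per request. The sketch below uses an in-memory dict as a stand-in for a real session store or database (a hypothetical helper, not part of the SDK); the `{"role": ..., "parts": [...]}` shape mirrors what `start_chat(history=...)` accepts, which is an assumption worth verifying against the SDK docs.

```python
# Stand-in for a real cache or database (e.g. Django's session framework).
SESSION_STORE: dict[str, list] = {}

def record_turn(session_id: str, role: str, text: str) -> None:
    """Append one conversation turn ("user" or "model") for a session."""
    SESSION_STORE.setdefault(session_id, []).append(
        {"role": role, "parts": [text]}
    )

def history_for(session_id: str) -> list:
    """Return the stored history, ready to pass to start_chat(history=...)."""
    return SESSION_STORE.get(session_id, [])
```

On each request the backend would call `model.start_chat(history=history_for(session_id))`, send the new message, then record both the user turn and the model's reply.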
Safety Settings
Gemini includes built‑in filters for harassment and dangerous content. You can adjust these settings in the configuration if the model is overly restrictive for your use case.
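As a sketch, the SDK accepts a safety configuration in dict form, mapping harm categories to blocking thresholds. The category and threshold names below are assumptions based on the Gemini API's published enums; confirm them against the current documentation before use.

```python
# Loosen only the filters that are too strict for your use case;
# these enum strings are assumptions to verify against the API docs.
safety_settings = {
    "HARM_CATEGORY_HARASSMENT": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_DANGEROUS_CONTENT": "BLOCK_ONLY_HIGH",
}

# Passed when constructing the model, e.g.:
# model = genai.GenerativeModel("gemini-2.5-flash",
#                               safety_settings=safety_settings)
```

A first-aid chatbot like the example below may legitimately need a relaxed dangerous-content threshold, since emergency instructions can otherwise trigger the filter.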
System Instructions (Persona)
Define a system instruction to give the chatbot a consistent persona:
system_prompt = (
    "You are a professional first-aid assistant. "
    "Provide clear, step-by-step emergency instructions."
)

# Example of initializing with a system instruction
model = genai.GenerativeModel(
    'gemini-2.5-flash',
    system_instruction=system_prompt
)
Error Handling
Wrap API calls in try‑except blocks to handle rate limits, quota errors, and network timeouts.
def safe_get_response(user_input: str) -> str:
    try:
        return get_chatbot_response(user_input)
    except Exception as e:
        # Log the error and return a friendly message
        print(f"Error: {e}")
        return "Sorry, I'm experiencing technical difficulties. Please try again later."
Security Best Practices
- Never call the Gemini API directly from the frontend (JavaScript). Exposing the API key in the browser allows anyone to steal it and consume your quota.
- Always route requests through your backend, where the API key is stored securely in environment variables.
- Keep the .env file out of version control by listing it in .gitignore.
By following these guidelines, you can build a functional and secure chatbot using free Gemini API keys.