Demo of function calling capabilities with Gemma3 + Ollama
(Diagram: decision flow for when Gemma3 uses search vs. built-in knowledge)
This project demonstrates function calling with the Gemma3 large language model running on Ollama: the model performs a web search for current information when a user's query needs it, and otherwise answers from its built-in knowledge. A minimal sketch of this flow follows the feature list below.
- Automatic detection of when to use search vs. built-in knowledge
- Integration with Google Search via Serper.dev API
- Interactive chat interface using Gradio
- Visual indicators for search operations
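The core of the demo is a simple routing loop: the model either answers directly or asks for a search, the app runs the search, and the results are fed back to the model. The sketch below is a minimal, hedged illustration of that loop, assuming the `ollama` Python package and a hypothetical `google_search` helper; the real logic lives in `function-calling-gemma.py` and may differ.

```python
# Illustrative sketch only - prompt wording, helper names, and the JSON tool-call
# format are assumptions, not the project's actual code.
import json

import ollama

SYSTEM_PROMPT = (
    "Answer from your own knowledge when possible. If the question needs current "
    '(post-2023) information, reply ONLY with JSON: {"tool": "search", "query": "..."}'
)

def google_search(query: str) -> str:
    """Placeholder for the Serper.dev search helper (sketched in the setup steps)."""
    raise NotImplementedError

def answer(question: str) -> str:
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": question},
    ]
    reply = ollama.chat(model="gemma3:27b", messages=messages)["message"]["content"]
    try:
        call = json.loads(reply)  # did the model ask for the search tool?
    except json.JSONDecodeError:
        return reply  # direct answer from built-in knowledge
    if isinstance(call, dict) and call.get("tool") == "search":
        results = google_search(call["query"])
        messages += [
            {"role": "assistant", "content": reply},
            {"role": "user", "content": f"Search results:\n{results}"},
        ]
        reply = ollama.chat(model="gemma3:27b", messages=messages)["message"]["content"]
    return reply
```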
- Python 3.8 or higher
- Ollama installed (with Gemma3 model downloaded)
- Serper.dev API key for Google Search
- Clone this repository: `git clone <repository-url>`, then `cd <repository-directory>`
- Create and activate a virtual environment: `python -m venv venv`, then `source venv/bin/activate` (on Windows: `venv\Scripts\activate`)
- Install the required dependencies: `pip install -r requirements.txt`
- Set up your API key: `cp .env.example .env`, then edit the `.env` file and add your Serper.dev API key (get one at https://serper.dev)
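  As a hedged sketch of how the key might be loaded and used at runtime (the `SERPER_API_KEY` variable name and the helper below are assumptions, not the project's code; the endpoint and `X-API-KEY` header follow Serper.dev's search API):

  ```python
  # Illustrative helper; the project's actual search function may differ.
  import os

  import requests
  from dotenv import load_dotenv

  load_dotenv()  # reads the key from .env (variable name assumed: SERPER_API_KEY)

  def google_search(query: str, num_results: int = 5) -> str:
      resp = requests.post(
          "https://google.serper.dev/search",
          headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
          json={"q": query, "num": num_results},
          timeout=30,
      )
      resp.raise_for_status()
      organic = resp.json().get("organic", [])
      return "\n".join(f"{r.get('title')}: {r.get('snippet')}" for r in organic)
  ```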
- Install Ollama from ollama.ai
- Pull the Gemma3 model: `ollama pull gemma3:27b`
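  Optionally, confirm the model responds from Python before running the app (a quick smoke test, assuming the `ollama` Python package is installed):

  ```python
  import ollama

  # Should print a short reply if gemma3:27b is pulled and the Ollama server is running.
  resp = ollama.chat(
      model="gemma3:27b",
      messages=[{"role": "user", "content": "Say hello in one sentence."}],
  )
  print(resp["message"]["content"])
  ```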
- Ensure your virtual environment is activated: `source venv/bin/activate` (on Windows: `venv\Scripts\activate`)
- Run the application: `python function-calling-gemma.py`
- Open the provided URL in your browser (typically http://localhost:7860)
- Type your question in the text input field
- Click "Send" or press Enter
- The assistant will:
  - Answer directly from its built-in knowledge for pre-2023 or timeless information
  - Use search for current events (post-2023)
  - Show the search queries and results whenever search is used (see the prompt sketch after this list)
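This routing is driven by the decision prompt inside `function-calling-gemma.py`. The project's exact wording is not reproduced here; the snippet below is only an example of the kind of instruction that can produce the behavior described above.

```python
# Hypothetical decision prompt - illustrative wording, not the project's actual prompt.
SYSTEM_PROMPT = """You are a helpful assistant with access to a web search tool.
- For timeless facts or events before 2023, answer directly from your own knowledge.
- For current information (post-2023 events, "latest", "current", upcoming dates),
  respond ONLY with JSON of the form {"tool": "search", "query": "<search terms>"}.
- When search results are provided, answer the question and show which query was used."""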
Questions that use direct knowledge:
- "Who won the 2019 FIFA Women's World Cup?"
- "Who won the 2023 Oscar for Best Picture?"
- "How does quantum computing work?"
Questions that use search:
- "Who is the current World Chess Champion?"
- "What are the features of the latest iPhone?"
- "When is the next Google Cloud Next event?"
- If you see an error related to the Serper API, check your API key in the `.env` file
- Ensure Ollama is running and the Gemma3 model is properly installed
- Check that all required Python packages are installed correctly (a quick check script is sketched after this list)
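The short script below (a hedged sketch, not part of the project) checks the first two points; the `SERPER_API_KEY` variable name is an assumption based on the `.env` setup above.

```python
import os
import subprocess

from dotenv import load_dotenv

load_dotenv()
if not os.getenv("SERPER_API_KEY"):  # variable name assumed
    print("SERPER_API_KEY is missing - add it to your .env file")

# `ollama list` prints the locally available models.
installed = subprocess.run(["ollama", "list"], capture_output=True, text=True).stdout
if "gemma3" not in installed:
    print("No gemma3 model found - run: ollama pull gemma3:27b")
```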
For more articles on AI/ML and Generative AI, follow me on Medium: Arjun Prabhulal