Demo project showcasing Gemma3 function calling capabilities using Ollama. Enables automatic web searches via Serper.dev for up-to-date information and features an interactive Gradio chat interface.

Gemma3 Function Calling Demo

Demo of Gemma3 function calling capabilities with Gemma3 + Ollama

Decision flow for when Gemma3 uses search vs. built-in knowledge

This project demonstrates function calling capabilities with the Gemma3 large language model using Ollama. It enables the model to perform web searches for current information when responding to user queries.
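At its core, function calling works by describing a tool to the model alongside the conversation; the model then decides when to request a call to it. A minimal sketch of how a search tool could be declared for Ollama's Python client is shown below. The tool name `web_search` and its parameters are illustrative, not necessarily the exact names used in this repository.

```python
# Sketch: declaring a search tool in the OpenAI-style schema that
# Ollama's Python client accepts via its `tools` parameter.
# The tool name and parameter names here are illustrative.
search_tool = {
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search Google (via Serper.dev) for up-to-date information.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "The search query."},
            },
            "required": ["query"],
        },
    },
}

# The tool is passed along with the conversation, e.g.:
# import ollama
# response = ollama.chat(model="gemma3:27b", messages=messages, tools=[search_tool])
# If the model decides to search, response contains a tool call to execute.
```

When the model returns a tool call, the application runs the search, appends the results to the conversation, and asks the model for a final answer.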

Features

  • Automatic detection of when to use search vs. built-in knowledge
  • Integration with Google Search via Serper.dev API
  • Interactive chat interface using Gradio
  • Visual indicators for search operations
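In this project the decision to search is made by the model itself through function calling, but the split it learns can be illustrated with a rough keyword heuristic. The sketch below is purely illustrative (the function name and word list are ours, not the repository's code):

```python
import re

# Illustrative only: words that usually signal a need for fresh information.
# The real project lets the Gemma3 model make this decision itself.
RECENCY_HINTS = re.compile(
    r"\b(current|latest|today|now|this (week|month|year)|upcoming|next)\b",
    re.IGNORECASE,
)

def needs_search(question: str) -> bool:
    """Return True when a question likely requires up-to-date information."""
    return bool(RECENCY_HINTS.search(question))

print(needs_search("Who is the current World Chess Champion?"))  # True
print(needs_search("How does quantum computing work?"))          # False
```

A model-driven decision is more robust than any such keyword list, which is why the project delegates it to Gemma3.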

Prerequisites

  • Python 3.8 or higher
  • Ollama installed (with Gemma3 model downloaded)
  • Serper.dev API key for Google Search

Installation

  1. Clone this repository

    git clone <repository-url>
    cd <repository-directory>
    
  2. Create and activate a virtual environment

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install required dependencies

    pip install -r requirements.txt
    
  4. Set up your API key

    cp .env.example .env
    

    Then edit the .env file and add your Serper.dev API key (Get a Serper API key at https://serper.dev)
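For reference, a `.env` file like this is typically loaded into environment variables at startup. The project likely uses a library such as python-dotenv; the stdlib-only sketch below shows the idea, and the variable name `SERPER_API_KEY` is an assumption about what the file contains:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader sketch (a library like python-dotenv does this in practice)."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        # Skip blanks, comments, and malformed lines.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Don't overwrite variables already set in the environment.
        os.environ.setdefault(key.strip(), value.strip().strip('"'))

# Usage (assuming the key name): load_env(); api_key = os.environ["SERPER_API_KEY"]
```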

Setup Ollama with Gemma3

  1. Install Ollama from ollama.ai

  2. Pull the Gemma3 model

    ollama pull gemma3:27b
    

Running the Application

  1. Ensure your virtual environment is activated

    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  2. Run the application

    python function-calling-gemma.py
    
  3. Open the provided URL in your browser (typically http://localhost:7860)

How to Use

  1. Type your question in the text input field
  2. Click "Send" or press Enter
  3. The assistant will:
    • Answer directly from its knowledge for pre-2023 or timeless information
    • Use search for current events (post-2023)
    • Show search queries and results when used
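When a search is triggered, the application calls Serper.dev's Google Search endpoint. The sketch below builds such a request with the stdlib only; the endpoint URL and `X-API-KEY` header follow Serper's documented API, while the helper name is ours:

```python
import json
import urllib.request

SERPER_URL = "https://google.serper.dev/search"  # Serper.dev's Google Search endpoint

def build_serper_request(query: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a Serper.dev search request."""
    payload = json.dumps({"q": query}).encode("utf-8")
    return urllib.request.Request(
        SERPER_URL,
        data=payload,
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# To execute the search:
# with urllib.request.urlopen(build_serper_request("latest iPhone", api_key)) as resp:
#     results = json.load(resp)
```

The JSON response contains organic results whose snippets are fed back to Gemma3 as the tool-call result.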

Sample Questions

Questions that use direct knowledge:

  • "Who won the 2019 FIFA Women's World Cup?"
  • "Who won the 2023 Oscar for Best Picture?"
  • "How does quantum computing work?"

Questions that use search:

  • "Who is the current World Chess Champion?"
  • "What are the features of the latest iPhone?"
  • "When is the next Google Cloud Next event?"

Troubleshooting

  • If you see an error related to the Serper API, check your API key in the .env file
  • Ensure Ollama is running and the Gemma3 model is properly installed
  • Check that all required Python packages are installed correctly

Author

For more articles on AI/ML and Generative AI, follow Arjun Prabhulal on Medium.
