Overview
The integration of Google Gemini with Converse brings cutting-edge generative AI capabilities to your chatbot experience. Gemini, Google's state-of-the-art multimodal LLM (Large Language Model), enables natural, context-aware, multi-turn conversations that go beyond standard automation. This guide outlines the step-by-step process to integrate Google Gemini with Converse.

Prerequisites
Before you begin, ensure the following:
- You have admin-level access to your Converse account.
- A valid Google Cloud Project with the Gemini API enabled.
- Access to Gemini API credentials (API key or OAuth token).
- Access to Converse’s LLM Integration settings.
Integration Steps
Step 1: Enable Gemini API on Google Cloud
- Visit Google Cloud Console.
- Select or create a new project.
- Go to APIs & Services > Library.
- Search for Gemini API (or Vertex AI API) and click Enable.
Step 2: Generate API Credentials
- Go to APIs & Services > Credentials.
- Click “Create Credentials” → choose API Key.
- Copy the API key and keep it secure.
🔐 Tip: For production-level integrations, use service accounts and OAuth for enhanced security.
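One way to keep the copied key secure is to load it from an environment variable rather than hardcoding it in your integration scripts. A minimal sketch, assuming the variable name `GEMINI_API_KEY` (our choice for illustration, not a Converse requirement):

```python
import os

def load_gemini_api_key(env_var: str = "GEMINI_API_KEY") -> str:
    """Read the Gemini API key from the environment instead of hardcoding it."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export your Gemini API key first")
    return key
```

Storing the key in the environment (or a secrets manager) keeps it out of version control and makes rotation easier.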
Step 3: Add Gemini in Converse
- Log in to your Converse dashboard.
- Select "Third-Party Apps Integrations".
- Search for "Google Gemini".
- Click Connect and enter the API key.
FAQs
Which Gemini models are supported in Converse?
Converse currently supports:
- Gemini-Pro
- Gemini-2.0-Flash
- Gemini-2.0-Flash-Lite
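For reference, these model names correspond to lowercase model IDs in the Gemini REST API. Converse handles the request internally once the key is connected, but a sketch of how a model ID maps to the public `generateContent` endpoint looks like this:

```python
BASE = "https://generativelanguage.googleapis.com/v1beta"

def generate_content_url(model: str, api_key: str) -> str:
    """Build the Gemini generateContent endpoint URL for a given model."""
    # Model IDs are lowercase in the REST API, e.g. "gemini-2.0-flash".
    return f"{BASE}/models/{model.lower()}:generateContent?key={api_key}"

print(generate_content_url("Gemini-2.0-Flash", "YOUR_API_KEY"))
# → https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=YOUR_API_KEY
```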
Where can I get the API key for Gemini integration?
You can generate your Gemini API key from the Google Cloud Console:
- Go to APIs & Services > Credentials
- Click “Create Credentials” > API Key
- Ensure that the Gemini (Vertex AI) API is enabled in the selected project.
Can I use both OpenAI and Gemini models in one Converse account?
No. Each bot can be integrated with only one LLM at a time.
What happens if my API key becomes invalid or my quota is exceeded?
If the key becomes invalid or quota limits are hit:
- The bot will fall back to a default message.
- An error notification will appear in the integration logs under Settings > LLM Integration.
- You will receive a platform alert to take corrective action.
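The fallback behavior above can be sketched in pseudologic. The status codes and message text here are illustrative assumptions, not Converse's actual implementation:

```python
FALLBACK_MESSAGE = "Sorry, I can't answer that right now. Please try again later."

def handle_gemini_error(status_code: int) -> str:
    """Map common Gemini API error codes to the bot's fallback response."""
    if status_code in (400, 401, 403):
        reason = "invalid or unauthorized API key"
    elif status_code == 429:
        reason = "quota exceeded"
    else:
        reason = f"unexpected error (HTTP {status_code})"
    # Recorded in the integration logs; the end user sees only the fallback text.
    print(f"Gemini API error: {reason}")
    return FALLBACK_MESSAGE
```

In practice, checking the integration logs for the logged reason (invalid key vs. exhausted quota) tells you whether to rotate the key or raise your quota.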