
Gemini-Claude Code MCP

@zerubeuson 14 days ago
1 · MIT
Free · Community
AI Systems
🔄 Enable Claude Code to Harness 🔮 Gemini for Ultra-Large Context Workloads 🧠⚡ Let Claude tap into Gemini's extended context window for smarter, bigger, and faster code reasoning.

Overview

What is Gemini-Claude Code MCP?

Gemini-Claude Code MCP is a server that enables Claude Code to utilize Google's Gemini models for processing ultra-large context workloads, allowing for smarter and faster code reasoning with an extended context window of up to 2 million tokens.

Use cases

Use cases include analyzing large repositories of code, generating documentation from extensive codebases, and performing multi-file analysis tasks that exceed Claude's native context limits.

How to use

To use Gemini-Claude Code MCP, integrate it with Claude Code via the MCP protocol. This allows Claude to send requests to the MCP server, which then communicates with the Gemini API to handle large context processing.

Key features

Key features include an extended context of 2 million tokens, hybrid intelligence combining Claude's reasoning with Gemini's processing capabilities, improved performance by offloading context-heavy tasks, and seamless integration within Claude Code.

Where to use

Gemini-Claude Code MCP is ideal for software development, data analysis, and any field requiring extensive codebase processing, large documentation handling, or complex multi-file analysis.

Content

🔄 Gemini-Claude Code MCP Server

🔮 Enable Claude Code to Harness Gemini for Ultra-Large Context Workloads 🧠⚡

Let Claude tap into Gemini's extended context window for smarter, bigger, and faster code reasoning.

🎯 Project Overview

This MCP (Model Context Protocol) server bridges Claude Code with Google's Gemini models, enabling Claude to leverage Gemini's massive context window (up to 2M tokens) for processing large codebases, extensive documentation, and complex multi-file analysis tasks that would otherwise exceed Claude's native context limits.

Key Benefits

  • 🚀 Extended Context: Access Gemini's 2M token context window for massive codebases
  • 🧠 Hybrid Intelligence: Combine Claude's reasoning with Gemini's large-scale processing
  • ⚡ Performance: Offload context-heavy operations to Gemini while keeping Claude responsive
  • 🔧 Seamless Integration: Works directly within Claude Code via MCP protocol

๐Ÿ—๏ธ Architecture

System Components

┌─────────────────┐         ┌──────────────────┐         ┌─────────────────┐
│   Claude Code   │ <-----> │   MCP Server     │ <-----> │  Gemini API     │
│                 │   MCP   │                  │  HTTP   │                 │
│  (Reasoning &   │         │  (Bridge Layer)  │         │ (Large Context  │
│   Execution)    │         │                  │         │   Processing)   │
└─────────────────┘         └──────────────────┘         └─────────────────┘

Core Modules

  1. MCP Server (mcp_server/server.py)

    • Handles MCP protocol communication with Claude Code
    • Routes requests between Claude and Gemini
    • Manages session state and context switching
  2. Gemini Service (services/gemini_manager.py)

    • Manages Gemini API connections using the Google AI Python SDK
    • Handles context chunking and streaming
    • Optimizes token usage and caching
  3. Context Tools (tools/)

    • analyze_large_context: Process entire repositories or documentation sets
    • summarize_codebase: Generate intelligent summaries of large projects
    • cross_file_analysis: Analyze dependencies and relationships across many files
    • context_search: Search through massive contexts efficiently
    • code_generation: Generate code with full project context awareness

🛠️ Technical Implementation

MCP Tools Available

1. analyze_large_context

Processes large amounts of code or documentation that exceed Claude's context window.
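
The README does not show a request schema for this tool; by analogy with the other tools in this section, a call might look like the following (the parameter names are assumptions, not taken from the project):

```json
{
  "name": "analyze_large_context",
  "parameters": {
    "content": "code or documentation to analyze",
    "query": "what to look for or answer",
    "context_type": "codebase|documentation|mixed"
  }
}
```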

2. summarize_codebase

Creates intelligent summaries of entire codebases.

{
  "name": "summarize_codebase",
  "parameters": {
    "path": "/path/to/codebase",
    "focus_areas": [
      "optional areas of interest"
    ],
    "detail_level": "high|medium|low"
  }
}

3. cross_file_analysis

Analyzes relationships and dependencies across multiple files.

{
  "name": "cross_file_analysis",
  "parameters": {
    "files": [
      "file patterns"
    ],
    "analysis_type": "dependencies|interfaces|data_flow|security"
  }
}

4. context_search

Search through massive contexts using Gemini's understanding.
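
No schema is given for this tool either; a request mirroring the other tools might look like this (the parameter names are assumptions, not taken from the project):

```json
{
  "name": "context_search",
  "parameters": {
    "query": "natural-language search query",
    "scope": [
      "files or directories to search"
    ],
    "max_results": 10
  }
}
```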

5. code_generation

Generate code with full project context awareness.

{
  "name": "code_generation",
  "parameters": {
    "task": "generation task description",
    "context_files": [
      "relevant files for context"
    ],
    "style_guide": "optional style preferences"
  }
}

Context Management Strategy

  1. Smart Chunking: Automatically splits large contexts into manageable chunks
  2. Context Caching: Caches processed contexts to avoid redundant API calls
  3. Progressive Loading: Loads context progressively based on relevance
  4. Token Optimization: Intelligently manages token usage across both models
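
Step 1 can be sketched as follows. This is a minimal illustration, not the project's implementation: sizes are in characters for simplicity (a real version would count tokens), and the defaults mirror the chunk_size and overlap values shown in config.yaml.

```python
# Illustrative sketch of smart chunking with overlap (not the project's
# actual code). Overlapping chunk boundaries keep context from being lost
# at the seams between chunks.
from typing import List


def chunk_context(text: str, chunk_size: int = 100_000, overlap: int = 1_000) -> List[str]:
    """Split text into chunks of chunk_size, each sharing `overlap` chars with the next."""
    if len(text) <= chunk_size:
        return [text]
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks


parts = chunk_context("x" * 250_000)  # a 250k-character context
```

With the defaults, a 250k-character input yields three chunks, each starting 99k characters after the previous one so that consecutive chunks share 1k characters.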

📦 Installation

Prerequisites

  • Python 3.11+
  • Claude Code with MCP support
  • Google Cloud account with Gemini API access
  • Google AI Python SDK (installed automatically with dependencies)

Setup Steps

  1. Clone the repository

    git clone https://github.com/yourusername/gemini-claude-code-mpc.git
    cd gemini-claude-code-mpc
    
  2. Install dependencies

    uv sync
    
  3. Configure API credentials

    # Set up Gemini API key
    export GOOGLE_API_KEY="your-gemini-api-key"
    
    # Or use Google Cloud authentication
    export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
    
  4. Configure Claude Code
    Add to your Claude Code settings:
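
     A plausible shape for that settings entry, following the common MCP server registration format (the server name, launch command, and path below are assumptions, not taken from this project):

     ```json
     {
       "mcpServers": {
         "gemini-bridge": {
           "command": "uv",
           "args": ["run", "python", "mcp_server/server.py"],
           "env": {
             "GOOGLE_API_KEY": "your-gemini-api-key"
           }
         }
       }
     }
     ```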

🚀 Usage Examples

Example 1: Analyzing a Large Codebase

Claude: "Use Gemini to analyze the entire React codebase and identify performance bottlenecks"

The MCP server will:

1. Gather all relevant files from the React codebase
2. Send them to Gemini for analysis
3. Return structured insights about performance issues

Example 2: Cross-Repository Analysis

Claude: "Compare the authentication implementations across our microservices"

The MCP server will:

1. Collect auth-related code from multiple repositories
2. Use Gemini to analyze patterns and differences
3. Provide comprehensive comparison results

Example 3: Documentation Generation

Claude: "Generate comprehensive API documentation for this entire project"

The MCP server will:

1. Scan all code files for API endpoints and interfaces
2. Use Gemini to understand the full context
3. Generate detailed, context-aware documentation

🔧 Configuration

Environment Variables

  • GOOGLE_API_KEY: Gemini API key
  • GEMINI_MODEL: Model to use (default: gemini-1.5-pro-002)
  • MAX_CONTEXT_SIZE: Maximum context size in tokens (default: 2000000)
  • CACHE_ENABLED: Enable context caching (default: true)
  • CACHE_TTL: Cache time-to-live in seconds (default: 3600)

Advanced Configuration

Create a config.yaml file:

gemini:
  model: gemini-1.5-pro-002
  max_tokens: 2000000
  temperature: 0.1

cache:
  enabled: true
  ttl: 3600
  max_size: 1GB

processing:
  chunk_size: 100000
  overlap: 1000
  parallel_chunks: 4

📚 SDK Documentation

This project uses the official Google AI Python SDK for Gemini. For detailed usage, refer to the SDK's official documentation.

๐Ÿค Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

# Install development dependencies
uv sync

# Run tests
pytest

# Run linting
ruff check .

# Format code
ruff format .

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments
