Onboarding MCP

@tonytimoon · 13 days ago · MIT · Free · Community · AI Systems
Orellis is an MCP server for querying local Python codebases using natural language.

Overview

What is Onboarding MCP

Orellis is a Model Context Protocol (MCP) server designed for engineers to query and explore their local Python codebase using natural language. It processes code files entirely on the local machine, so the code never leaves the user's computer.

Use cases

Use cases include generating codebase overviews for new team members, answering specific code-related questions, and providing guided assistance for navigating complex projects.

How to use

To use onboarding-mcp, install the required dependencies, set up the Ollama CLI, pull the model, and configure the server. Users can then query their codebase or generate guided overviews through the server's two MCP tools.

Key features

Key features include embedding and searching code with FAISS, local LLM inference via Ollama, two main MCP tools for querying and walkthroughs, and async support for faster responses.

Where to use

onboarding-mcp is ideal for software development environments, particularly for teams onboarding new developers or for individual engineers looking to understand large codebases.

Content

Orellis: Local Codebase AI Assistant

Orellis is a Model Context Protocol (MCP) server that lets engineers query and explore their local Python codebase via natural language. It embeds code files, indexes them with FAISS, and runs a local LLM (Ollama) on CPU for secure, private onboarding assistance.
All inference is CPU-optimized and runs on your local machine; your codebase never leaves your laptop.


🚀 Features

  • Embeddings & Search: Uses sentence-transformers + FAISS to embed and retrieve the most relevant code chunks (see the sketch after this list).
  • Local LLM Inference: Runs your quantized code model (e.g. deepseek-coder:1.3b-instruct) locally via Ollama.
  • MCP Server: Exposes two tools over the MCP protocol (sketched after Prerequisites below):
    • ask_codebase(project_path, question): Query the codebase.
    • onboarding_walkthrough(project_path): Generate a guided overview of every file.
  • Async Support: Parallelized LLM calls for fast walkthroughs.
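
For illustration, here is a minimal sketch of the embed-and-retrieve step, assuming the sentence-transformers and FAISS libraries named above; the embedding model, the whole-file chunking, and the function names are hypothetical, not Orellis's actual code:

    # Hypothetical embed-and-retrieve sketch (not Orellis's actual code).
    from pathlib import Path

    import faiss
    from sentence_transformers import SentenceTransformer

    encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

    def build_index(project_path: str):
        """Embed each .py file (real code would chunk large files) and index with FAISS."""
        chunks = [p.read_text(errors="ignore") for p in Path(project_path).rglob("*.py")]
        vectors = encoder.encode(chunks, convert_to_numpy=True)
        index = faiss.IndexFlatL2(vectors.shape[1])  # exact L2 nearest-neighbor search
        index.add(vectors)
        return index, chunks

    def retrieve(index, chunks, question: str, k: int = 5):
        """Return the k code chunks whose embeddings are closest to the question."""
        query = encoder.encode([question], convert_to_numpy=True)
        _, ids = index.search(query, k)
        return [chunks[i] for i in ids[0]]

The retrieved chunks are what the server can feed into the local LLM's prompt, which is what keeps a small CPU-hosted model useful on a large codebase.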

📋 Prerequisites

  1. Python 3.10+
  2. Ollama CLI (to host the model locally):
    # macOS/Linux
    brew install ollama
    # Windows
    choco install ollama
    
  3. Ollama Model: Pull your code model once:
    ollama pull deepseek-coder:1.3b-instruct
    
  4. Python Dependencies:
    git clone <this-repo>
    cd <this-repo>/src
    python -m venv .venv
    source .venv/bin/activate    # or .venv\Scripts\activate on Windows
    pip install -r requirements.txt
    
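
With the dependencies installed, the shape of the server is easy to picture. Below is a hedged sketch of how server.py could expose the two tools through the MCP Python SDK's FastMCP class; the retrieve_chunks, ask_llm, list_python_files, and summarize helpers are hypothetical stand-ins, not the project's real functions:

    # server.py sketch (hypothetical; the helper functions are stand-ins).
    import asyncio

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("orellis")

    @mcp.tool()
    async def ask_codebase(project_path: str, question: str) -> str:
        """Answer a question from the most relevant chunks of the codebase."""
        chunks = retrieve_chunks(project_path, question)  # FAISS lookup (assumed helper)
        return await ask_llm(question, chunks)            # local Ollama call (assumed helper)

    @mcp.tool()
    async def onboarding_walkthrough(project_path: str) -> str:
        """Summarize every file, issuing the LLM calls concurrently."""
        files = list_python_files(project_path)           # assumed helper
        summaries = await asyncio.gather(*(summarize(f) for f in files))
        return "\n\n".join(summaries)

    if __name__ == "__main__":
        mcp.run()

The asyncio.gather call is where the Async Support feature pays off: each file's summary request is issued concurrently rather than one at a time.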

💻 Running with Claude Desktop

  1. Install Claude Desktop

  2. Set Up the Configuration

    • Generate the config file:
    mcp install server.py
    
    • Then edit the contents of the config file:
    • Open Claude Desktop and go to File → Settings → Developer.
    • Click Edit Config.
    • Open the claude_desktop_config.json file.
    • Paste this in:
    {
      "mcpServers": {
        "orellis": {
          "command": "<full path to the project you cloned>\\.venv\\Scripts\\python.exe",
          "args": [
            "<full path to the project you cloned>\\src\\server.py"
          ]
        }
      }
    }
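    • On macOS/Linux, the equivalent paths are <full path to the project you cloned>/.venv/bin/python for "command" and forward-slash paths in "args".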
  3. Interact

    • Restart Claude Desktop completely.

    • Wait until the Orellis server and its tools appear in Claude Desktop.

    • Provide the full path of a Python project and ask a question about it, or ask for a walkthrough.

    • After you provide the path once, you don't need to provide it again unless you want to switch projects.


🔄 Changing the Model

Orellis uses a local Ollama-hosted model (deepseek-coder:1.3b-instruct) by default. To switch models:

  1. Pull or quantize a new model:
    ollama pull <model-name>
    
  2. Update llm.py:
    • Change MODEL to your new <model-name> (see the sketch below).
  3. Restart the server:
    • Restart Claude Desktop completely.
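
For orientation, here is a minimal sketch of what the MODEL constant in llm.py might look like, assuming the server talks to Ollama's standard HTTP API on its default port; the surrounding function is illustrative, not the file's actual contents:

    # llm.py sketch (illustrative; only the MODEL constant is referenced above).
    import requests

    MODEL = "deepseek-coder:1.3b-instruct"  # edit this line to switch models

    def ask_llm(prompt: str) -> str:
        """Send a prompt to the locally running Ollama server and return its reply."""
        resp = requests.post(
            "http://localhost:11434/api/generate",  # Ollama's default endpoint
            json={"model": MODEL, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

Whatever name you pass to ollama pull must match the MODEL string exactly, or Ollama will report that the model is not found.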
