
Claude LMStudio Bridge V2

@infinitimelesson 9 months ago
5 MIT
Free · Community
AI Systems
A bridge MCP server that allows Claude to communicate with locally running LLM models via LM Studio

Overview

What is Claude LMStudio Bridge V2?

Claude-LMStudio-Bridge_V2 is a Model Context Protocol (MCP) server that facilitates communication between Claude and locally running LLMs via LM Studio. It allows users to send prompts to these models and receive responses.

Use cases

Use cases include comparing responses from different models, utilizing specialized local models for specific tasks, and running queries without exceeding API quotas, making it ideal for developers and researchers.

How to use

To use Claude-LMStudio-Bridge_V2, start by cloning the repository and setting up a virtual environment. Install the required packages, start LM Studio with your preferred model, run the bridge server, and enable the MCP server in Claude’s interface to connect to the local bridge.

Key features

Key features include the ability to compare Claude’s responses with those of other models, access specialized local models for specific tasks, run queries even when your Claude API quota is limited, and keep sensitive queries entirely local.

Where to use

Claude-LMStudio-Bridge_V2 can be used in various fields such as AI research, natural language processing, and any application requiring interaction with local LLM models for enhanced performance and privacy.

Content

Claude-LMStudio-Bridge

A simple Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLMs via LM Studio.

Overview

This bridge enables Claude to send prompts to locally running models in LM Studio and receive their responses. This can be useful for:

  • Comparing Claude’s responses with other models
  • Accessing specialized local models for specific tasks
  • Running queries even when you have limited Claude API quota
  • Keeping sensitive queries entirely local

Prerequisites

  • Python 3 with pip
  • LM Studio installed, with at least one model downloaded
  • A Claude client with MCP support

Installation

  1. Clone this repository:

    git clone https://github.com/infinitimeless/Claude-LMStudio-Bridge_V2.git
    cd Claude-LMStudio-Bridge_V2
    
  2. Create a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install the required packages (choose one method):

    Using requirements.txt:

    pip install -r requirements.txt
    

    Or directly install dependencies:

    pip install requests "mcp[cli]" openai anthropic-mcp
    

Usage

  1. Start LM Studio and load your preferred model.

  2. Ensure LM Studio’s local server is running (on port 1234 by default).

  3. Run the bridge server:

    python lmstudio_bridge.py
    
  4. In Claude’s interface, enable the MCP server and point it to your locally running bridge (see the example configuration after this list).


  5. You can now use the following MCP tools in your conversation with Claude:

    • health_check: Check if LM Studio API is accessible
    • list_models: Get a list of available models in LM Studio
    • get_current_model: Check which model is currently loaded
    • chat_completion: Send a prompt to the current model
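
How you register the bridge (step 4 above) depends on your Claude client. As a hypothetical illustration for Claude Desktop, an mcpServers entry in claude_desktop_config.json might look like the following; the server name and paths are placeholders, and the command should point at the Python interpreter from your virtual environment:

    {
      "mcpServers": {
        "lmstudio-bridge": {
          "command": "/path/to/Claude-LMStudio-Bridge_V2/venv/bin/python",
          "args": ["/path/to/Claude-LMStudio-Bridge_V2/lmstudio_bridge.py"]
        }
      }
    }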

Example

Once connected, you can ask Claude to use the local model:

Claude, please use the LM Studio bridge to ask the local model: "What's your opinion on quantum computing?"

Claude will use the chat_completion tool to send the query to your local model and display the response.
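
Under the hood, the bridge talks to LM Studio’s OpenAI-compatible API. The sketch below approximates what a chat_completion call forwards to LM Studio; the field names follow the OpenAI chat completions format, and the exact payload built by lmstudio_bridge.py may differ:

    import requests

    LMSTUDIO_API_BASE = "http://localhost:1234/v1"  # default endpoint (see Configuration below)

    # Roughly what the chat_completion tool sends on to the local model
    response = requests.post(
        f"{LMSTUDIO_API_BASE}/chat/completions",
        json={
            "model": "local-model",  # LM Studio serves whichever model is currently loaded
            "messages": [
                {"role": "user", "content": "What's your opinion on quantum computing?"}
            ],
            "temperature": 0.7,
        },
        timeout=120,
    )
    print(response.json()["choices"][0]["message"]["content"])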

Configuration

By default, the bridge connects to LM Studio at http://localhost:1234/v1. If your LM Studio instance is running on a different port, modify the LMSTUDIO_API_BASE variable in lmstudio_bridge.py.
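
For example (the port number below is only an illustration):

    # In lmstudio_bridge.py: point the bridge at a non-default LM Studio server
    LMSTUDIO_API_BASE = "http://localhost:5000/v1"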

Troubleshooting

If you encounter issues with dependencies, try installing them directly:

pip install requests "mcp[cli]" openai anthropic-mcp
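
If the bridge starts but tools such as health_check or list_models report errors, you can verify that LM Studio’s server is reachable by querying its models endpoint directly. A minimal sketch using the requests dependency, assuming the default port:

    import requests

    # Should print a JSON listing of models if LM Studio's local server is up
    resp = requests.get("http://localhost:1234/v1/models", timeout=5)
    print(resp.status_code, resp.json())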

For detailed installation instructions and troubleshooting, see the Installation Guide.

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
