
MCP Server MariaDB Vector

@DavidRamosSal · 17 days ago
LGPL-2.1 · Free · Community · AI Systems
MCP server for MariaDB

Overview

What is mcp-server-mariadb-vector?

mcp-server-mariadb-vector is an MCP server that enables LLM agents to interact with a MariaDB database with vector support, giving users a natural language interface for storing and querying their data.

Use cases

Use cases include providing contextual information from knowledge bases during conversations with LLM agents, storing and querying past interactions with LLM agents, and enhancing applications that require semantic understanding of data.

How to use

To use mcp-server-mariadb-vector, ensure you have a running MariaDB instance with vector support (version 11.7 or higher). Clone the repository and use the provided MCP tools to create and manage vector stores, add documents, and perform semantic searches.

Key features

Key features include vector store management (create, delete, and list vector stores), document management (add documents with metadata and perform semantic searches), and embedding provider capabilities using OpenAI’s embedding models.

Where to use

The mcp-server-mariadb-vector can be used in various fields such as data science, natural language processing, and any application requiring efficient data storage and retrieval through a conversational interface.

Content

mcp-server-mariadb-vector

The MariaDB Vector MCP server provides tools that LLM agents can use to interact with a MariaDB database with vector support, giving users a natural language interface to store and query their data. Thanks to the Model Context Protocol (MCP), the server is compatible with any MCP client, including those provided by applications like Claude Desktop and Cursor/Windsurf, as well as LLM agent frameworks like LangGraph and PydanticAI.

Using the MariaDB Vector MCP server, users can, for example:

  • Provide context from a knowledge base to their conversations with LLM agents
  • Store and query their conversations with LLM agents

Features

  • Vector Store Management
    • Create and delete vector stores in a MariaDB database
    • List all vector stores in a MariaDB database
  • Document Management
    • Add documents with optional metadata to a vector store
    • Query a vector store using semantic search
  • Embedding Provider
    • Use OpenAI’s embedding models to embed documents

MCP Tools

  • mariadb_create_vector_store: Create a vector store in a MariaDB database
  • mariadb_delete_vector_store: Delete a vector store in a MariaDB database
  • mariadb_list_vector_stores: List all vector stores in a MariaDB database
  • mariadb_insert_documents: Add documents with optional metadata to a vector store
  • mariadb_search_vector_store: Query a vector store using semantic search
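
As a minimal sketch of how an MCP client can invoke these tools programmatically, the following Python snippet uses the official MCP Python SDK over stdio. The tool argument names (vector_store_name, query) are illustrative assumptions, not taken from this server's documentation; inspect the schemas returned by list_tools() for the actual ones.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio, mirroring the uv command from the Setup section.
server = StdioServerParameters(
    command="uv",
    args=[
        "run",
        "--directory", "path/to/mcp-server-mariadb-vector/",
        "--env-file", "path/to/mcp-server-mariadb-vector/.env",
        "mcp_server_mariadb_vector",
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Should list the five mariadb_* tools above.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Semantic search against an existing store; argument names are assumed.
            result = await session.call_tool(
                "mariadb_search_vector_store",
                arguments={"vector_store_name": "my_store", "query": "What is MariaDB?"},
            )
            print(result.content)

asyncio.run(main())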

Setup

Note: From here on, it is assumed that you have a running MariaDB instance with vector support (version 11.7 or higher). If you don’t have one, you can quickly spin up a MariaDB instance using Docker:

docker run -p 3306:3306 --name mariadb-instance -e MARIADB_ROOT_PASSWORD=password -e MARIADB_DATABASE=database_name mariadb:11.7
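
To verify that the instance is up and reports a vector-capable version (11.7 or higher), you can run a quick check against the container started by the command above:

docker exec mariadb-instance mariadb -uroot -ppassword -e "SELECT VERSION();"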

First clone the repository:

git clone https://github.com/DavidRamosSal/mcp-server-mariadb-vector.git

There are two ways to run the MariaDB Vector MCP server: as a Python package using uv or as a Docker container built from the provided Dockerfile.

Requirements for running the server using uv

  • uv installed on your system

Requirements for running the server as a Docker container

  • Docker installed on your system

Configuration

The server needs to be configured with the following environment variables:

Name                Description                                Default Value
MARIADB_HOST        Host of the running MariaDB database       127.0.0.1
MARIADB_PORT        Port of the running MariaDB database       3306
MARIADB_USER        User of the running MariaDB database       None
MARIADB_PASSWORD    Password of the running MariaDB database   None
MARIADB_DATABASE    Name of the running MariaDB database       None
EMBEDDING_PROVIDER  Provider of the embedding models           openai
EMBEDDING_MODEL     Model of the embedding provider            text-embedding-3-small
OPENAI_API_KEY      API key for OpenAI’s platform              None
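
For reference, a minimal .env file (used by the uv workflow below) might look like this; all values are placeholders for a local setup:

MARIADB_HOST=127.0.0.1
MARIADB_PORT=3306
MARIADB_USER=root
MARIADB_PASSWORD=password
MARIADB_DATABASE=database_name
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=your-openai-api-key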

Running the server using uv

Using uv, add a .env file containing the environment variables to the root of the cloned repository, then run the server with the following command:

uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector

The dependencies will be installed automatically. An optional --transport argument can be added to specify the transport protocol to use. The default value is stdio.
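
For example, to serve over SSE instead of stdio (assuming the flag accepts the value sse, as the SSE-based Docker setup below suggests):

uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector --transport sse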

Running the server as a Docker container

Build the Docker container from the root directory of the cloned repository by running the following command:

docker build -t mcp-server-mariadb-vector .

Then run the container (replace with your own configuration):

docker run -p 8000:8000 \
  --add-host host.docker.internal:host-gateway \
  -e MARIADB_HOST="host.docker.internal" \
  -e MARIADB_PORT="port" \
  -e MARIADB_USER="user" \
  -e MARIADB_PASSWORD="password" \
  -e MARIADB_DATABASE="database" \
  -e EMBEDDING_PROVIDER="openai" \
  -e EMBEDDING_MODEL="embedding-model" \
  -e OPENAI_API_KEY="your-openai-api-key" \
  mcp-server-mariadb-vector

The server will be available at http://localhost:8000/sse, using the SSE transport protocol. Make sure to leave MARIADB_HOST set to host.docker.internal if you are running the MariaDB database as a Docker container on your host machine.
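
To sanity-check a running SSE server, you can hit the endpoint with curl (-N disables buffering so the event stream stays open):

curl -N http://localhost:8000/sse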

Integration with Claude Desktop | Cursor | Windsurf

Claude Desktop, Cursor, and Windsurf can run and connect to the server automatically using the stdio transport. To do so, add the following to your configuration file (claude_desktop_config.json for Claude Desktop, mcp.json for Cursor, or mcp_config.json for Windsurf):

{
  "mcpServers": {
    "mariadb-vector": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "path/to/mcp-server-mariadb-vector/",
        "--env-file",
        "path/to/mcp-server-mariadb-vector/.env",
        "mcp-server-mariadb-vector"
      ]
    }
  }
}

Alternatively, Cursor and Windsurf can connect to an already running server on your host machine (e.g. if you are running the server as a Docker container) using the SSE transport. To do so, add an entry like the following to the corresponding configuration file; this is the typical shape of an SSE server entry, so adjust the URL if your server listens elsewhere:
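
{
  "mcpServers": {
    "mariadb-vector": {
      "url": "http://localhost:8000/sse"
    }
  }
}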
