
Kafka MCP Server

@CefBoud · 10 months ago · MIT License

Kafka MCP Server integrates Apache Kafka with natural language interaction.

Overview

What is kafka-mcp-server?

The kafka-mcp-server is a Model Context Protocol (MCP) server that integrates with Apache Kafka, allowing users to interact with a Kafka cluster through natural language using large language models (LLMs).

Use cases

Use cases for kafka-mcp-server include simplifying data operations for non-technical users, enabling natural language queries for data retrieval, and enhancing the usability of Kafka in applications that require conversational interfaces.

How to use

To use kafka-mcp-server, you need access to a Kafka cluster. You can run the server using Docker or build it locally with Golang. For Docker, pull the Kafka image and run it. For local builds, clone the repository and compile the server using Go.

Key features

Key features of kafka-mcp-server include the ability to list topics, create topics, consume messages, and produce messages. Additional tools may include listing consumer groups and inspecting lag, resetting consumer group offsets, and retrieving topic offsets.

Where to use

kafka-mcp-server can be used in various fields such as data engineering, real-time data processing, and applications requiring natural language interaction with data streams.

Content

Kafka MCP Server

The Kafka MCP Server is a Model Context Protocol (MCP) server that provides integration with Apache Kafka, enabling interaction with a Kafka cluster using natural language (via LLMs).


Prerequisites

You will either need Docker or Golang to run the MCP server locally.

Getting Started

You need access to a Kafka cluster. Following the Kafka quickstart doc, you can start one with Docker:

 docker pull apache/kafka:4.0.0
 docker run -p 9092:9092 apache/kafka:4.0.0

Kafka is now available on localhost:9092.

Usage with Claude Desktop

Docker:

{
  "mcpServers": {
    "kafka": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KAFKA_MCP_BOOTSTRAP_SERVERS",
        "ghcr.io/cefboud/kafka-mcp-server"
      ],
      "env": {
        "KAFKA_MCP_BOOTSTRAP_SERVERS": "localhost:9092"
      }
    }
  }
}

Building locally

cd <workdir>
git clone https://github.com/CefBoud/kafka-mcp-server.git
cd kafka-mcp-server
go build -o kafka-mcp-server  cmd/kafka-mcp-server/main.go 
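Once built, the binary can be wired into Claude Desktop the same way as the Docker image. A minimal sketch (the binary path is a placeholder for wherever your build landed):

```json
{
  "mcpServers": {
    "kafka": {
      "command": "/path/to/kafka-mcp-server",
      "args": ["--bootstrap-servers", "localhost:9092"]
    }
  }
}
```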

Options:

      --bootstrap-servers string   Comma-separated list of the Kafka servers to connect to.
      --enable-command-logging     When enabled, the server will log all command requests and responses to the log file
      --log-file string            Path to log file
      --read-only                  Restrict the server to read-only operations

All options can also be passed as environment variables: uppercase the flag name, replace hyphens with underscores, and add the KAFKA_MCP_ prefix, e.g., --bootstrap-servers becomes KAFKA_MCP_BOOTSTRAP_SERVERS (as used in the Claude Desktop config above).

Available MCP Tools

  • [x] List topics
  • [x] Create topics
  • [x] Consume messages
  • [x] Produce messages
  • [x] Describe the cluster (list brokers and the controller)
  • [x] List consumer groups and their lag
  • [x] Get a topic’s earliest and latest offsets (GetOffsetShell)
  • [ ] Reset consumer group offsets
  • [ ] Kafka Connect ??
  • [ ] Schema Registry ??
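For context, an MCP client invokes tools like these through JSON-RPC tools/call requests. A hypothetical sketch (the tool name consume_messages and its arguments are illustrative assumptions, not the server's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consume_messages",
    "arguments": { "topic": "orders", "limit": 10 }
  }
}
```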

🔀 MultiplexTool


Running many sequential tools, especially when each depends on the output of the previous, can be tedious and time-consuming, since it requires multiple round-trips between the client and server.

MultiplexTool solves this by allowing the client to batch a list of tool calls into a single request, executing them in order. It supports dynamic dependencies between tools by letting you reference earlier outputs using prompt-based placeholders.

If a tool input depends on a previous result, the client uses the PROMPT_ARGUMENT: format to generate that input dynamically via a prompt to an LLM (currently only Gemini is supported).
Example:
"userId": "PROMPT_ARGUMENT: the ID of the created user"
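Putting this together, a multiplexed batch might look something like the following sketch (the wrapper shape and the tool and argument names are illustrative assumptions; only the PROMPT_ARGUMENT: convention comes from the description above):

```json
{
  "tool_calls": [
    { "name": "create_topic", "arguments": { "topic": "orders" } },
    {
      "name": "produce_message",
      "arguments": {
        "topic": "PROMPT_ARGUMENT: the name of the topic created in the previous step",
        "value": "hello"
      }
    }
  ]
}
```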

CLI Flags:

  • --enable-multiplex: Enables multiplexing of tool calls.
  • --multiplex-model: Specifies the model (e.g. gemini) used to infer PROMPT_ARGUMENTs. Requires GEMINI_API_KEY env variable.
{
  "mcpServers": {
    "kafka": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KAFKA_MCP_BOOTSTRAP_SERVERS",
        "ghcr.io/cefboud/kafka-mcp-server",
        "--enable-multiplex",
        "--multiplex-model",
        "gemini"
      ],
      "env": {
        "KAFKA_MCP_BOOTSTRAP_SERVERS": "localhost:9092",
        "GEMINI_API_KEY": "....."
      }
    }
  }
}
