
mcp-cli-container

@ai-local · 19 days ago · MIT · Free · Community · AI Systems
A Containerfile to run mcp-cli (https://github.com/chrishayuk/mcp-cli)

Overview

What is mcp-cli-container

mcp-cli-container is a Containerfile for running the mcp-cli tool in a containerized environment, using Podman, for testing purposes. It is intended for NVIDIA GPUs, with the NVIDIA Container Toolkit installed and configured for Podman.

Use cases

Use cases for mcp-cli-container include testing machine learning models in a controlled environment, running multiple server configurations for different applications, and facilitating development workflows that require GPU resources.

How to use

To use mcp-cli-container, clone the repository with 'git clone https://github.com/ai-local/mcp-cli-container.git'. If running rootless Podman, enable the container_use_devices SELinux boolean. Build the container with 'podman build . --tag mcp-cli --device nvidia.com/gpu=all' and run it with 'podman run --device nvidia.com/gpu=all --name mcp-cli -it mcp-cli /bin/bash'. Inside the container, configure your MCP servers in the server_config.json file and run the mcp-cli command.
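
Condensed into a single shell session, the steps above look like this (all commands are taken from the walkthrough below; run them on a host with an NVIDIA GPU):

# On the host: clone, build, and start the container
git clone https://github.com/ai-local/mcp-cli-container.git
cd mcp-cli-container
sudo setsebool -P container_use_devices=true   # rootless Podman only
podman build . --tag mcp-cli --device nvidia.com/gpu=all
podman run --device nvidia.com/gpu=all --name mcp-cli -it mcp-cli /bin/bash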

Key features

Key features of mcp-cli-container include support for NVIDIA GPUs, the ability to run multiple MCP servers, and customizable server configurations through the server_config.json file.

Where to use

mcp-cli-container is suitable for use in environments that require testing and running machine learning models, particularly those that leverage GPU acceleration and containerization technologies.

Content

Video walk through: https://www.youtube.com/watch?v=OxX4NZO6j6I

This is an example of how to run mcp-cli (https://github.com/chrishayuk/mcp-cli) within a container for testing, using Podman.

It is intended for use with an NVIDIA GPU, with the NVIDIA Container Toolkit installed and configured for use with Podman.
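
If the toolkit is installed but not yet hooked up to Podman, NVIDIA's Container Toolkit documentation describes generating a CDI spec and running a quick sanity check; it looks roughly like the following (the output path and test image come from those docs and may differ on your system):

sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml
nvidia-ctk cdi list   # should show nvidia.com/gpu entries
podman run --rm --device nvidia.com/gpu=all --security-opt=label=disable ubuntu nvidia-smi -L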

Clone this repository:
git clone https://github.com/ai-local/mcp-cli-container.git

If running rootless Podman, enable the container_use_devices SELinux boolean:
sudo setsebool -P container_use_devices=true
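
You can confirm the boolean took effect with getsebool:

getsebool container_use_devices   # expect: container_use_devices --> on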

Build container with:
podman build . --tag mcp-cli --device nvidia.com/gpu=all

Run container with:
podman run --device nvidia.com/gpu=all --name mcp-cli -it mcp-cli /bin/bash
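
Because the run command names the container mcp-cli, a stopped container can be re-entered later without recreating it (standard Podman behavior, not specific to this repository):

podman start -ai mcp-cli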

Within the container, run the following to start Ollama and download a model of your choice:

ollama serve >/dev/null 2>&1  &
ollama list
ollama pull granite3.2:8b-instruct-q8_0
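
Optionally, sanity-check that the model responds before wiring up MCP servers (any short prompt will do):

ollama run granite3.2:8b-instruct-q8_0 "Say hello"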

Within the container, configure MCP servers in the server_config.json file. In this example, I'll configure MCP servers from https://github.com/MladenSU/cli-mcp-server and https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem.

Note: the MCP servers below are configured to provide full access to the root directory and the ability to run any command with any flag. Update this configuration if you would like to restrict what the MCP servers can access:

    "cli-mcp-server": {
      "command": "uvx",
      "args": [
        "cli-mcp-server"
      ],
      "env": {
        "ALLOWED_DIR": "/",
        "ALLOWED_COMMANDS": "all",
        "ALLOWED_FLAGS": "all",
        "MAX_COMMAND_LENGTH": "1024",
        "COMMAND_TIMEOUT": "30"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/"
      ]
    }
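
These entries belong inside the top-level mcpServers object of server_config.json. Assuming the standard layout that mcp-cli reads (an mcpServers map keyed by server name), the complete file would look roughly like this:

{
  "mcpServers": {
    "cli-mcp-server": {
      "command": "uvx",
      "args": ["cli-mcp-server"],
      "env": {
        "ALLOWED_DIR": "/",
        "ALLOWED_COMMANDS": "all",
        "ALLOWED_FLAGS": "all",
        "MAX_COMMAND_LENGTH": "1024",
        "COMMAND_TIMEOUT": "30"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/"]
    }
  }
}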

Run mcp-cli and specify the server you would like to use (run it twice if it doesn't recognize the MCP server the first time). For the cli-mcp-server example:
mcp-cli chat --server cli-mcp-server --provider ollama --model granite3.2:8b-instruct-q8_0

For the filesystem example (again, run twice if needed):
mcp-cli chat --server filesystem --provider ollama --model granite3.2:8b-instruct-q8_0
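
When finished, exit the container shell; the container can then be stopped and removed from the host with standard Podman commands:

podman stop mcp-cli
podman rm mcp-cli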
