- Explore MCP Servers
- openwrt-mcp-server
Openwrt Mcp Server
What is Openwrt Mcp Server
openwrt-mcp-server is a lightweight and extensible MCP (Model Context Protocol) server designed for OpenWrt-based embedded routers and devices, enabling two-way communication with external AI systems via MQTT and HTTP.
Use cases
Use cases include AI-powered home gateway monitoring, edge-managed device fleet context reporting, auto-recovery network policies via AI, and integration with orchestration pipelines like n8n and LangChain.
How to use
To use openwrt-mcp-server, install it on an OpenWrt device, configure the .toml settings for MQTT and HTTP, and utilize the secure API for communication with AI agents to query device context and execute commands.
Key features
Key features include performance and safety through Rust, support for MQTT and HTTP, compatibility with JSON-RPC 2.0, a modular architecture for extensibility, secure token-based authentication, and a low memory footprint suitable for embedded systems.
Where to use
openwrt-mcp-server can be used in various fields such as home automation, edge computing, IoT device management, and AI integration for real-time monitoring and orchestration.
Clients Supporting MCP
The following are the main client software that supports the Model Context Protocol. Click the link to visit the official website for more information.
Overview
What is Openwrt Mcp Server
openwrt-mcp-server is a lightweight and extensible MCP (Model Context Protocol) server designed for OpenWrt-based embedded routers and devices, enabling two-way communication with external AI systems via MQTT and HTTP.
Use cases
Use cases include AI-powered home gateway monitoring, edge-managed device fleet context reporting, auto-recovery network policies via AI, and integration with orchestration pipelines like n8n and LangChain.
How to use
To use openwrt-mcp-server, install it on an OpenWrt device, configure the .toml settings for MQTT and HTTP, and utilize the secure API for communication with AI agents to query device context and execute commands.
Key features
Key features include performance and safety through Rust, support for MQTT and HTTP, compatibility with JSON-RPC 2.0, a modular architecture for extensibility, secure token-based authentication, and a low memory footprint suitable for embedded systems.
Where to use
openwrt-mcp-server can be used in various fields such as home automation, edge computing, IoT device management, and AI integration for real-time monitoring and orchestration.
Clients Supporting MCP
The following are the main client software that supports the Model Context Protocol. Click the link to visit the official website for more information.
Content
openwrt-mcp-server
openwrt-mcp-server is a lightweight and extensible MCP (Model Context Protocol) server designed to run on OpenWrt-based embedded routers and devices. It enables two-way communication between the device and external AI systems using MQTT and HTTP, with JSON-RPC 2.0 as the message format.
This server is intended to provide a secure and structured interface for AI agents to:
- Query live device context (network, Wi-Fi, system metrics)
- Execute system-level commands remotely
- Support real-time command-response and context streaming
✨ Features
- Built in Rust for performance and safety
- Supports MQTT (via
rumqttc) and HTTP (viawarp) - Compatible with JSON-RPC 2.0 for AI model integration
- Modular architecture for future extensibility
- Full TOML configuration with all fields actually used in code (see below)
- Secure HTTP API with token-based authentication (via
x-api-tokenheader) - All code comments and documentation are in English for international collaboration
- Compiles cleanly with no warnings (all config fields are used)
- Low memory footprint, suitable for embedded OpenWrt targets
🌎 Use Cases
- AI-powered home gateway monitoring and orchestration
- Edge-managed device fleet context reporting
- Auto-recovery and self-healing network policies via AI
- Integration with LLMs and orchestration pipelines (e.g., n8n, LangChain)
🛠️ Components
context/collector.rs: Gathers runtime status from OpenWrt (ubus, uci, ifstatus)mqtt/handler.rs: Handles MQTT connection, authentication, topic subscription (using all config fields), and JSON-RPC command dispatch/responsehttp/routes.rs: RESTful API for status and command entry, with token authentication required for all endpointsexecutor/command.rs: Executes validated system-level instructionsconfig/mod.rs: Loads and validates full.tomlconfiguration, including all MQTT/HTTP fields- All modules are documented in English
🛡️ Protocol
Follows JSON-RPC 2.0. See REQUIREMENTS.md for full message schemas.
🔧 Building
cargo build --release
Cross-compilation for OpenWrt (musl) recommended for deployment.
🌐 Configuration
Example config.toml (all fields are required and used):
[mqtt]
broker = "mqtts://iot.example.com:8883"
client_id = "openwrt-one"
username = "mcp-user"
password = "mcp-pass"
topic_prefix = "mcp/device/openwrt-one"
[http]
enable = true
listen_addr = "0.0.0.0"
port = 8080
token = "your-api-token"
- All configuration fields are loaded and used in the codebase.
- MQTT uses client_id, username, password, and topic_prefix for connection and topic management.
- HTTP server uses enable, listen_addr, port, and token for secure API access.
🚀 Roadmap
- [x] Initial MQTT + HTTP dual-protocol support
- [x] Full TOML configuration with all fields used in code
- [x] JSON-RPC 2.0 command and context schema (dispatch and response logic in MQTT/HTTP)
- [x] Secure HTTP API with token-based authentication
- [x] All code comments and documentation in English
- [x] Compiles cleanly with no warnings
- [ ] Context collector with UCI/UBUS/ifstatus integration
- [ ] Device capability introspection (
device.describe) - [ ] WebSocket transport layer for real-time control
- [ ] Command allowlisting and sandboxing
- [ ] Plugin-style extensibility for new command modules
- [ ] Streaming telemetry metrics channel (e.g.,
/metrics) - [ ] CLI interface for testing/debugging commands
- [ ] Optional gRPC support for external orchestrators
- [ ] JSON Schema-based validation for input/output
- [ ] OTA update interface (optional integration)
- [ ] Context delta compression for low-bandwidth MQTT
- [ ] Persistent log and audit tracking via syslog
- [ ] Secure boot detection and system integrity reporting
- [ ] Multilingual context formatting for LLM compatibility
- [ ] Scheduler support for recurring commands
🏆 Implementation Note
This project was implemented and refactored by Cline, an advanced AI software engineer powered by the OpenAI GPT-4 Turbo model.
All code, configuration, and documentation improvements—including full config usage, secure API, and clean compilation—were designed and delivered by Cline (executed by OpenAI GPT-4 Turbo).
If you are reading this README, you are witnessing the power and precision of AI-driven software engineering, made possible by the GPT-4 Turbo model.
Dev Tools Supporting MCP
The following are the main code editors that support the Model Context Protocol. Click the link to visit the official website for more information.










