MCP Explorer

GrafanaLLM-AlertAnalyzer

@sunwupark · a year ago
1 · MIT
Free · Community
AI Systems
GrafanaLLM-AlertAnalyzer is an intelligent platform that combines the power of Grafana's Model Context Protocol (MCP) server with Large Language Models (LLMs) to revolutionize system alert analysis.

Overview

What is GrafanaLLM-AlertAnalyzer?

GrafanaLLM-AlertAnalyzer is an intelligent platform that integrates Grafana’s Model Context Protocol (MCP) server with Large Language Models (LLMs) to enhance system alert analysis.

Use cases

Use cases include automated troubleshooting of system alerts, proactive monitoring of system health, and generating actionable insights for IT teams.

How to use

To use GrafanaLLM-AlertAnalyzer, install the required components, set up Grafana MCP, and configure the system with your Grafana and OpenAI API keys. Once set up, it automatically processes alerts and provides detailed analyses.

Key features

Key features include AI-powered analysis, Grafana MCP integration, automated root cause analysis, intelligent solution recommendations, email notifications, and a modular architecture for extensibility.

Where to use

GrafanaLLM-AlertAnalyzer can be used in IT operations, system monitoring, and any environment where alert analysis and problem-solving are critical.

Content

GrafanaLLM-AlertAnalyzer

AI-Powered Alert Analysis System Using Grafana MCP and Large Language Models

Key Features · Installation · Usage · Components · Contributing · License

Overview

GrafanaLLM-AlertAnalyzer is an intelligent platform that combines the power of Grafana’s Model Context Protocol (MCP) server with Large Language Models (LLMs) to revolutionize system alert analysis. This tool automatically processes incoming alerts from Grafana, analyzes metrics data via MCP, and uses AI to generate comprehensive problem analyses with actionable solutions.

Key Features

  • AI-Powered Analysis: Leverages OpenAI’s LLMs to interpret alert data and provide human-like reasoning about system issues
  • Grafana MCP Integration: Directly accesses Grafana metrics and data sources through the Model Context Protocol
  • Automated Root Cause Analysis: Identifies underlying issues beyond surface-level symptoms
  • Intelligent Solution Recommendation: Suggests specific technical actions based on AI analysis of metrics data
  • Email Notifications: Sends detailed analysis reports via email with formatted problem, cause, and solution sections
  • Modular Architecture: Easily extensible design for adding new features or supporting additional monitoring systems

Installation

Requirements

  • Python 3.9+
  • Docker (optional)
  • Grafana instance with API key
  • OpenAI API key
  • Grafana MCP binary files

Grafana MCP Setup

You need to download and install the appropriate Grafana MCP binary files for your architecture:

  1. Visit the Grafana MCP releases page: https://github.com/grafana/mcp-grafana/releases
  2. Download the binary package that matches your system architecture
  3. Extract the files to your project directory (recommended: app/bin/mcp-grafana/)
  4. Make the binary executable with chmod +x
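The steps above can be sanity-checked from Python before starting the service. A minimal sketch, assuming the recommended location from step 3 (the helper names and the exact binary filename are illustrative, not part of the project):

```python
import os
import stat
from pathlib import Path

def check_mcp_binary(path: str) -> bool:
    """Return True if the Grafana MCP binary exists and is executable."""
    p = Path(path)
    if not p.is_file():
        return False
    return os.access(p, os.X_OK)

def make_executable(path: str) -> None:
    """Equivalent of `chmod +x` for the downloaded binary."""
    p = Path(path)
    p.chmod(p.stat().st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

if __name__ == "__main__":
    # Hypothetical binary path based on the recommended extract location
    binary = "app/bin/mcp-grafana/mcp-grafana"
    if not check_mcp_binary(binary):
        print(f"mcp-grafana not found or not executable at {binary}")
```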

Quick Start

# Clone the repository
git clone https://github.com/your-name/GrafanaLLM-AlertAnalyzer.git
cd GrafanaLLM-AlertAnalyzer

# Set up development environment
make setup

# Configure environment variables
cp .env.example .env
# Edit the .env file with your Grafana and OpenAI API keys

# Run the application
make run

Docker Deployment

# Build Docker image
make docker-build

# Run Docker container
make docker-run

Usage

Configuring Grafana Webhook

  1. Go to ‘Alerting’ > ‘Contact points’ in Grafana admin
  2. Click ‘Add contact point’
  3. Select ‘Webhook’ type
  4. Enter http://your-host:8000/alert in the URL field
  5. Set HTTP method to ‘POST’
  6. Save

API Endpoints

  • POST /alert - Receive alert data from Grafana and trigger AI analysis
  • GET /health - Check service status
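The project’s API server is FastAPI-based; as a framework-free illustration of these two endpoints, here is a minimal stand-in using only the Python standard library (the handler names and response shapes are hypothetical, not the project’s actual API):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_health() -> dict:
    """GET /health - report service status."""
    return {"status": "ok"}

def handle_alert(payload: dict) -> dict:
    """POST /alert - acknowledge a Grafana webhook payload.
    In the real service this is where the LLM analysis is triggered."""
    alerts = payload.get("alerts", [])
    return {"received": len(alerts), "status": "queued"}

class Handler(BaseHTTPRequestHandler):
    def _send(self, body: dict, code: int = 200) -> None:
        data = json.dumps(body).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def do_GET(self):
        if self.path == "/health":
            self._send(handle_health())
        else:
            self._send({"error": "not found"}, 404)

    def do_POST(self):
        if self.path == "/alert":
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            self._send(handle_alert(payload))
        else:
            self._send({"error": "not found"}, 404)

# To serve on the port used in the webhook URL above:
#     HTTPServer(("0.0.0.0", 8000), Handler).serve_forever()
```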

Email Notification Setup

Configure the following settings in your .env file to enable email notifications with the AI analysis results:

SMTP_SERVER=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=your-username@example.com
SMTP_PASSWORD=your-password
EMAIL_RECIPIENTS=recipient1@example.com,recipient2@example.com
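With those settings in place, the notification service can assemble and send the formatted report. A sketch using Python’s standard `email` and `smtplib` modules (the function names and the exact problem/cause/solution layout are illustrative; the project’s actual `notification.py` may differ):

```python
import os
import smtplib
from email.message import EmailMessage

def build_report(alert_name: str, problem: str, cause: str, solution: str,
                 sender: str, recipients: list[str]) -> EmailMessage:
    """Format an AI analysis into problem / cause / solution sections."""
    msg = EmailMessage()
    msg["Subject"] = f"[GrafanaLLM-AlertAnalyzer] Analysis: {alert_name}"
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg.set_content(
        f"Problem:\n{problem}\n\n"
        f"Root Cause:\n{cause}\n\n"
        f"Suggested Solution:\n{solution}\n"
    )
    return msg

def send_report(msg: EmailMessage) -> None:
    """Send the report using the SMTP settings from the .env file."""
    server = os.environ["SMTP_SERVER"]
    port = int(os.environ.get("SMTP_PORT", 587))
    with smtplib.SMTP(server, port) as smtp:
        smtp.starttls()
        smtp.login(os.environ["SMTP_USERNAME"], os.environ["SMTP_PASSWORD"])
        smtp.send_message(msg)
```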

Components

GrafanaLLM-AlertAnalyzer consists of the following main components:

  • API Server: FastAPI-based web server that receives webhook notifications from Grafana
  • LLM Agent: LangChain-based ReAct agent that uses OpenAI models to analyze alerts
  • MCP Client: Grafana MCP adapter that allows direct querying of Grafana data sources
  • Notification Service: Email notification service that sends formatted analysis results
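How these components fit together can be sketched with the LLM agent and the notification service stubbed out as plain callables (all names here are hypothetical; the real agent is the LangChain ReAct agent described above):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Analysis:
    problem: str
    cause: str
    solution: str

class AlertAnalyzer:
    """Orchestrates the pipeline: agent -> analysis -> notification."""

    def __init__(self, agent: Callable[[dict], Analysis],
                 notify: Callable[[Analysis], None]) -> None:
        self.agent = agent
        self.notify = notify

    def process(self, alert: dict) -> Analysis:
        analysis = self.agent(alert)   # the LLM agent queries Grafana via MCP here
        self.notify(analysis)          # e.g. the email notification service
        return analysis

# Stub standing in for the LangChain ReAct agent
def stub_agent(alert: dict) -> Analysis:
    name = alert.get("labels", {}).get("alertname", "unknown")
    return Analysis(problem=f"{name} fired", cause="stubbed", solution="stubbed")

sent = []
analyzer = AlertAnalyzer(stub_agent, sent.append)
result = analyzer.process({"labels": {"alertname": "HighCPU"}})
```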

Development Guide

Project Structure

grafanallm-alertanalyzer/
│
├── app/                      # Application code
│   ├── api/                  # API endpoints
│   ├── conf/                 # Configs and settings
│   ├── services/             # Business logic
│   │   ├── agent.py          # LLM agent implementation
│   │   ├── alert_analyzer.py # Alert analysis orchestration
│   │   └── notification.py   # Email notification service
│   └── utils/                # Utility functions
│
├── tests/                    # Test code
├── Dockerfile                # Docker image definition
├── Makefile                  # Build and development scripts
├── main.py                   # Application entry point
├── mcp-grafana               # mcp-grafana binary file
└── requirements.txt          # Dependency packages

Development Commands

# Install dependencies and set up development environment
make setup

# Run tests
make test

# Check code with linters
make lint

# Format code
make format

# Run the application
make run

How It Works

  1. When a Grafana alert is triggered, it sends a webhook notification to the /alert endpoint
  2. The application extracts alert information and launches the LLM-powered analysis
  3. The LLM agent uses Grafana MCP to query relevant metrics and data sources
  4. The agent analyzes the data, identifies the problem, determines the root cause, and suggests solutions
  5. Results are formatted and sent via email to the configured recipients
  6. The API returns the analysis results to the caller
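Steps 1 and 2 above can be illustrated with a trimmed Grafana unified-alerting webhook payload and a small extraction helper (the sample payload is representative of Grafana’s webhook format; the project’s actual field handling may differ):

```python
import json

# Trimmed example of a Grafana unified-alerting webhook payload (step 1)
SAMPLE = json.loads("""
{
  "status": "firing",
  "alerts": [
    {
      "status": "firing",
      "labels": {"alertname": "HighCPU", "instance": "web-01"},
      "annotations": {"summary": "CPU usage above 90% for 5 minutes"}
    }
  ]
}
""")

def extract_alerts(payload: dict) -> list[dict]:
    """Step 2: pull out the fields the LLM agent needs for its analysis."""
    return [
        {
            "name": a.get("labels", {}).get("alertname", "unknown"),
            "instance": a.get("labels", {}).get("instance", ""),
            "summary": a.get("annotations", {}).get("summary", ""),
            "status": a.get("status", ""),
        }
        for a in payload.get("alerts", [])
    ]
```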

Contributing

GrafanaLLM-AlertAnalyzer is an open-source project, and contributions are welcome:

  1. Fork the project
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request
