Fabric RTI MCP
What is Fabric RTI MCP?
The Microsoft Fabric RTI MCP Server is a Model Context Protocol (MCP) server designed for seamless communication between AI agents and Microsoft Fabric Real-Time Intelligence (RTI) services. It allows AI agents to perform data querying and analysis using natural language commands that are translated into KQL operations.
Use cases
Users can leverage the Fabric RTI MCP Server to execute various analytics tasks, such as retrieving databases and tables from Eventhouse, sampling data from specific tables, and conducting trend analyses over historical data. It enables AI agents to categorize command executions based on risk levels, thereby enhancing data insights.
How to use
To get started, install the server from PyPI using pip or through VS Code commands. Configure the required settings in your settings.json
file to specify the server command and connect to the Kusto service URI and database. Ensure that relevant authentication methods are set up for secure access.
Key features
Key features include smart JSON communication, natural language command translation to KQL operations, intelligent parameter suggestions and auto-completion, and consistent error handling. The server also integrates with various authentication methods for seamless user experience.
Where to use
The Fabric RTI MCP Server is suitable for environments where AI agents interact with Microsoft Fabric’s data services. It is particularly useful in data analytics, business intelligence, and scenarios that require real-time data querying and analysis, offering capabilities in Eventhouse and Azure Data Explorer.
🎯 Overview
A Model Context Protocol (MCP) server implementation for Microsoft Fabric Real-Time Intelligence (RTI).
This server enables AI agents to interact with Fabric RTI services by providing tools through the MCP interface, allowing for seamless data querying and analysis capabilities.
[!NOTE]
This project is in Public Preview and implementation may significantly change prior to General Availability.
🔍 How It Works
The Fabric RTI MCP Server creates a seamless integration between AI agents and Fabric RTI services through:
- 🔄 Smart JSON communication that AI agents understand
- 🏗️ Natural language commands that get translated to KQL operations
- 💡 Intelligent parameter suggestions and auto-completion!
- ⚡ Consistent error handling that makes sense
✨ Supported Services
- Eventhouse (Kusto): Execute KQL queries against Microsoft Fabric RTI Eventhouse and Azure Data Explorer (ADX).
🚧 Coming soon
- Activator
- Eventstreams
- Other RTI items
🔍 Explore your data
- "Get databases in Eventhouse"
- "Sample 10 rows from table 'StormEvents' in Eventhouse"
- "What can you tell me about StormEvents data?"
- "Analyze StormEvents to come up with a trend analysis across the past 10 years of data"
- "Analyze the commands in the 'CommandExecution' table and categorize them as low/medium/high risk"
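As an illustration, the sampling and trend-analysis prompts above might be translated by the agent into KQL along these lines (a sketch only; the exact queries depend on the agent and on the `StormEvents` schema, where a `StartTime` column is assumed here):

```kusto
// "Sample 10 rows from table 'StormEvents'"
StormEvents
| take 10

// "Trend analysis across the past 10 years"
StormEvents
| where StartTime > ago(3650d)
| summarize EventCount = count() by Year = getyear(StartTime)
| order by Year asc
```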
Available tools
- List databases
- List tables
- Get schema for a table
- Sample rows from a table
- Execute query
- Ingest a CSV
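The "categorize commands by risk" prompt above is ultimately performed by the agent's own reasoning over query results, but the idea can be sketched as a toy keyword-based classifier (purely illustrative; the table name, keyword lists, and rules are hypothetical, and a real agent would use the model rather than fixed rules):

```python
# Toy sketch: classify command strings by a coarse risk level.
# Keyword lists are hypothetical; a real agent reasons over the data itself.

HIGH_RISK_KEYWORDS = (".drop", ".delete", ".purge", "set-or-replace")
MEDIUM_RISK_KEYWORDS = (".ingest", ".alter", ".update")


def categorize_command(command: str) -> str:
    """Return 'low', 'medium', or 'high' for a command string."""
    text = command.lower()
    if any(k in text for k in HIGH_RISK_KEYWORDS):
        return "high"
    if any(k in text for k in MEDIUM_RISK_KEYWORDS):
        return "medium"
    return "low"


# Example usage over rows from a hypothetical 'CommandExecution' table:
commands = [
    ".drop table StormEvents",
    ".ingest inline into table T",
    "StormEvents | take 10",
]
levels = [categorize_command(c) for c in commands]
```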
Getting Started
Prerequisites
- Install either the stable or Insiders release of VS Code
- Install the GitHub Copilot and GitHub Copilot Chat extensions
- Install `uv`:

  ```powershell
  powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  ```

  or check here for other install options
- Open VS Code in an empty folder
Install from PyPI (Pip)
The Fabric RTI MCP Server is available on PyPI, so you can install it using pip. This is the easiest way to install the server.
From VS Code
1. Open the command palette (Ctrl+Shift+P) and run the command `MCP: Add Server`
2. Select install from Pip
3. When prompted, enter the package name `microsoft-fabric-rti-mcp`
4. Follow the prompts to install the package and add it to your settings.json file
The process should end with a server entry for `microsoft-fabric-rti-mcp` in your `settings.json` file.
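As a rough sketch, that entry may look something like the following (hedged: the exact `command` and `args` the installer writes can differ; this assumes the package exposes the `fabric_rti_mcp.server` module used later in the debugging section):

```json
{
  "mcp": {
    "servers": {
      "fabric-rti-mcp": {
        "command": "python",
        "args": ["-m", "fabric_rti_mcp.server"]
      }
    }
  }
}
```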
🔧 Manual Install (Install from source)
- Make sure you have Python 3.10+ installed properly and added to your PATH.
- Clone the repository
- Install the dependencies (`pip install .` or `uv tool install .`)
- Add the settings below into your VS Code `settings.json` file.
- Change the path to match the repo location on your machine.
- Change the cluster URI in the settings to match your cluster.
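A sketch of what such settings might look like (assumptions: the repo path placeholder, the `uv run --directory` invocation, and the `KUSTO_SERVICE_URI` environment variable name are illustrative, not confirmed by this document):

```json
{
  "mcp": {
    "servers": {
      "fabric-rti-mcp": {
        "command": "uv",
        "args": [
          "--directory",
          "/path/to/fabric-rti-mcp",
          "run",
          "python",
          "-m",
          "fabric_rti_mcp.server"
        ],
        "env": {
          "KUSTO_SERVICE_URI": "https://<your-cluster>.kusto.windows.net"
        }
      }
    }
  }
}
```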
🐛 Debugging the MCP Server locally
Assuming you have python installed and the repo cloned:
Install locally:

```shell
pip install -e ".[dev]"
```
Configure

Add the server to your `settings.json`:

```json
{
  "mcp": {
    "servers": {
      "local-fabric-rti-mcp": {
        "command": "python",
        "args": ["-m", "fabric_rti_mcp.server"]
      }
    }
  }
}
```
Attach the debugger

Use the `Python: Attach` configuration in your `launch.json` to attach to the running server.
Once VS Code picks up the server and starts it, navigate to its output:
- Open the command palette (Ctrl+Shift+P) and run the command `MCP: List Servers`
- Navigate to `local-fabric-rti-mcp` and select `Show Output`
- Pick up the process ID (PID) of the server from the output
- Run the `Python: Attach` configuration in your `launch.json` file, and paste the PID of the server into the prompt
- The debugger will attach to the server process, and you can start debugging
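A minimal `Python: Attach` entry for this flow, as a sketch (this assumes the standard VS Code Python debugger, where `${command:pickProcess}` produces the process-picker prompt):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python: Attach",
      "type": "debugpy",
      "request": "attach",
      "processId": "${command:pickProcess}"
    }
  ]
}
```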
🧪 Test the MCP Server
- Open GitHub Copilot in VS Code and switch to Agent mode
- You should see the Fabric RTI MCP Server in the list of tools
- Try a prompt that tells the agent to use the Eventhouse tools, such as “List my Kusto tables”
- The agent should be able to use the Fabric RTI MCP Server tools to complete your query
🔑 Authentication
The MCP Server seamlessly integrates with your host operating system's authentication mechanisms, making it super easy to get started! We use Azure Identity under the hood via `DefaultAzureCredential`, which tries these credentials in order:
- Environment Variables (`EnvironmentCredential`) - Perfect for CI/CD pipelines
- Visual Studio (`VisualStudioCredential`) - Uses your Visual Studio credentials
- Azure CLI (`AzureCliCredential`) - Uses your existing Azure CLI login
- Azure PowerShell (`AzurePowerShellCredential`) - Uses your Az PowerShell login
- Azure Developer CLI (`AzureDeveloperCliCredential`) - Uses your azd login
- Interactive Browser (`InteractiveBrowserCredential`) - Falls back to browser-based login if needed
If you’re already logged in through any of these methods, the Fabric RTI MCP Server will automatically use those credentials.
🛡️ Security Note
Your credentials are always handled securely through the official Azure Identity SDK - we never store or manage tokens directly.
MCP as a phenomenon is very novel and cutting-edge. As with all new technology standards, consider doing a security review to ensure any systems that integrate with MCP servers follow all regulations and standards your system is expected to adhere to. This includes not only the Fabric RTI MCP Server, but any MCP client/agent that you choose to implement, down to the model provider.
👥 Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a
Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide
a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
provided by the bot. You will only need to do this once across all repos using our CLA.
🤝 Code of Conduct
This project has adopted the Microsoft Open Source Code of Conduct.
For more information see the Code of Conduct FAQ or
contact [email protected] with any additional questions or comments.
Data Collection
The software may collect information about you and your use of the software and send it to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft’s privacy statement. Our privacy statement is located at https://go.microsoft.com/fwlink/?LinkID=824704. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.
Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft
trademarks or logos is subject to and must follow
Microsoft’s Trademark & Brand Guidelines.
Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.
Any use of third-party trademarks or logos is subject to those third parties' policies.