Databricks MCP
What is Databricks MCP
Databricks MCP is a Model Context Protocol server designed for seamless interaction with Databricks functionalities using the databricks-sdk. It enables AI agents and compatible applications to manage workspaces, compute resources, and data interactions effectively.
Use cases
Use cases for Databricks MCP include managing machine learning workflows, executing SQL queries on data warehouses, automating job executions, and facilitating collaboration in data projects through effective workspace management.
How to use
To use Databricks MCP, ensure you have Python 3.10 or higher, install Poetry, and configure Databricks authentication. Clone the repository, install dependencies using Poetry, and set up environment variables in a .env file for local development.
Key features
Key features of Databricks MCP include workspace management (notebooks, files, repos, secrets), compute management (clusters, SQL warehouses), data interaction (SQL execution, catalog browsing), AI/ML workflow management (MLflow, Model Serving, Vector Search), and job execution & management.
Where to use
Databricks MCP can be used in various fields such as data science, machine learning, and AI development, particularly where integration with Databricks services is required for enhanced data processing and model management.
Content
Databricks MCP Server
This project implements a Model Context Protocol (MCP) server for interacting with Databricks using the databricks-sdk.
Overview
This server allows AI agents and other applications compatible with MCP to leverage Databricks functionalities, including:
- Workspace management (notebooks, files, repos, secrets)
- Compute management (clusters, SQL warehouses)
- Data interaction (SQL execution via warehouses, catalog browsing)
- AI/ML workflow management (MLflow, Model Serving, Vector Search)
- Job execution & management
Refer to the Product Requirements Document for original features and the Technical Architecture for design specifics.
For a detailed list of implemented tools and resources, see the Capabilities Document.
Setup
1. Prerequisites:
   - Python >=3.10,<3.13 (as required by the `mcp` package)
   - Poetry (>=1.2, latest recommended)
   - Access to a Databricks workspace
   - Databricks authentication configured (e.g., via the environment variables `DATABRICKS_HOST` and `DATABRICKS_TOKEN`, or other methods supported by `databricks-sdk`). See SDK Authentication.

2. Clone the repository:

   git clone <repository-url>
   cd databricks-mcp-server

3. Install dependencies:

   poetry install

   This will create a virtual environment (if one doesn't exist) and install all dependencies specified in `pyproject.toml` and `poetry.lock`.

4. Activate the virtual environment:

   poetry shell
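The Python version constraint above can also be checked programmatically before installing; a minimal sketch (the bounds mirror the `>=3.10,<3.13` requirement, and the helper name is illustrative, not part of the project):

```python
import sys

def python_version_ok(version_info=sys.version_info):
    """Return True if the interpreter satisfies >=3.10,<3.13."""
    # Tuple comparison: (major, minor) must fall inside the half-open range.
    return (3, 10) <= (version_info[0], version_info[1]) < (3, 13)
```

Running this in an unsupported interpreter lets you fail fast with a clear message instead of a cryptic Poetry resolution error.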
Configuration
The server is configured primarily through environment variables. Create a .env file in the project root by copying .env.example and filling in your values for local development:
cp .env.example .env
# Now edit .env
Required .env Variables:
- `DATABRICKS_HOST`: Your Databricks workspace URL (e.g., `https://dbc-XXXX.cloud.databricks.com`).
- `DATABRICKS_TOKEN`: Your Databricks Personal Access Token (or configure another auth method recognized by the SDK).
Optional .env Variables:
- `LOG_LEVEL`: Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). Defaults to `INFO`.
- `ENABLE_GET_SECRET`: Set to `true` to enable the `databricks:secrets:get_secret` tool. Defaults to `false`. Use with extreme caution.
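The variables above could be read along these lines. This is a sketch, not the server's actual loader; the `load_config` name is illustrative, but the defaults match the documented ones (`LOG_LEVEL=INFO`, `ENABLE_GET_SECRET=false`):

```python
import os

def load_config(env=os.environ):
    """Read server settings from environment variables, applying the
    documented defaults."""
    return {
        "host": env.get("DATABRICKS_HOST"),    # required
        "token": env.get("DATABRICKS_TOKEN"),  # required (or another SDK auth method)
        "log_level": env.get("LOG_LEVEL", "INFO").upper(),
        # The secrets tool is opt-in: anything other than "true" keeps it disabled.
        "enable_get_secret": env.get("ENABLE_GET_SECRET", "false").lower() == "true",
    }
```

Parsing `ENABLE_GET_SECRET` strictly against the string `"true"` keeps the dangerous tool off unless it is explicitly enabled.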
Usage
Make sure your virtual environment is activated (poetry shell) and your .env file is configured.
Run the server via stdio:
python -m src.databricks_mcp
An MCP client/host can then connect to this process via its standard input/output.
(Instructions for HTTP/SSE transport will be added if implemented).
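Over the stdio transport, MCP clients and servers exchange newline-delimited JSON-RPC 2.0 messages on stdin/stdout. A sketch of the kind of `initialize` request a client would write to the server's stdin (field values are illustrative; consult the MCP specification for the exact handshake your client version uses):

```python
import json

def initialize_request(request_id=1):
    """Build a JSON-RPC 2.0 'initialize' message as a single line,
    as an MCP client would send over the server's stdin."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # illustrative; use the version your client targets
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(msg) + "\n"
```

In practice an MCP client library handles this framing for you; the sketch only shows what travels over the pipe.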
Development
- Setup: Follow the steps in the Setup section. Ensure development dependencies are installed (they are by default with `poetry install`).
- Running Tests:

  pytest

- Linting/Formatting:

  ruff check .
  ruff format .

- Project Structure: See the Technical Architecture document.
- Adding Tools/Resources:
  - Create or modify Python files under `src/databricks_mcp/tools/` or `src/databricks_mcp/resources/`.
  - Implement the logic using the `mcp` framework (`@mcp.tool()`, `@mcp.resource()`) and the `db_client.py` wrapper.
  - Register the new capabilities in `src/databricks_mcp/server.py`.
  - Add corresponding unit tests in the `tests/unit/` directory.
See the Implementation Plan for tracking development tasks.
The implementation based on the initial plan is now complete.
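The tool-registration pattern described above can be illustrated without the `mcp` dependency. The `ToolRegistry` below is a hypothetical stand-in for what `@mcp.tool()` does; the real framework additionally derives input schemas and wires the handler into the protocol transport:

```python
class ToolRegistry:
    """Minimal stand-in for an MCP server's tool registry (illustrative only)."""

    def __init__(self):
        self.tools = {}

    def tool(self, name):
        """Decorator that registers a callable under a tool name,
        analogous in spirit to @mcp.tool()."""
        def decorator(fn):
            self.tools[name] = fn
            return fn
        return decorator

registry = ToolRegistry()

@registry.tool("databricks:clusters:list")
def list_clusters():
    # In the real server this would call the databricks-sdk via the
    # db_client.py wrapper; the return value here is a placeholder.
    return ["example-cluster"]
```

The decorator-based registration is what lets the server enumerate its capabilities for clients at startup.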