
Science MCPs

@globus-labs · 4 days ago
MIT · Free · Community · AI Systems
Globus MCP Servers

Overview

What is Science MCPs

This repository contains Model Context Protocol (MCP) servers that facilitate interactions between AI assistants and scientific computing resources, including data transfer and compute functions. The servers support various scientific infrastructures like Globus, ALCF, NERSC, and Diaspora Event Fabric, enabling streamlined access to data and computational capabilities.

Use cases

Users can interact with the systems to perform tasks such as transferring files between Globus endpoints, executing Python functions on remote compute resources, monitoring job statuses on supercomputing facilities like Polaris at ALCF and systems at NERSC, and managing topics and events with the Diaspora framework.

How to use

To use the MCP servers, users need to install the required dependencies for the specific server they wish to utilize. This involves cloning the repository, creating a Python environment, and installing necessary packages. After setting up, users can configure the MCP servers in the Claude Desktop application and issue commands to perform various operations like file transfers and job monitoring.

Key features

Key features include the ability to perform Globus file transfers and execute remote Python functions, check system statuses of supercomputing facilities, and manage event topics within the Diaspora framework. Each server comes with a set of tools tailored to support specific functionalities, ensuring efficient use of scientific computing resources.

Where to use

These MCP servers can be used in scientific and research environments where data transfer and computational tasks are frequent. They are particularly beneficial in high-performance computing settings where access to resources like ALCF and NERSC is necessary, as well as in scenarios where event-driven architectures like Diaspora are applied.

Content

Science MCPs

A collection of Model Context Protocol (MCP) servers that enable Claude and other AI assistants to interact with scientific computing resources and data management services.

Overview

This repository contains MCP servers that allow AI assistants to interact with scientific computing infrastructure:

  1. Globus MCP Servers - Enable interaction with Globus services for data transfer and compute functions
  2. Compute Facility MCP Servers - Enable interaction with ALCF and NERSC supercomputing facilities
  3. Diaspora MCP Server - Enables interaction with the Diaspora Event Fabric (Octopus) for topic management and event streaming

These servers implement the Model Context Protocol (MCP), which allows AI assistants like Claude to interact with external tools and services.
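For orientation only (this is not code from the repository), an MCP server written with the official Python MCP SDK exposes tools to an assistant roughly as follows; the server name and tool below are made up for illustration.

```python
# Minimal sketch of an MCP server using the Python MCP SDK (FastMCP).
# The server name and the example tool are hypothetical, not from this repo.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-science-server")

@mcp.tool()
def echo_status(system: str) -> str:
    """Return a placeholder status string for the named system."""
    return f"{system}: status unknown (placeholder)"

if __name__ == "__main__":
    # Claude Desktop launches stdio-based MCP servers as subprocesses.
    mcp.run()
```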

Components

Globus MCP Servers

The Globus MCP servers enable AI assistants to:

  • Globus Transfer - Transfer files between Globus endpoints, browse directories, and manage transfer tasks
  • Globus Compute - Register and execute Python functions on remote Globus Compute endpoints (formerly FuncX)

Learn more about Globus MCP Servers

Compute Facility MCP Servers

The Compute Facility MCP servers enable AI assistants to:

  • ALCF - Check status of ALCF machines (e.g., Polaris) and monitor running jobs
  • NERSC - Check status of NERSC systems and services

Learn more about Compute Facility MCP Servers

Diaspora MCP Server

The Diaspora MCP server enables AI assistants to:

  • Manage topics - Create, list, and delete topics within the user’s namespace
  • Stream events - Publish events to a topic and retrieve the most recent event

Learn more about the Diaspora MCP Server

Use hosted MCPs (recommended)

Connecting to our hosted MCP servers is the fastest way to get started—no local installation or maintenance required.

  1. Open Claude Desktop and go to Settings → Developers.
  2. Click Edit Config and paste the hosted MCPs configuration (a sketch of its general shape is shown below).
  3. Restart Claude Desktop.
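The exact hosted configuration is provided by the project; the snippet below is only a sketch of what a remote MCP entry in claude_desktop_config.json typically looks like when bridged through the mcp-remote package. The server name and URL are placeholders, not the real hosted endpoints.

```json
{
  "mcpServers": {
    "globus-transfer": {
      "command": "npx",
      "args": ["mcp-remote", "https://example.org/globus-transfer/mcp"]
    }
  }
}
```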

Deploy Locally

See local deployment configuration.
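As a point of reference, a locally deployed server is registered in claude_desktop_config.json as a stdio command. The path and script name below are placeholders; consult the repository's configuration for the real values.

```json
{
  "mcpServers": {
    "globus-transfer": {
      "command": "python",
      "args": ["/path/to/science-mcps/globus_transfer_server.py"]
    }
  }
}
```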

Usage Examples

Globus Transfer

You can ask Claude to:

Transfer files from my Globus endpoint to another endpoint

Globus Compute

You can ask Claude to:

Run a Python function on a Globus Compute endpoint

ALCF Status

You can ask Claude to:

Check if Polaris is online

NERSC Status

You can ask Claude to:

Check the status of NERSC systems

Diaspora Event Fabric

You can ask Claude to:

Register a Diaspora topic, produce three messages, and consume the latest message

Available Tools

Globus Transfer Server Tools

  • globus_authenticate - Start Globus authentication
  • complete_globus_auth - Complete authentication with an auth code
  • list_endpoints - List available Globus endpoints
  • submit_transfer - Submit a file transfer between endpoints
  • And more…
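These tools appear to wrap the Globus Python SDK (globus-sdk). As a rough sketch of the equivalent direct calls, with the client ID, endpoint UUIDs, and paths below all placeholders:

```python
# Sketch of the authenticate-then-transfer flow using globus-sdk directly.
# CLIENT_ID, endpoint UUIDs, and paths are placeholders.
import globus_sdk

CLIENT_ID = "YOUR-NATIVE-APP-CLIENT-ID"
SRC_ENDPOINT = "source-endpoint-uuid"
DST_ENDPOINT = "destination-endpoint-uuid"

auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow(
    requested_scopes="urn:globus:auth:scope:transfer.api.globus.org:all"
)
print("Log in at:", auth_client.oauth2_get_authorize_url())
tokens = auth_client.oauth2_exchange_code_for_tokens(input("Auth code: "))
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token)
)
task_data = globus_sdk.TransferData(tc, SRC_ENDPOINT, DST_ENDPOINT, label="example")
task_data.add_item("/source/data.csv", "/dest/data.csv")
print("Task ID:", tc.submit_transfer(task_data)["task_id"])
```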

Globus Compute Server Tools

  • compute_authenticate - Start Globus Compute authentication
  • register_function - Register a Python function with Globus Compute
  • execute_function - Run a registered function on an endpoint
  • And more…
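These presumably correspond to the Globus Compute SDK (globus-compute-sdk). A minimal sketch of the direct equivalent, with a placeholder endpoint UUID:

```python
# Sketch of registering and running a function with the Globus Compute SDK.
# The endpoint UUID is a placeholder for one of your own Globus Compute endpoints.
from globus_compute_sdk import Client, Executor

def add(a, b):
    return a + b

function_id = Client().register_function(add)     # roughly what register_function wraps

with Executor(endpoint_id="YOUR-ENDPOINT-UUID") as ex:
    future = ex.submit(add, 2, 3)                 # roughly what execute_function wraps
    print(future.result())
```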

ALCF Server Tools

  • check_alcf_status - Get the status of the Polaris machine
  • get_running_jobs - Return the list of running jobs
  • system_health_summary - Summarize the jobs submitted to Polaris

NERSC Server Tools

  • get_nersc_status - Get the status of various NERSC services
  • check_system_availability - Check the system’s current availability
  • get_maintenance_info - Check the maintenance schedule of the resources

Diaspora Event Fabric Tools

  • register_topic - Create a new Kafka topic
  • produce_event - Publish a UTF-8 message to a topic
  • consume_latest_event - Fetch the most recent event from a topic
  • And more…
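These appear to map onto the diaspora-event-sdk, whose interfaces are Kafka-like. The sketch below is an assumption about those interfaces; the topic name is a placeholder and the real SDK calls may differ.

```python
# Rough sketch of the topic/produce/consume flow with diaspora-event-sdk.
# The interfaces shown are assumed from its Kafka-style API and may differ.
from diaspora_event_sdk import Client, KafkaProducer, KafkaConsumer

topic = "example-topic"                  # placeholder topic name
Client().register_topic(topic)           # roughly what register_topic wraps

producer = KafkaProducer()
producer.send(topic, {"msg": "hello"})   # roughly what produce_event wraps
producer.flush()

consumer = KafkaConsumer(topic, auto_offset_reset="earliest")
for record in consumer:                  # consume_latest_event reads the newest; this reads from the start
    print(record.value)
    break
```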

For a complete list of tools, see the README files for each component.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
