TXYZ Search
Overview
What is TXYZ Search
TXYZ Search is an MCP server that enables AI assistants to perform academic and scholarly searches, general web searches, or automatically select the best search type for a query. MCP (Model Context Protocol) is an open protocol that standardizes how applications connect large language models (LLMs) with external data sources and tools. It simplifies integration by providing a universal method for AI applications to access various resources, similar to how USB-C connects devices to peripherals.
Use cases
MCP serves a range of scientific research applications: AI assistants can query data sources such as the Materials Project database, execute Python code securely, run SSH commands on remote systems, and fetch web content. It is designed to enhance the capabilities of AI models by streamlining workflows and access to scientific tools.
How to use
To use an MCP server, integrate it into a compatible LLM client application with MCPM (MCP Manager). After selecting the desired client and server, install the server and validate it with the designated commands. Once integrated, the LLM can fetch data or execute tasks through the server's tools.
Key features
MCP offers several key features, including pre-built integrations for various applications, the flexibility to switch LLM vendors, and best practices for data security. It also supports a growing number of specialized servers tailored to specific domains such as materials science, code execution, and web fetching.
Where to use
MCP servers can be employed in diverse settings such as academic research, data analysis, computational tasks, and any field requiring the integration of LLMs with external data or tools. They are particularly useful in environments where scientific discovery and analysis need to be accelerated through AI.
Content
MCP.science: Open Source MCP Servers for Scientific Research
Join us in accelerating scientific discovery with AI and open-source tools!
Table of Contents
- About
- What is MCP?
- Available servers in this repo
- How to integrate MCP servers into LLM
- How to build your own MCP server
- Contributing
- License
- Acknowledgments
- Citation
About
This repository contains a collection of open source MCP servers specifically designed for scientific research applications. These servers enable AI models (like Claude) to interact with scientific data, tools, and resources through a standardized protocol.
What is MCP?
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides the following (see the code sketch after this list):
- A growing list of pre-built integrations that your LLM can directly plug into
- The flexibility to switch between LLM providers and vendors
- Best practices for securing your data within your infrastructure
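To make this concrete, here is a minimal sketch of a host application connecting to an MCP server over stdio using the official mcp Python SDK; the server launch command here is a placeholder, and any stdio MCP server would work the same way:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command; substitute any stdio MCP server.
server = StdioServerParameters(command="uvx", args=["some-mcp-server"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The same call works against any MCP server, regardless of
            # which LLM provider the host application is using.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```

Because the protocol is the integration point, swapping the LLM vendor on the host side leaves this server-facing code unchanged.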
Available servers in this repo
Example Server
An example MCP server that helps you understand how MCP servers work.
Materials Project
A specialized MCP server that enables AI assistants to search, visualize, and manipulate materials science data from the Materials Project database. A Materials Project API key is required.
Python Code Execution
A secure sandboxed environment that allows AI assistants to execute Python code snippets with controlled access to standard library modules, enabling data analysis and computation tasks without security risks.
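To illustrate the general idea (this is a conceptual sketch, not this server's actual implementation), the snippet below executes code with a hypothetical allow-list of builtins, so import statements and file access are unavailable; a production sandbox would add process isolation and resource limits:

```python
import builtins

# Hypothetical allow-list for illustration; the real server's policy differs.
ALLOWED_BUILTINS = {"abs", "len", "min", "max", "print", "range", "sum"}

def run_snippet(code: str) -> None:
    safe = {name: getattr(builtins, name) for name in ALLOWED_BUILTINS}
    # With no __import__ in the builtins mapping, `import` raises ImportError,
    # and open() is simply not available to the snippet.
    exec(code, {"__builtins__": safe})

run_snippet("print(sum(range(10)))")  # prints 45
```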
SSH Exec
A specialized MCP server that enables AI assistants to securely run validated commands on remote systems via SSH, with configurable restrictions and authentication options.
Web Fetch
A versatile MCP server that allows AI assistants to fetch and process HTML, PDF, and plain text content from websites, enabling information gathering from online sources.
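As a rough, stdlib-only sketch of the kind of retrieval involved (the actual server's dependencies and behavior may differ), fetching a page and naively stripping its HTML tags can look like this:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, ignoring tags."""

    def __init__(self) -> None:
        super().__init__()
        self.chunks: list[str] = []

    def handle_data(self, data: str) -> None:
        # Naive: also captures script/style text; a real server filters these.
        if data.strip():
            self.chunks.append(data.strip())

def fetch_text(url: str) -> str:
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.chunks)

print(fetch_text("https://modelcontextprotocol.io/")[:500])
```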
TXYZ Search
A specialized MCP server that enables AI assistants to perform academic and scholarly searches, general web searches, or automatically select the best search type based on the query. A TXYZ API key is required.
How to integrate MCP servers into LLM
If you're not familiar with this process, here is a walkthrough for you: Step-by-step guide to integrate MCP servers into LLM.
Prerequisites
- MCPM: an MCP manager developed by us; easy to use, open source, community-driven, and forever free.
- uv: An extremely fast Python package and project manager, written in Rust. You can install it by running:
curl -LsSf https://astral.sh/uv/install.sh | sh
- MCP client: e.g. Claude Desktop / Cursor / Windsurf / Chatwise / Cherry Studio
Integrate MCP servers into your client
MCP servers can be integrated with any compatible client application. Here, we'll walk through the integration process using the web-fetch MCP server (included in this repository) as an example.
Client Integration
With MCPM, you can easily integrate MCP servers into your client application.
Before installing the server, you need to specify the client you want to add the server to.
List the available clients:
mcpm client ls
Specify the client you want to add the server to:
mcpm client set <client-name>
Then add the server:
mcpm add web-fetch
You may need to restart your client application for the changes to take effect.
Then you can validate that the integration works by asking the LLM to fetch web content:
- "Can you fetch and summarize the content from https://modelcontextprotocol.io/?"
- The web-fetch tool should be called and the content retrieved.
Find other servers
We recommend checking Available servers in this repo or the MCPM Registry for more servers.
How to build your own MCP server
Please check How to build your own MCP server step by step for more details.
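For a quick taste before diving into the guide, here is a minimal sketch using the FastMCP interface of the official mcp Python SDK (the same mcp dependency the servers in this repo add via uv); the tool itself is a toy example:

```python
from mcp.server.fastmcp import FastMCP

# A toy server exposing a single tool.
mcp = FastMCP("example-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

Once installed into a client (for example via MCPM, as shown above), the client's LLM can discover and call the add tool through the standard protocol.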
Contributing
We enthusiastically welcome contributions to MCP.science! You can help by improving existing servers, adding new servers, or anything else you think will make this project better.
If you are not familiar with GitHub or with contributing to an open source repository, it may feel a bit challenging at first, but it is quite approachable. We recommend reading an introduction to the GitHub contribution workflow first.
In short, you can follow these steps:
- Fork the repository to your own GitHub account
- Clone the forked repository to your local machine
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make your changes and commit them (git commit -m 'Add amazing feature'), following the directory and naming conventions below
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Directory and naming conventions
Please create your new server in the servers folder. To create a new server folder under the repository folder, run the following (replace your-new-server with your server name):
uv init --package --no-workspace servers/your-new-server
uv add --directory servers/your-new-server mcp
This will create a new server folder with the necessary files:
servers/your-new-server/
├── README.md
├── pyproject.toml
└── src
    └── your_new_server
        └── __init__.py
There are two related names you will see in the config files:
- Project name (hyphenated): the folder, project name, and script name in pyproject.toml, e.g. your-new-server.
- Python package name (snake_case): the folder inside src/, e.g. your_new_server.
Please make sure your PR adheres to:
- Clear commit messages
- Proper documentation updates
- Test coverage for new features
Contributor Recognition in Subrepos
If you want to recognize contributors for a specific server/subrepo (e.g. servers/gpaw-computation/), you can use the All Contributors CLI in that subdirectory.
Steps:
- In your subrepo (e.g. servers/gpaw-computation/), create a .all-contributorsrc file (see example).
- Add contributors using the CLI:
npx all-contributors add <github-username> <contribution-type>
- Generate or update the contributors section in the subrepo's README.md:
npx all-contributors generate
- Commit the changes to the subrepo's README.md and .all-contributorsrc.
For more details, see the All Contributors CLI installation guide.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
Thanks to all contributors!
Citation
For general use, please cite this repository as described in the root CITATION.cff.
If you use a specific server/subproject, please see the corresponding CITATION.cff file in that subproject's folder under servers/ for the appropriate citation.