
Mcp Function Calling Adapter

@d-kimuson · a year ago
This library provides a function calling adapter for the Model Context Protocol (MCP).

Overview

What is the mcp-function-calling-adapter?

The mcp-function-calling-adapter is a library that serves as a function calling adapter for the Model Context Protocol (MCP), allowing standardized MCP Server implementations to be used directly with function calling.

Use cases

Use cases include integrating AI models with real-time data queries (e.g., weather information), automating tasks through function calls, and enhancing conversational agents with external tool capabilities.

How to use

To use the mcp-function-calling-adapter, install it with pnpm (or your preferred package manager): pnpm add mcp-function-calling-adapter. Then create an instance of McpFunctionCallingAdapter, configure it with the MCP Servers you want to expose, and start the servers to enable function calling with compatible APIs such as OpenAI's.

Key features

Key features include seamless integration with MCP Servers, support for function calling, and the ability to execute tools registered within the adapter, enhancing the interaction between models and external functions.

Where to use

The mcp-function-calling-adapter can be used in various fields such as AI development, chatbots, and any application that requires interaction with external functions through the Model Context Protocol.

Content

mcp-function-calling-adapter

English | 日本語


This library provides a function calling adapter for the Model Context Protocol (MCP).

You can use standardized MCP Server implementations directly with function calling through this library.

Installation

pnpm add mcp-function-calling-adapter

Usage

Here’s an example using it with the OpenAI API:

import OpenAI from "openai"
import { McpFunctionCallingAdapter } from "mcp-function-calling-adapter"

const adapter = new McpFunctionCallingAdapter("example", {
  "sequential-thinking": {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-sequential-thinking"],
  },
})

const openai = new OpenAI({
  apiKey: "your api key here",
})

try {
  await adapter.startServers()

  const messages = [
    {
      role: "user",
      content: "What's the weather like in Paris today?",
    },
  ]
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages,
    tools: adapter.getTools().map((tool) => ({
      type: "function",
      function: {
        name: tool.name,
        description: tool.description ?? "",
        parameters: tool.inputSchema,
      },
    })),
    tool_choice: "auto",
  })

  // Call tool
  const toolCall = completion.choices.at(0)?.message.tool_calls?.at(0)
  if (toolCall && adapter.isRegisteredTool(toolCall.function.name)) {
    const response = await adapter.executeTool(
      toolCall.function.name,
      JSON.parse(toolCall.function.arguments)
    )
    messages.push(completion.choices[0].message)
    messages.push({
      role: "tool",
      tool_call_id: toolCall.id,
      // Serialize the MCP response content for the tool message
      content: JSON.stringify(response.content),
    })
  }
} finally {
  await adapter.clean()
}

API

startServers()

Starts the MCP servers and loads available tools.

clean()

Disconnects from the MCP servers.

getTools()

Returns a list of available tools.
Each entry includes the tool name, description, and input JSON Schema, which can be used directly as a function calling schema definition.
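The mapping from getTools() output to the OpenAI tools parameter can be done with a single map, as in the usage example above. The sketch below uses a hand-written tool entry (the get_weather tool and its schema are hypothetical stand-ins for a live adapter's output), assuming the name/description/inputSchema shape shown in this README:

```typescript
// Shape of entries returned by getTools(), per the README's usage example.
interface McpTool {
  name: string
  description?: string
  inputSchema: Record<string, unknown>
}

// Hand-written stand-in for adapter.getTools() output (hypothetical tool).
const tools: McpTool[] = [
  {
    name: "get_weather",
    description: "Look up current weather for a city",
    inputSchema: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
]

// Map MCP tool metadata into the OpenAI function-calling `tools` format.
const openAiTools = tools.map((tool) => ({
  type: "function" as const,
  function: {
    name: tool.name,
    description: tool.description ?? "",
    parameters: tool.inputSchema,
  },
}))

console.log(openAiTools[0].function.name)
```

Because inputSchema is already JSON Schema, it passes through unchanged as the function's parameters.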

isRegisteredTool(name: string)

Checks if a tool with the specified name is registered and available for use.

  • name: The name of the tool to check
  • Returns: boolean - true if the tool is registered, false otherwise

executeTool(name: string, args: Record<string, unknown>)

Executes a tool with the specified name and returns the response from the MCP Server.

  • name: The name of the tool to execute
  • args: Arguments to pass to the tool

Contributing

We welcome contributions! Please see our Contributing Guidelines for details on how to submit pull requests, report issues, and contribute to the project.

Development

  1. Clone the repository
  2. Install dependencies: pnpm install
  3. Run tests: pnpm test
  4. Build: pnpm build

Changelog

See CHANGELOG.md for a list of changes and version history.

License

This project is licensed under the MIT License - see the LICENSE file for details.
