MCP Explorer

Wolframalpha Llm Mcp

@Garothon a year ago
31 MIT
Free · Community
AI Systems
An MCP Server for WolframAlpha's LLM API, able to return structured knowledge & solve math

Overview

What is Wolframalpha Llm Mcp

WolframAlpha-llm-mcp is an MCP Server that provides access to WolframAlpha’s LLM API, enabling users to obtain structured knowledge and solve mathematical problems through natural language queries.

Use cases

Use cases for wolframalpha-llm-mcp include educational assistance for students in mathematics and science, automated data retrieval for research purposes, and enhancing chatbot capabilities with accurate information.

How to use

To use wolframalpha-llm-mcp, clone the repository, install its dependencies, obtain a WolframAlpha API key, and add the key to your MCP settings file. You can then call tools such as `ask_llm` to query the API.

Key features

Key features include querying WolframAlpha’s LLM API with natural language, answering complex mathematical questions, retrieving facts across various domains such as science and history, and providing structured responses optimized for LLM consumption.


Content

WolframAlpha LLM MCP Server

WolframAlpha LLM MCP Logo

A Model Context Protocol (MCP) server that provides access to WolframAlpha’s LLM API (https://products.wolframalpha.com/llm-api/documentation).

WolframAlpha MCP Server Example 1

WolframAlpha MCP Server Example 2

Features

  • Query WolframAlpha’s LLM API with natural language questions
  • Answer complicated mathematical questions
  • Query facts about science, physics, history, geography, and more
  • Get structured responses optimized for LLM consumption
  • Support for simplified answers and detailed responses with sections

Available Tools

  • ask_llm: Ask WolframAlpha a question and get a structured, LLM-friendly response
  • get_simple_answer: Get a simplified answer
  • validate_key: Validate the WolframAlpha API key
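
Once the server is configured, an MCP client invokes these tools over JSON-RPC. The sketch below shows a hypothetical `tools/call` request for `ask_llm`; the method and envelope follow the MCP specification, but the exact argument key (`question`) is an assumption, not confirmed from this server's schema.

```javascript
// Hypothetical MCP "tools/call" request an MCP client (e.g. Cline) would send
// to invoke the ask_llm tool. The "question" argument name is illustrative.
const askLlmRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "ask_llm",
    arguments: { question: "What is the integral of x^2?" },
  },
};

console.log(JSON.stringify(askLlmRequest, null, 2));
```

The client typically generates requests like this automatically; you only pick the tool name and arguments.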

Installation

git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install

Configuration

  1. Get your WolframAlpha API key from developer.wolframalpha.com

  2. Add it to your Cline MCP settings file inside VSCode’s settings (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):

{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": [
        "/path/to/wolframalpha-mcp-server/build/index.js"
      ],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}
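
Before wiring the key into MCP, it can help to sanity-check it against the LLM API directly. This sketch builds a raw request URL using the same WOLFRAM_LLM_APP_ID the server reads from its environment; the endpoint comes from the LLM API documentation linked above. Paste the printed URL into a browser (or curl it) to confirm the key returns a result.

```javascript
// Sketch: build a raw WolframAlpha LLM API request URL as a key sanity check.
// Endpoint per the official LLM API docs; "your-api-key-here" is a placeholder.
const WOLFRAM_LLM_ENDPOINT = "https://www.wolframalpha.com/api/v1/llm-api";

function buildLlmApiUrl(input, appId) {
  const params = new URLSearchParams({ input, appid: appId });
  return `${WOLFRAM_LLM_ENDPOINT}?${params.toString()}`;
}

const url = buildLlmApiUrl(
  "derivative of x^4 sin x",
  process.env.WOLFRAM_LLM_APP_ID || "your-api-key-here"
);
console.log(url);
```

This is only a manual check; the MCP server's own `validate_key` tool performs the equivalent test once the server is running.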

Development

Setting Up Tests

The tests use real API calls to ensure accurate responses. To run the tests:

  1. Copy the example environment file:

    cp .env.example .env
    
  2. Edit .env and add your WolframAlpha API key:

    WOLFRAM_LLM_APP_ID=your-api-key-here
    

    Note: The .env file is gitignored to prevent committing sensitive information.

  3. Run the tests:

    npm test
    

Building

npm run build

License

MIT
