
mcp-perplexity-search

@spences10 · 9 months ago
#mcp #model-context-protocol #perplexity #search
"🔎 A Model Context Protocol (MCP) server for integrating Perplexity's AI API with LLMs."

Overview

What is mcp-perplexity-search

mcp-perplexity-search is a Model Context Protocol (MCP) server designed to integrate Perplexity’s AI API with large language models (LLMs), offering advanced chat completion capabilities with specialized prompt templates for various applications.

Use cases

Use cases for mcp-perplexity-search include generating technical documentation, analyzing security best practices, conducting code reviews, and creating structured API documentation.

How to use

To use mcp-perplexity-search, configure your MCP client with the required settings, including your Perplexity API key. You can then call the chat_completion tool to generate responses based on predefined or custom prompt templates.
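As a sketch, a chat_completion tool call from an MCP client might carry arguments shaped like the following (the message content is illustrative; parameter names follow the API section below):

```json
{
  "name": "chat_completion",
  "arguments": {
    "messages": [
      { "role": "user", "content": "Explain how JWT refresh tokens work." }
    ],
    "prompt_template": "technical_docs",
    "format": "markdown"
  }
}
```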

Key features

Key features include advanced chat completion using Perplexity’s AI models, predefined prompt templates for common scenarios, support for custom templates, multiple output formats (text, markdown, JSON), optional source URL inclusion, configurable model parameters, and support for various Perplexity models like Sonar and LLaMA.


Content

mcp-perplexity-search


⚠️ Notice

This repository is no longer maintained.

The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.

Please use mcp-omnisearch instead.


A Model Context Protocol (MCP) server for integrating Perplexity’s AI
API with LLMs. This server provides advanced chat completion
capabilities with specialized prompt templates for various use cases.

Features

  • 🤖 Advanced chat completion using Perplexity’s AI models
  • 📝 Predefined prompt templates for common scenarios:
    • Technical documentation generation
    • Security best practices analysis
    • Code review and improvements
    • API documentation in structured format
  • 🎯 Custom template support for specialized use cases
  • 📊 Multiple output formats (text, markdown, JSON)
  • 🔍 Optional source URL inclusion in responses
  • ⚙️ Configurable model parameters (temperature, max tokens)
  • 🚀 Support for various Perplexity models including Sonar and LLaMA

Configuration

This server requires configuration through your MCP client. Here are
examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{
  "mcpServers": {
    "mcp-perplexity-search": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-perplexity-search"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-perplexity-api-key"
      }
    }
  }
}

Claude Desktop with WSL Configuration

For WSL environments, add this to your Claude Desktop configuration:

{
  "mcpServers": {
    "mcp-perplexity-search": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "source ~/.nvm/nvm.sh && PERPLEXITY_API_KEY=your-perplexity-api-key /home/username/.nvm/versions/node/v20.12.1/bin/npx mcp-perplexity-search"
      ]
    }
  }
}

Environment Variables

The server requires the following environment variable:

  • PERPLEXITY_API_KEY: Your Perplexity API key (required)
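When running the server directly rather than through an MCP client config, the key can be supplied in the shell environment; a minimal sketch (the key value is a placeholder):

```shell
export PERPLEXITY_API_KEY="your-perplexity-api-key"
npx -y mcp-perplexity-search
```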

API

The server implements a single MCP tool with configurable parameters:

chat_completion

Generate chat completions using the Perplexity API with support for
specialized prompt templates.

Parameters:

  • messages (array, required): Array of message objects with:
    • role (string): ‘system’, ‘user’, or ‘assistant’
    • content (string): The message content
  • prompt_template (string, optional): Predefined template to use:
    • technical_docs: Technical documentation with code examples
    • security_practices: Security implementation guidelines
    • code_review: Code analysis and improvements
    • api_docs: API documentation in JSON format
  • custom_template (object, optional): Custom prompt template with:
    • system (string): System message for assistant behaviour
    • format (string): Output format preference
    • include_sources (boolean): Whether to include sources
  • format (string, optional): ‘text’, ‘markdown’, or ‘json’ (default:
    ‘text’)
  • include_sources (boolean, optional): Include source URLs (default:
    false)
  • model (string, optional): Perplexity model to use (default:
    ‘sonar’)
  • temperature (number, optional): Output randomness (0-1, default:
    0.7)
  • max_tokens (number, optional): Maximum response length
    (default: 1024)
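Putting the parameters above together, a fuller invocation using a custom template might look like this sketch (all field values are illustrative, not defaults):

```json
{
  "messages": [
    { "role": "system", "content": "You are a concise code reviewer." },
    { "role": "user", "content": "Review this function for error handling gaps." }
  ],
  "custom_template": {
    "system": "Focus on security and edge cases.",
    "format": "markdown",
    "include_sources": true
  },
  "model": "sonar",
  "temperature": 0.2,
  "max_tokens": 512
}
```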

Development

Setup

  1. Clone the repository
  2. Install dependencies:
pnpm install
  3. Build the project:
pnpm build
  4. Run in development mode:
pnpm dev

Publishing

The project uses changesets for version management. To publish:

  1. Create a changeset:
pnpm changeset
  2. Version the package:
pnpm changeset version
  3. Publish to npm:
pnpm release

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see the LICENSE file for details.
