Fegis

By @p-funk · MIT License
Fegis is a Model Context Protocol server that gives LLMs structured, persistent memory through customizable cognitive tools defined in your schema.

Overview

What is Fegis

Fegis is a semantic tool building framework and compiler that transforms YAML specifications, called Archetypes, into structured tools for large language models (LLMs). Utilizing the Model Context Protocol (MCP), it compiles Archetypes into schema-validated interfaces that guide content generation through semantic directives.

Use cases

Fegis can be used to create thinking frameworks that facilitate complex reasoning processes, web exploration interfaces for content curation and connection, optimization systems inspired by biological networks, and symbolic reasoning tools that utilize visual languages, such as emojis.

How to use

To use Fegis, install the required tools, clone the repository, and start the Qdrant server. Then configure your MCP client's JSON file with the server command and environment variables. After setup, you can define tools in YAML Archetypes and invoke them through your LLM client.

Key features

Key features of Fegis include the implementation of MCP for semantically rich tool definitions, a semantic programming framework that uses YAML structure to shape language model behavior, and a hybrid memory system combining vector embeddings with structured metadata to create a searchable knowledge graph.
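
The hybrid memory idea can be sketched in a few lines of Python. This is a toy in-memory store, not Fegis's actual implementation: each record carries a dense vector plus structured metadata, and search combines cosine similarity with a metadata filter.

```python
import math

# Toy records: a dense vector plus structured metadata (illustrative only).
memories = [
    {"vec": [0.9, 0.1], "tool": "BiasDetector",         "text": "confirmation bias found"},
    {"vec": [0.2, 0.8], "tool": "UncertaintyNavigator", "text": "high uncertainty in estimate"},
    {"vec": [0.8, 0.3], "tool": "BiasDetector",         "text": "anchoring effect noted"},
]

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def hybrid_search(query_vec, tool=None):
    """Rank by dense similarity, optionally restricted by metadata."""
    hits = [m for m in memories if tool is None or m["tool"] == tool]
    return sorted(hits, key=lambda m: cosine(query_vec, m["vec"]), reverse=True)

top = hybrid_search([1.0, 0.0], tool="BiasDetector")
print(top[0]["text"])
```

In Fegis the dense and sparse vectors come from the configured embedding models and the filtering is done by Qdrant; the sketch only shows why combining the two beats either alone.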

Where to use

Fegis can be applied in contexts requiring enhanced communication and reasoning, such as educational platforms, research tools, content management systems, and any applications utilizing LLMs for generating structured, precise information.

Content

Fegis

Fegis does three things:

  1. Easy-to-write tools - Write prompts in YAML; Fegis compiles them into working MCP tools.
  2. Structured storage - Every tool call is automatically stored in Qdrant with its full context.
  3. Search - The AI can search all previous tool usage by semantic similarity, metadata filters, or direct lookup.
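
The compile step in point 1 can be illustrated with a small sketch (hypothetical, not Fegis's actual compiler): take a tool spec like the YAML Archetypes shown later, expressed here as a plain dict, and emit a JSON-Schema-style definition that an MCP client could validate calls against.

```python
import json

# Hypothetical archetype fragment, as a dict instead of parsed YAML.
spec = {
    "BiasDetector": {
        "description": "Identify cognitive biases through structured self-examination",
        "parameters": ["BiasScope", "IntrospectionDepth"],
    }
}

def compile_tool(name, tool):
    """Turn a tool spec into a JSON-Schema-style MCP tool definition."""
    return {
        "name": name,
        "description": tool["description"],
        "inputSchema": {
            "type": "object",
            "properties": {p: {"type": "string"} for p in tool["parameters"]},
            "required": tool["parameters"],
        },
    }

tool_def = compile_tool("BiasDetector", spec["BiasDetector"])
print(json.dumps(tool_def, indent=2))
```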

Quick Start

# Install uv
# Windows
winget install --id=astral-sh.uv -e

# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone
git clone https://github.com/p-funk/fegis.git

# Start Qdrant
docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant:latest

Configure Claude Desktop

Update claude_desktop_config.json:

{
  "mcpServers": {
    "fegis": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/fegis",
        "run",
        "fegis"
      ],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_API_KEY": "",
        "COLLECTION_NAME": "fegis_memory",
        "EMBEDDING_MODEL": "BAAI/bge-small-en",
        "SPARSE_EMBEDDING_MODEL": "prithivida/Splade_PP_en_v1",
        "ARCHETYPE_PATH": "/absolute/path/to/fegis/archetypes/default.yaml",
        "AGENT_ID": "claude_desktop"
      }
    }
  }
}

Restart Claude Desktop. You’ll have 7 new tools available including SearchMemory.

How It Works

1. Tools from YAML

parameters:
  BiasScope:
    description: "Range of bias detection to apply"
    examples: [confirmation, availability, anchoring, systematic, comprehensive]
  
  IntrospectionDepth:
    description: "How deeply to examine internal reasoning processes"
    examples: [surface, moderate, deep, exhaustive, meta_recursive]
    
tools:
  BiasDetector:
    description: "Identify reasoning blind spots, cognitive biases, and systematic errors in AI thinking patterns through structured self-examination"
    parameters:
      BiasScope:
      IntrospectionDepth:
    frames:
      identified_biases:
        type: List
        required: true
      reasoning_patterns:
        type: List
        required: true
      alternative_perspectives:
        type: List
        required: true
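
As a rough illustration (the exact wire format depends on the MCP client), a call to the compiled BiasDetector tool above might carry arguments and filled-in frames like this; the values are invented for the example:

```json
{
  "tool": "BiasDetector",
  "arguments": {
    "BiasScope": "confirmation",
    "IntrospectionDepth": "deep"
  },
  "frames": {
    "identified_biases": ["confirmation bias in source selection"],
    "reasoning_patterns": ["favoring evidence that supports the initial hypothesis"],
    "alternative_perspectives": ["seek disconfirming sources first"]
  }
}
```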

2. Automatic Memory Storage

Every tool invocation gets stored with:

  • Tool name and parameters used
  • Complete input and output
  • Timestamp and session context
  • Vector embeddings for semantic search
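
A stored record might be assembled roughly as follows; the field names here are illustrative, not Fegis's actual payload schema, and the embedding is stubbed out.

```python
import json
from datetime import datetime, timezone

def make_memory_record(tool_name, arguments, output, agent_id="claude_desktop"):
    """Assemble a memory payload like the one stored per tool invocation.
    Field names are illustrative; in Fegis the vector would be computed by
    the configured dense/sparse embedding models."""
    return {
        "tool": tool_name,
        "arguments": arguments,
        "output": output,
        "agent_id": agent_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # Stubbed here; Qdrant stores the real embedding alongside the payload.
        "vector": None,
    }

record = make_memory_record(
    "BiasDetector",
    {"BiasScope": "confirmation", "IntrospectionDepth": "deep"},
    {"identified_biases": ["confirmation bias"]},
)
print(json.dumps(record, indent=2))
```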

3. SearchMemory Tool

"Use SearchMemory and find my analysis of privacy concerns"
"Use SearchMemory and what creative ideas did I generate last week?"  
"Use SearchMemory and show me all UncertaintyNavigator results"
"Use SearchMemory and search for memories about decision-making"

Available Archetypes

  • archetypes/default.yaml - Cognitive analysis tools (UncertaintyNavigator, BiasDetector, etc.)
  • archetypes/simple_example.yaml - Basic example tools
  • archetypes/emoji_mind.yaml - Symbolic reasoning with emojis
  • archetypes/slime_mold.yaml - Network optimization tools
  • archetypes/vibe_surfer.yaml - Web exploration tools

Configuration

Required environment variables:

  • ARCHETYPE_PATH - Path to YAML archetype file
  • QDRANT_URL - Qdrant database URL (default: http://localhost:6333)

Optional environment variables:

  • COLLECTION_NAME - Qdrant collection name (default: fegis_memory)
  • AGENT_ID - Identifier for this agent (default: default-agent)
  • EMBEDDING_MODEL - Dense embedding model (default: BAAI/bge-small-en)
  • SPARSE_EMBEDDING_MODEL - Sparse embedding model (default: prithivida/Splade_PP_en_v1)
  • QDRANT_API_KEY - API key for remote Qdrant (default: empty)
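
If you run the server directly (e.g. `uv run fegis` from the repo) rather than through Claude Desktop's config, the same settings can be supplied as environment variables. Paths are placeholders:

```shell
# Required (point ARCHETYPE_PATH at a real archetype file)
export ARCHETYPE_PATH="/absolute/path/to/fegis/archetypes/default.yaml"
export QDRANT_URL="http://localhost:6333"

# Optional overrides (defaults shown)
export COLLECTION_NAME="fegis_memory"
export AGENT_ID="default-agent"
```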

Requirements

  • Python 3.13+
  • uv package manager
  • Docker (for Qdrant)
  • MCP-compatible client

License

MIT License - see LICENSE file for details.
