
spring-ai-java-docker-mcp-rag

By @shelajevon · 10 months ago · MIT-0 license · Category: AI Systems
Sample Spring Boot application using Docker Model Runner to help adopt dogs! Even neurotic ones.

Overview

What is spring-ai-java-docker-mcp-rag?

spring-ai-java-docker-mcp-rag is a sample Spring Boot application that utilizes Docker to run AI models, specifically designed to assist in dog adoptions, including those with behavioral issues.

Use cases

Use cases include managing dog adoption inquiries, scheduling appointments for potential adopters, and providing conversational AI support to users seeking information about available dogs.

How to use

To use spring-ai-java-docker-mcp-rag, deploy the application using Docker, configure the PostgreSQL database for vector storage, and interact with the adoptions and scheduling services through the provided API endpoints.

Key features

Key features include AI/ML capabilities via Docker Model Runner, conversation management with Spring AI, PostgreSQL with pgvector for efficient vector storage, and two dedicated services for handling dog adoption inquiries and scheduling appointments.
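The "efficient vector storage" feature works by comparing embedding vectors for similarity. In the real application Spring AI and pgvector handle this internally; the following plain-Java sketch only illustrates the underlying idea, with made-up 3-dimensional "embeddings" and hypothetical document texts:

```java
import java.util.Map;

// Plain-Java sketch of the similarity search a vector store performs.
// The documents and their toy 3-dimensional "embeddings" are invented
// for illustration; real embeddings (e.g. from ai/mxbai-embed-large)
// have hundreds of dimensions.
public class SimilaritySketch {

    // Cosine similarity between two vectors of equal length.
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Return the document whose embedding is closest to the query embedding.
    static String closest(double[] query, Map<String, double[]> docs) {
        String best = null;
        double bestScore = -1;
        for (var e : docs.entrySet()) {
            double score = cosine(query, e.getValue());
            if (score > bestScore) {
                bestScore = score;
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Map<String, double[]> docs = Map.of(
                "Prancer is a neurotic but lovable dog", new double[]{0.9, 0.1, 0.0},
                "Opening hours of the London location", new double[]{0.0, 0.2, 0.9});
        // A query embedding pointing in roughly the first document's direction.
        System.out.println(closest(new double[]{0.8, 0.2, 0.1}, docs));
    }
}
```

pgvector performs the same nearest-neighbor comparison inside PostgreSQL, over indexed embedding columns, so the application never loads all vectors into memory.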

Where to use

spring-ai-java-docker-mcp-rag can be used in the animal adoption sector, particularly in organizations or shelters that aim to enhance their adoption processes through AI-driven solutions.

Content

Sample: Spring AI with Docker Model Runner and MCP

A Spring Boot application that provides an AI-powered dog adoption service using:

  • Docker Model Runner for AI/ML capabilities
  • Spring AI for conversation management
  • PostgreSQL with pgvector for vector storage
  • Two services:
    • Adoptions service: Handles dog adoption inquiries
    • Scheduling service: MCP Server that manages adoption appointments

Note: This is a fork of the sample app in AWS Samples.
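MCP clients and servers exchange JSON-RPC 2.0 messages; a tool invocation uses the `tools/call` method. The sketch below assembles such a message in plain Java. The tool name `scheduleAppointment` and its arguments are hypothetical, chosen only to match the spirit of this sample's scheduling service:

```java
// Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
// The method name "tools/call" comes from the MCP specification; the tool
// name and arguments below are hypothetical.
public class McpCallSketch {

    // Build a tools/call request; argsJson must already be valid JSON.
    static String toolsCall(int id, String tool, String argsJson) {
        return "{\"jsonrpc\":\"2.0\",\"id\":" + id
                + ",\"method\":\"tools/call\""
                + ",\"params\":{\"name\":\"" + tool + "\",\"arguments\":" + argsJson + "}}";
    }

    public static void main(String[] args) {
        // Hypothetical tool the scheduling MCP server might expose.
        System.out.println(toolsCall(1, "scheduleAppointment",
                "{\"dog\":\"Prancer\",\"location\":\"London\"}"));
    }
}
```

In the actual application, Spring AI's MCP client builds and sends these messages for you; the chat model merely decides which tool to call and with which arguments.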

Architecture

```mermaid
sequenceDiagram
    actor User
    participant Controller as ConversationalController
    participant Memory as ChatMemory
    participant RAG as QuestionAnswerAdvisor
    participant Vector as VectorStore
    participant Chat as ChatClient
    participant MCP as MCPSyncClient
    participant AI as Docker Model Runner

    User->>Controller: POST /{id}/inquire

    alt New conversation
        Controller->>Memory: computeIfAbsent(id)
        Memory-->>Controller: Create new PromptChatMemoryAdvisor
    end

    par RAG Process
        Controller->>RAG: Process question
        RAG->>Vector: Search relevant context
        Vector-->>RAG: Return matching embeddings
        RAG-->>Controller: Return augmented prompt
    and Memory Management
        Controller->>Memory: Get conversation history
        Memory-->>Controller: Return chat context
    end

    Controller->>Chat: prompt().user(question)

    Chat->>MCP: Synchronous tool callback
    MCP-->>Chat: Return tool results

    Chat->>AI: Send augmented prompt + context
    AI-->>Chat: Generate response

    Chat-->>Controller: Return content
    Controller->>Memory: Store conversation
    Controller-->>User: Return response

    Note over RAG,Vector: Retrieval Augmented Generation
    Note over Memory: Maintains conversation state
    Note over MCP: Handles scheduled operations
```
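The `computeIfAbsent(id)` step in the diagram is a standard Java pattern: look up the memory for a conversation id and lazily create it on first use. In the real application a Spring AI `PromptChatMemoryAdvisor` plays this role; the `List`-of-strings stand-in below is a minimal sketch of the pattern only:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the per-conversation memory lookup from the sequence diagram.
// A ConcurrentHashMap keyed by conversation id creates the history on
// first use and returns the same instance on every later request.
public class MemorySketch {

    private final Map<String, List<String>> memory = new ConcurrentHashMap<>();

    List<String> conversation(String id) {
        // computeIfAbsent runs the factory only when the key is missing.
        return memory.computeIfAbsent(id, k -> new ArrayList<>());
    }

    public static void main(String[] args) {
        MemorySketch sketch = new MemorySketch();
        sketch.conversation("2").add("Do you have any neurotic dogs?");
        sketch.conversation("2").add("When could I adopt Prancer?");
        // Both calls returned the same list, so the history accumulates.
        System.out.println(sketch.conversation("2").size());
    }
}
```

Using `ConcurrentHashMap` matters here because a web controller handles requests on multiple threads; `computeIfAbsent` guarantees the factory runs at most once per key even under concurrent access.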

Setup

To run locally you will need:

  • JDK 23 or higher
  • Docker Desktop with Model Runner enabled

1. Pull the models used in this sample application into Docker Model Runner:

```shell
docker model pull ai/mxbai-embed-large
docker model pull ai/qwen2.5:7B-Q4_K_M
```

2. Build the Scheduling MCP Server as a Docker container:

```shell
cd scheduling && ./mvnw spring-boot:build-image && cd ..
```

Running

This sample includes tests and a “test” main application which will start the dependency services (postgres with pgvector and the scheduling MCP server) in Docker with Testcontainers.

First make sure you are in the adoptions directory:

```shell
cd adoptions
```

Run the tests:

```shell
./mvnw test
```

Run the “adoptions” server:

```shell
./mvnw spring-boot:test-run
```

With the server started you can now make requests to the server.
In IntelliJ, open the resources/client.http file and run the two requests.
Or via curl:

```shell
curl -X POST --location "http://localhost:8080/2/inquire" \
    -H "Content-Type: application/x-www-form-urlencoded" \
    -d 'question=Do you have any neurotic dogs?'

curl -X POST --location "http://localhost:8080/2/inquire" \
    -H "Content-Type: application/x-www-form-urlencoded" \
    -d 'question=fantastic. when could i schedule an appointment to adopt Prancer, from the London location?'
```
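The same request can be issued from Java with the standard `java.net.http.HttpClient`. The sketch below builds the form-encoded POST shown above; actually sending it of course requires the adoptions server to be running on port 8080:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;

// Java equivalent of the curl calls above, shown up to building the request.
public class InquireRequest {

    static HttpRequest inquire(String conversationId, String question) {
        // The /{id}/inquire endpoint expects a form-encoded "question" field.
        String form = "question=" + URLEncoder.encode(question, StandardCharsets.UTF_8);
        return HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/" + conversationId + "/inquire"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = inquire("2", "Do you have any neurotic dogs?");
        System.out.println(req.method() + " " + req.uri());
        // Send with: HttpClient.newHttpClient()
        //     .send(req, java.net.http.HttpResponse.BodyHandlers.ofString());
    }
}
```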
