What is LFX_T2_WasmEdge_MCP_Pre_Test_Attempt
LFX_T2_WasmEdge_MCP_Pre_Test_Attempt is a pre-test code repository for the LFX Term 2 WasmEdge mentorship program, aimed at creating an MCP-based AI agent to assist in LF certification preparation.
Use cases
Use cases include creating AI agents that assist users in finding relevant questions for LF certification preparation, demonstrating MCP capabilities, and integrating external tools through the AI agent.
How to use
To use LFX_T2_WasmEdge_MCP_Pre_Test_Attempt, clone the repository and set up the environment variables for model switching. You can run the project locally using the Dockerized LlamaEdge server for tool-calling capabilities.
Key features
Key features include a modular design for easy addition of new MCP servers, semantic question search using advanced text embedding models, easy model switching via environment variables, and a custom dataset stored in a sqlite3 database.
Where to use
LFX_T2_WasmEdge_MCP_Pre_Test_Attempt can be used in educational and training environments, particularly for preparing for LF certification exams and developing AI agents that utilize MCP.
Pre-test attempt for LFX mentorship (2025/term2) WasmEdge
About
This is a pre-test attempt for the LFX mentorship program, specifically for the project “Create an MCP-based AI agent to help LF certificate preparation” [1].
Project Features
- Modular design: The project is designed to be modular, allowing for easy addition of new MCP servers and tools.
- Semantic question search: Rather than using traditional keyword search, the project uses semantic search to find relevant questions. This is done using Gemini's `text-embedding-004` [2] text embedding model and a `sqlite3` database with the `sqlite_vec` [3] extension to store the vectors (see the sketch after this list).
- Easy model switching: The project allows for easy switching between different models (remote & local), simply by updating the environment variables. Supports the LlamaEdge API server [4].
- Custom dataset: The project uses a custom dataset of questions and answers, which is stored in a sqlite3 database. The `jsnad_qna.csv` dataset is a sample dataset created for the OpenJS Node.js Application Developer (JSNAD) exam. It is copyright compliant as it was generated using frontier models.
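For illustration, here is a minimal sketch of the embedding-plus-`sqlite_vec` search described above. The table and column names are hypothetical, not the repository's actual schema; `text-embedding-004` returns 768-dimensional vectors.

```python
# Illustrative sketch of semantic search with sqlite-vec.
# Table/column names are hypothetical, not this repository's actual schema.
import sqlite3

import sqlite_vec  # pip install sqlite-vec

db = sqlite3.connect("qna.db")
db.enable_load_extension(True)
sqlite_vec.load(db)  # load the sqlite-vec extension into this connection
db.enable_load_extension(False)

# text-embedding-004 produces 768-dimensional embeddings.
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS qna_vec USING vec0(embedding float[768])")

def top_k_questions(query_embedding: list[float], k: int = 5) -> list[tuple[int, float]]:
    """Return (rowid, distance) for the k stored questions nearest the query."""
    return db.execute(
        "SELECT rowid, distance FROM qna_vec WHERE embedding MATCH ? AND k = ? ORDER BY distance",
        (sqlite_vec.serialize_float32(query_embedding), k),
    ).fetchall()
```

The returned rowids can then be joined back to the question/answer rows stored alongside the vectors.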
Goal: To create an AI agent that can assist in LF certificate preparation using MCP (Model Context Protocol) [5].
Objectives:
- To demonstrate the ability to create an AI agent capable of handling external tools via MCP.
- To demonstrate an understanding of topics like MCP, AI agents, and tool usage.
Dockerized LlamaEdge Server (Optional Local LLM)
For running a local LLM with tool-calling capabilities, a pre-built Docker image for the LlamaEdge API server featuring the Meta Llama 3.2 3B Instruct model is available on Docker Hub:
- Docker Hub Link: mayureshdev/llama-3.2-3b
You can easily pull and run this image to start an OpenAI-compatible chat API server locally or on any machine with Docker. This server supports tool calls, enabling integration with the MCP server.
To run the Dockerized server:
```bash
# Pull the image
docker pull mayureshdev/llama-3.2-3b:latest

# Run the container (ensure you have sufficient RAM, ~8GB+ recommended)
docker run -d -p 8080:8080 --name llamaedge-server-3b mayureshdev/llama-3.2-3b:latest
```
The API will then be available at `http://localhost:8080`. You can configure `LLM_API_BASE_URL` in your `.env.local` file to `http://localhost:8080/v1` and `LLM_MODEL` to `Llama-3.2-3b` (or as defined by the server) to use this local LLM.
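As a quick sanity check, the local server can be exercised like any OpenAI-compatible endpoint. A minimal sketch, assuming the `openai` Python package is installed (it is not part of this README's documented setup):

```python
# Minimal smoke test against the local LlamaEdge server started above.
from openai import OpenAI

# The local server does not check the API key, so any placeholder works.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="Llama-3.2-3b",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```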
Detailed instructions for building this and other LlamaEdge Docker images (e.g., for Llama 3.1 8B) can be found in `llamaedge/README.md`.
Screenshots
Architecture Diagram
```mermaid
flowchart TD
    User([User])
    Client([Client])
    LLM([LLM API Server])
    MCP([MCP Server])
    DB[(SQLite DB<br/>+ Embeddings)]
    CSV[[Q&A Dataset]]

    User -- types query --> Client
    Client -- prompt & tool schema --> LLM
    LLM -- tool call --> Client
    Client -- tool call --> MCP
    MCP -- semantic search --> DB
    DB -- loads from --> CSV
    MCP -- tool result --> Client
    Client -- tool result --> LLM
    LLM -- answer --> Client
    Client -- displays --> User
```
How to run
It’s very simple to get things up and running. Just follow the steps below:
- Clone the repository:

```bash
git clone https://github.com/Mayuresh-22/LFX_T2_WasmEdge_MCP_Pre_Test_Attempt.git
cd LFX_T2_WasmEdge_MCP_Pre_Test_Attempt
```

- Install the required dependencies:

```bash
uv venv
.venv/Scripts/activate
uv sync
```

- Update the environment variables: add your environment variables to the `.env.local.example` file and rename it to `.env.local` (see the example after this list).
- Run the application:

```bash
uv run client.py
```

- Follow the instructions in the terminal to interact with the AI agent.
- To stop the chat loop, type “exit” or “quit”.
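For reference, a plausible `.env.local` for the local Docker setup described earlier might look like the following. Only `LLM_API_BASE_URL` and `LLM_MODEL` are named in this README, so treat the keys in `.env.local.example` as authoritative for anything else.

```env
# Hypothetical .env.local values for the local LlamaEdge server described above
LLM_API_BASE_URL=http://localhost:8080/v1
LLM_MODEL=Llama-3.2-3b
```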
Pre-test completion status
| Task | Status |
|---|---|
| Create a LlamaEdge API server using a tool-call enabled open-source LLM | ✅ |
| Create an MCP server that “searches” questions and answers from a text file | ✅ |
| Create a simple Python application that: | |
| • takes user input | ✅ |
| • calls the LLM API server | ✅ |
| • handles LLM tool calls using MCP | ✅ |
| • sends LLM response back to the user | ✅ |
| Explain your choice of test subjects and how you plan to get the source questions and answers (must be copyright compliant) | ✅ |
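The tool-call handling rows above are the heart of the client. A minimal sketch of how such a loop can be wired together, assuming the official `mcp` Python SDK and an OpenAI-compatible client; tool, file, and model names here are hypothetical, not the repository's actual identifiers:

```python
# Illustrative agent loop: user input -> LLM -> MCP tool call -> tool result -> LLM -> user.
# Tool, file, and model names are hypothetical, not this repository's actual identifiers.
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import OpenAI

llm = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# Tool schema advertised to the LLM, mirroring the MCP server's search tool.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "search_questions",
        "description": "Semantic search over the JSNAD Q&A dataset",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

async def main() -> None:
    # Launch the MCP server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="uv", args=["run", "server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            messages = [{"role": "user", "content": input("You: ")}]
            resp = llm.chat.completions.create(
                model="Llama-3.2-3b", messages=messages, tools=TOOLS
            )
            msg = resp.choices[0].message
            if msg.tool_calls:
                # Forward the LLM's tool call to the MCP server, then feed the result back.
                call = msg.tool_calls[0]
                result = await session.call_tool(
                    call.function.name, json.loads(call.function.arguments)
                )
                messages.append(msg)
                messages.append({
                    "role": "tool",
                    "tool_call_id": call.id,
                    "content": result.content[0].text,
                })
                msg = llm.chat.completions.create(
                    model="Llama-3.2-3b", messages=messages
                ).choices[0].message
            print("Agent:", msg.content)

asyncio.run(main())
```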
Original Work & Licensing
This project represents my original work and effort created for the LFX Mentorship (2025/Term2) WasmEdge pre-test. This work is licensed under the MIT License.
Dataset Attribution
The `jsnad_qna.csv` dataset, used for OpenJS Node.js Application Developer (JSNAD) exam preparation, was generated using frontier models. The generation process referenced the Node.js Official Documentation, which is available under the MIT License.
- Node.js Official Documentation: https://nodejs.org/api/
Thank You!
Thank you for taking the time to review my pre-test submission. I’m genuinely excited about the possibility of joining the mentorship program and eager to learn, grow, and contribute.
If you have any feedback or suggestions, I’d love to hear them!
Feel free to contact me via [email protected] or LinkedIn.