MCP Explorer

Tome

@runebookai · 19 days ago
241 · Apache-2.0
Free · Community
AI Systems
#llm #llms #mcp #mcp-servers #ollama #qwen
A magical tool for using local LLMs with MCP servers.

Overview

What is Tome

Tome is a desktop application for macOS and Windows, developed by the Runebook team, that makes it easy to use local LLMs with MCP servers.

Use cases

Use cases for Tome include developing chatbots, experimenting with AI models, and integrating local LLMs into applications.

How to use

To use Tome, install the application along with Ollama, set up your MCP server, and start chatting with your MCP-powered model.

Key features

Key features of Tome include easy management of MCP servers, no manual configuration with uv/npm or JSON files, and quick setup for local LLMs.

Where to use

Tome is primarily used in environments where local LLMs and MCP servers are needed, such as AI development, research, and personal projects.

Content

Tome - Magical AI Spellbook

Tome

a magical desktop app that puts the power of LLMs and MCP in the hands of everyone


🔮 Download the Tome Desktop App: Windows | macOS

Tome is a desktop app that lets anyone harness the magic of LLMs and MCP. Download Tome, connect any local or remote LLM and hook it up to thousands of MCP servers to create your own magical AI-powered spellbook.

🫥 Want it to be 100% local, 100% private? Use Ollama and Qwen3 with only local MCP servers to cast spells in your own pocket universe. ⚡ Want state-of-the-art cloud models with the latest remote MCP servers? You can have that too. It’s all up to you!
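To make the fully local path concrete: a chat in that setup is just a call to Ollama’s local API. The sketch below is an illustration of the setup, not Tome’s internals; it uses the ollama Python package, and the model tag qwen3 is an assumption that should match whatever you have pulled.

```python
# Illustrative sketch of the fully local path (not Tome's internals).
# Assumes Ollama is running locally and a Qwen3 model has been pulled,
# e.g. via `ollama pull qwen3`; the exact model tag is an assumption.
import ollama

response = ollama.chat(
    model="qwen3",
    messages=[{"role": "user", "content": "What can MCP servers do for me?"}],
)
print(response["message"]["content"])
```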

🏗️ This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!

🪄 Features

  • 🧙 Streamlined Beginner Friendly Experience
    • Simply download and install Tome and hook up the LLM of your choice
    • No fiddling with JSON, Docker, Python, or Node
  • 🤖 AI Model Support
    • Remote: Google Gemini, OpenAI, any OpenAI API-compatible endpoint
    • Local: Ollama, LM Studio, Cortex, any OpenAI API-compatible endpoint
  • 🔮 Enhanced MCP support
    • UI to install, remove, turn on/off MCP servers
    • npm, uvx, node, and python MCP servers all supported out of the box (see the sketch after this list)
  • 🏪 Integration with the Smithery.ai registry
    • Thousands of MCP servers available via one-click installation
  • ✏️ Customization of context windows and temperature
  • 🧰 Native support for tool calls and reasoning models
    • UI enhancements that clearly delineate tool calls and thinking messages
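About the MCP bullet above: under the hood, a server installed via npm, uvx, node, or python is a local process that the app talks to over stdio. As a hedged illustration of that mechanism (not Tome’s actual wiring, which may differ), here is how a client can launch the Fetch server from the Quickstart and list its tools using the official mcp Python SDK:

```python
# Illustrative sketch (not Tome's internals): launch an MCP server over
# stdio and list its tools using the official `mcp` Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Same command the Quickstart uses for the Fetch server.
    server = StdioServerParameters(command="uvx", args=["mcp-server-fetch"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```

Tome’s UI does this launching and lifecycle management for you; the sketch only shows what “supporting npm/uvx/node/python servers” means at the protocol level.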

Demo

https://github.com/user-attachments/assets/0775d100-3eba-4219-9e2f-360a01f28cce

Getting Started

Requirements

A local LLM runtime such as Ollama or LM Studio, or an API key for a remote provider such as OpenAI or Gemini.

Quickstart

  1. Install Tome
  2. Connect your preferred LLM provider: OpenAI, Ollama, and Gemini are preset, but you can also add providers like LM Studio by using http://localhost:1234/v1 as the URL (see the sketch after this list)
  3. Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste uvx mcp-server-fetch into the server field).
  4. Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
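A note on step 2: “OpenAI API-compatible” simply means the server speaks the OpenAI chat-completions protocol at the URL you give Tome. As a quick sanity check before connecting (an illustrative sketch, not something Tome requires), you can hit the endpoint with the openai Python client; the model name below is an assumption and should match what your local server has loaded:

```python
# Sketch: sanity-check an OpenAI-compatible local endpoint before adding it.
# Uses LM Studio's default URL from step 2; Ollama's equivalent endpoint is
# http://localhost:11434/v1. Local servers ignore the API key, but the
# client requires a non-empty value. The model name is an assumption.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="qwen3",  # assumption: use whichever model your server has loaded
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
)
print(completion.choices[0].message.content)
```

If this prints a reply, the same URL will work when you add the provider inside Tome.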

Vision

We want to make local LLMs and MCP accessible to everyone. We’re building a tool that allows you to be creative with LLMs, regardless
of whether you’re an engineer, tinkerer, hobbyist, or anyone in between.

Core Principles

  • Tome is local first: You are in control of where your data goes.
  • Tome is for everyone: You shouldn’t have to manage programming languages, package managers, or JSON config files.

What’s Next

We’ve gotten a lot of amazing feedback in the last few weeks since releasing Tome but we’ve got big plans for the future. We want to break LLMs out of their chatbox, and we’ve got a lot of features coming to help y’all do that.

  • Scheduled tasks: LLMs should be doing helpful things even when you’re not in front of the computer.
  • Native integrations: MCP servers are a great way to access tools and information, but we want to add more powerful integrations that let you interact with LLMs in unique ways.
  • App builder: we believe that, in the long term, the best experiences will not live in a chat interface. We plan to add tools that enable you to create powerful applications and workflows.
  • ??? Let us know what you’d like to see! Join our community via the links below; we’d love to hear from you.

Community

Discord · Blog · Bluesky · Twitter

Tools

No tools
