Botanika Desktop
What is Botanika Desktop
Botanika-desktop is a local client for large language models (LLMs) and associated tooling, built with support for the Model Context Protocol (MCP). All data is stored locally, so users retain control over their information and privacy.
Use cases
Use cases for botanika-desktop include developing applications that require natural language processing, creating content with AI assistance, conducting research that involves data analysis, and providing personalized educational tools.
How to use
To use botanika-desktop, set your environment variables in the `.env` file, install dependencies with `npm install`, and start the application with `npm run dev`. You must also provide your own API keys for LLM providers such as OpenAI or Groq.
Key features
Key features of botanika-desktop include local data storage, support for multiple LLM providers, a desktop application interface, and MCP compatibility. It supports both Text-to-Speech (TTS) and Speech-to-Text (STT), with optional local transcription via Whisper.
Where to use
Botanika-desktop can be used in various fields such as software development, content creation, education, and research, where local data processing and privacy are paramount.
Clients Supporting MCP
The following are the main client software that supports the Model Context Protocol. Click the link to visit the official website for more information.
Botanika
A local LLM + tooling (with MCP support) client. All data is stored locally. Bring your own API keys.
Client Features
| Client | TTS | STT | Open source | MCP Support | Desktop App | Web App |
|---|---|---|---|---|---|---|
| Botanika | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| ChatGPT | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Copilot | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
| Claude | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
| T3.Chat | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
Native integrations
If you want to use any of these integrations, add them on the “Settings” page.
| Integration name | MCP Server URL |
|---|---|
| Google Search | http://localhost:48678/mcp/sse/google/search |
| Spotify | http://localhost:48678/mcp/sse/spotify |
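The endpoints above follow one pattern: a fixed local port with a per-integration path under `/mcp/sse`. A minimal sketch of that pattern (the `integrationUrl` helper is illustrative, not part of botanika-desktop; port and paths are taken from the table above):

```typescript
// All native integrations are served from one local MCP SSE base URL.
const MCP_BASE = "http://localhost:48678/mcp/sse";

// Hypothetical helper: build the SSE endpoint for a named integration.
function integrationUrl(integration: string): string {
  return `${MCP_BASE}/${integration}`;
}

console.log(integrationUrl("google/search")); // http://localhost:48678/mcp/sse/google/search
console.log(integrationUrl("spotify"));      // http://localhost:48678/mcp/sse/spotify
```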
Supported LLM providers
| Provider | Notes | API key link | Environment variable |
|---|---|---|---|
| OpenAI | | OpenAI | OPENAI_API_KEY |
| Groq | | Groq | GROQ_API_KEY |
| OpenRouter | | OpenRouter | OPENROUTER_API_KEY |
| Azure | | | AZURE_RESOURCE_NAME, AZURE_API_KEY |
| Ollama | Might not work | | OLLAMA_URL |
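As a rough sketch of how provider selection can key off these variables (the `pickProvider` helper is illustrative, not botanika-desktop's actual code; variable names are from the table above):

```typescript
// Illustrative only: choose a provider based on which of the
// documented environment variables are present.
type Env = Record<string, string | undefined>;

function pickProvider(env: Env): string | null {
  if (env.OPENAI_API_KEY) return "openai";
  if (env.GROQ_API_KEY) return "groq";
  if (env.OPENROUTER_API_KEY) return "openrouter";
  if (env.AZURE_RESOURCE_NAME && env.AZURE_API_KEY) return "azure"; // Azure needs both
  if (env.OLLAMA_URL) return "ollama"; // "might not work", per the table
  return null;
}

console.log(pickProvider({ GROQ_API_KEY: "gsk-placeholder" })); // groq
```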
Transcription
If you don’t want to use OpenAI, you can use Whisper locally. This requires a bit of setup:
Install pnpm, then run the following command and wait until the model has downloaded:

```shell
pnpm whisper-tnode download --model large-v1
```
Run
You can set your environment variables in the `.env` file or through the “Settings” page.

```shell
npm run setup
npm install
npm run dev
```
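Putting it together, a minimal `.env` might look like the fragment below (variable names are from the provider tables; the values are placeholders, not real keys):

```
OPENAI_API_KEY=sk-placeholder
GROQ_API_KEY=gsk-placeholder
```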
LLM provider
An LLM provider is used to generate most responses.
| Provider name | ENV variable | API key link |
|---|---|---|
| OpenAI | OPENAI_API_KEY | OpenAI |
| Groq | GROQ_API_KEY | Groq |