MCP Explorer

Deep-Co

@succlz123 · a year ago
29 · MIT
Free · Community
AI Systems
#ai #anthropic #claude #compose #coze #deepseek #dify #doubao #gemini #llm #mcp #mcp-client #mcp-server #ollama #openai #openrouter #qwen #sillytavern #tts
A Chat Client for LLM, written in Compose Multiplatform.

Overview

What is Deep-Co

Deep-Co is a chat client for interacting with various large language models (LLMs), targeting multiple platforms including desktop and mobile. It is built with Compose Multiplatform and supports a diverse range of API providers such as OpenRouter, Anthropic, Grok, OpenAI, and more. The application lets users configure their own LLMs and provides features for managing chat interactions and prompts.

Use cases

Deep-Co can be used for various applications that require interaction with LLMs, such as customer support, content generation, and personal assistance. Users can engage in real-time chat, manage conversation history, and customize prompts. The support for multiple LLMs allows users to choose the model that best fits their needs, making it versatile for different use cases.

How to use

To use Deep-Co, users need to configure their preferred LLM API key in the application settings. Once set up, they can begin chatting with the selected model. Users can manage prompts, engage with characters from SillyTavern, and customize themes according to their preferences. The application also includes options for exporting chats and integrating with MCP servers.

Key features

Key features of Deep-Co include support for multiple desktop platforms, chat functionality (stream and complete), prompt management, character adaptations from SillyTavern, integration with various LLMs like DeepSeek and Google Gemini, Text-to-Speech capabilities, and internationalization support for different languages and themes.

Where to use

Deep-Co runs on various desktop operating systems including Windows, macOS, and Linux, and aims to support the mobile platforms Android and iOS in future updates. Its versatility allows it to be used in personal projects, educational environments, or professional settings where interacting with LLMs is beneficial.

Content

Deep-Co


A Chat Client for LLMs, written in Compose Multiplatform. It supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek,
Coze, Dify, Google Gemini, etc. You can also configure any OpenAI-compatible API or use local models via LM Studio/Ollama.
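To illustrate what "OpenAI-compatible" means here, the sketch below builds the standard request body that such providers accept at `<base_url>/chat/completions`. The base URLs in the comments are the documented defaults for Ollama and LM Studio; the model name is just an example and is not something Deep-Co itself defines.

```python
import json

# Ollama serves an OpenAI-compatible API at http://localhost:11434/v1 by default;
# LM Studio uses http://localhost:1234/v1. The model name below is an example.

def build_chat_request(model: str, user_message: str, stream: bool = False) -> dict:
    """Build the JSON body for a POST to <base_url>/chat/completions."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        # stream=True yields token-by-token chunks; False returns one complete reply,
        # matching the "Chat (Stream & Complete)" modes listed in the features.
        "stream": stream,
    }

body = build_chat_request("llama3", "Hello!")
print(json.dumps(body, indent=2))
```

Any client that emits this shape can talk to any of the providers above by swapping the base URL and API key.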

Release

v1.0.6

Feature

  • [x] Desktop Platform Support (Windows/macOS/Linux)
  • [ ] Mobile Platform Support (Android/iOS)
  • [x] Chat (Stream & Complete) / Chat History
  • [ ] Chat Messages Export / Chat Translate Server
  • [x] Prompt Management / User-Defined Prompts
  • [x] SillyTavern Character Adaptation (PNG & JSON)
  • [x] DeepSeek LLM / Grok LLM / Google Gemini LLM
  • [ ] Claude LLM / OpenAI LLM / Ollama LLM
  • [ ] Online API polling
  • [x] MCP Support
  • [ ] MCP Server Market
  • [ ] RAG
  • [x] TTS (Edge API)
  • [x] i18n (Chinese/English) / App Color Theme / App Dark & Light Theme
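On the SillyTavern character adaptation above: SillyTavern PNG character cards embed the character definition as base64-encoded JSON in a PNG tEXt chunk keyed `chara`. The sketch below skips PNG chunk parsing and shows only the encode/decode of that payload; the sample card data is made up for illustration and is not Deep-Co's actual implementation.

```python
import base64
import json

# A made-up character definition, in the JSON shape SillyTavern cards use.
sample_card = {"name": "Aria", "description": "A friendly guide.", "first_mes": "Hi!"}

# What the 'chara' tEXt chunk's value would look like inside the PNG:
chunk_value = base64.b64encode(json.dumps(sample_card).encode("utf-8"))

def decode_chara_chunk(value: bytes) -> dict:
    """Decode the base64-encoded JSON payload of a 'chara' tEXt chunk."""
    return json.loads(base64.b64decode(value).decode("utf-8"))

card = decode_chara_chunk(chunk_value)
print(card["name"])  # -> Aria
```

JSON-format cards carry the same object directly, which is why the feature list covers both PNG and JSON.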

Screenshots (images not shown):

  • Chat With LLMs
  • Config Your LLM API Key
  • Prompt Management
  • Chat With Tavern Character
  • User Management
  • Config MCP Servers
  • Settings

Model Context Protocol (MCP) Environment

macOS

brew install uv
brew install node

Windows

winget install --id=astral-sh.uv -e
winget install OpenJS.NodeJS.LTS
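uv and Node.js are needed because local MCP servers are typically launched via `uvx` (Python servers) or `npx` (Node servers). As a hedged illustration of what an MCP server entry looks like, below is the `mcpServers` JSON shape used by most MCP clients; Deep-Co's own configuration UI or file format may differ, and the filesystem server and path are examples only.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"],
      "env": {}
    }
  }
}
```

The client spawns the `command` with its `args` and communicates with the server over stdio.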

Build

Run desktop via Gradle

./gradlew :desktopApp:run

Building desktop distribution

./gradlew :desktopApp:packageDistributionForCurrentOS
# outputs are written to desktopApp/build/compose/binaries

Run Android via Gradle

./gradlew :androidApp:installDebug

Building Android distribution

./gradlew clean :androidApp:assembleRelease
# outputs are written to androidApp/build/outputs/apk/release

