Bibble
What is Bibble
Bibble is a command-line interface (CLI) chatbot application that serves as an MCP (Model Context Protocol) client. It allows users to interact with AI language models directly from the terminal, featuring real-time streaming and chat history.
Use cases
Use cases for Bibble include chatting with AI language models directly from the terminal, giving a model access to external tools by connecting to MCP servers, and supporting educational or development workflows that benefit from conversational AI.
How to use
To use Bibble, install it via npm with the command `npm install -g @pinkpixel/bibble`. After installation, start a chat session by running `bibble` or `npx @pinkpixel/bibble`. Configure settings through the CLI as needed.
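A minimal terminal session showing the commands above (assumes Node.js and npm are already installed):
# Install Bibble globally from npm
npm install -g @pinkpixel/bibble
# Start an interactive chat session
bibble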
Key features
Key features of Bibble include support for OpenAI, Anthropic, Google Gemini, and OpenAI-compatible APIs, real-time response streaming, chat memory for contextual conversations, detailed error handling, and customizable system prompts.
Where to use
Bibble can be used in various fields such as customer support, education, and software development, where interactive AI conversations are beneficial.
Content
Bibble - CLI Chatbot with MCP Integration
Bibble is a command-line interface (CLI) chatbot application built in TypeScript that runs directly in your terminal. It supports OpenAI, Anthropic, Google Gemini, and OpenAI-compatible API endpoints, implements real-time response streaming, maintains chat memory, and functions as an MCP (Model Context Protocol) client.
Features
- Launch as a chat instance via the `bibble` CLI command
- Support for OpenAI, Anthropic (Claude models), Google Gemini, and OpenAI-compatible API endpoints
- Real-time response streaming of model output
- Contextual multi-turn conversations with chat memory
- MCP client functionality for connecting to MCP-compatible servers
- Settings and configuration options accessible from the CLI
- Detailed error handling and user feedback
- Colored text output and markdown rendering
- Chat history storage and retrieval
- Model switching capabilities
- Configurable system prompts and user guidelines
Installation
Prerequisites
- Node.js v16 or higher
- npm v7 or higher
Install from npm
# Install the official package
npm install -g @pinkpixel/bibble
Install from source
- Clone the repository
- Install dependencies: `npm install`
- Build the project: `npm run build`
- Install globally: `npm install -g .`
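Putting the steps above together in one sketch; the repository URL and directory name are placeholders, since they are not given here:
# Clone and build from source, then install the local package globally
git clone <repository-url>
cd <repository-directory>
npm install
npm run build
npm install -g .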
Usage
After installation, you can run Bibble using the `bibble` command. If you installed the package with the `@pinkpixel` scope, you can also use `npx @pinkpixel/bibble`.
Start a chat
bibble
or
bibble chat
With npx:
npx @pinkpixel/bibble
Configure settings
bibble config
Manage chat history
bibble history
Commands
Chat commands
- `bibble chat` - Start a chat session
- `bibble chat --model gpt-4` - Start a chat with a specific model
- `bibble chat --continue` - Continue the most recent chat
- `bibble chat --history <id>` - Load a specific chat history
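For example, to start a session with a specific model and later resume it (the model name is just the one shown above):
# Start a chat using a specific model
bibble chat --model gpt-4
# Later, pick up the most recent conversation where it left off
bibble chat --continue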
Config commands
- `bibble config list` - List all configuration settings
- `bibble config set <key> <value>` - Set a configuration value
- `bibble config get <key>` - Get a configuration value
- `bibble config reset` - Reset configuration to defaults
- `bibble config api-key` - Set up API key for a provider
- `bibble config mcp-servers` - Manage MCP server configurations
- `bibble config user-guidelines` - Configure user guidelines
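A possible configuration workflow is sketched below; the key name passed to `set` and `get` is an assumption for illustration, since the available keys are not listed here:
# Set up an API key for a provider interactively
bibble config api-key
# Inspect all current settings
bibble config list
# Set and read back a value by key ('model' is a hypothetical key name)
bibble config set model gpt-4
bibble config get model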
History commands
- `bibble history list` - List chat history
- `bibble history show <id>` - Show a specific chat history
- `bibble history delete <id>` - Delete a chat history
- `bibble history clear` - Clear all chat history
- `bibble history export <id> <filename>` - Export chat history to a JSON file
- `bibble history import <filename>` - Import chat history from a JSON file
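For example, to export a saved chat to JSON and bring it back later (the `<id>` is a placeholder taken from the history list):
# Find the id of the chat you want to export
bibble history list
# Export it to a JSON file, then import that file later or on another machine
bibble history export <id> my-chat.json
bibble history import my-chat.json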
In-chat commands
The following commands are available during a chat session:
- `/help` - Display help information
- `/exit` or `/quit` - Exit the chat
- `/clear` - Clear the screen
- `/save` - Save the current chat to history
- `/reset` - Reset the current conversation
Configuration
Bibble stores its configuration in a `.bibble` directory in your home directory. The configuration includes:
- API keys
- Default model settings
- UI preferences
- MCP server configurations
- User guidelines (additional instructions for the AI)
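Because the configuration lives in the `.bibble` directory under your home directory, you can inspect it from the shell; the file names inside that directory are not specified here, so this is only a sketch:
# List whatever files Bibble keeps in its config directory
ls ~/.bibble
# Or review the same settings through the CLI
bibble config list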
MCP Integration
Bibble functions as an MCP client, allowing it to connect to MCP-compatible servers and use their tools. MCP (Model Context Protocol) is a protocol for connecting language models to external tools and services.
To configure MCP servers, use:
bibble config mcp-servers
Development
Project structure
/
├── src/
│   ├── commands/      # CLI command handlers
│   ├── config/        # Configuration management
│   ├── mcp/           # MCP client implementation
│   ├── llm/           # LLM integration
│   ├── ui/            # Terminal UI components
│   ├── utils/         # Utility functions
│   ├── index.ts       # Main entry point
│   └── types.ts       # TypeScript type definitions
├── bin/               # Binary executable
├── scripts/           # Helper scripts
├── package.json       # NPM package definition
└── tsconfig.json      # TypeScript configuration
Build the project
npm run build
Development mode with watch
npm run dev
Publishing to npm
The package is published to npm under the `@pinkpixel` scope:
# Login to npm
npm login
# Build the project
npm run build
# Publish the package
npm publish --access public
To install the latest version:
npm install -g @pinkpixel/bibble@latest
License
ISC