sf-mcp-server
What is sf-mcp-server
sf-mcp-server is an MCP server that enables OpenAI agents to interact seamlessly with the Salesforce.com platform, providing end-to-end communication between a Salesforce organization and a large language model (LLM) agent via the Model Context Protocol (MCP).
Use cases
Use cases for sf-mcp-server include automating customer inquiries through AI-driven chatbots, generating reports from Salesforce data, and enhancing user interactions by leveraging AI capabilities for personalized services.
How to use
To use sf-mcp-server, set up a Salesforce Connected App, configure the necessary environment variables, and run the server using Python. The server exposes core Salesforce REST operations as MCP tools, which can be accessed by the GPT-4-powered agent to perform various tasks.
Key features
Key features of sf-mcp-server include: 1) FastMCP server implementation for efficient communication; 2) Integration with Salesforce for data read/write operations; 3) Autodiscovery of MCP tools by the GPT-4 agent; 4) Support for OAuth 2.0 for secure authentication.
Where to use
sf-mcp-server can be used in various fields such as customer relationship management (CRM), automated customer support, data analysis, and any application requiring interaction between AI agents and Salesforce data.
Salesforce MCP Server
This repository shows end‑to‑end plumbing between a Salesforce org and an LLM agent using the Model Context Protocol (MCP).
- `server.py` — a FastMCP server running on localhost that exposes core Salesforce REST operations as MCP tools and resources.
- `agent.py` — a GPT‑4‑powered assistant that autodiscovers those tools (via SSE or streamable‑HTTP) and calls them when needed.
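The division of labor above can be sketched with a minimal tool-registry pattern. This is illustrative only — the real server uses the FastMCP SDK rather than this hand-rolled registry, and `soql_query` is a hypothetical tool name:

```python
from typing import Callable

# Hypothetical stand-in for FastMCP's tool registration: the server keeps a
# registry of named tools with descriptions, and the agent discovers them.
TOOLS: dict[str, dict] = {}

def tool(description: str) -> Callable:
    """Decorator that registers a function as a discoverable MCP-style tool."""
    def register(fn: Callable) -> Callable:
        TOOLS[fn.__name__] = {"description": description, "fn": fn}
        return fn
    return register

@tool("Run a SOQL query against the Salesforce org")
def soql_query(soql: str) -> dict:
    # A real implementation would call the Salesforce REST API here.
    return {"query": soql, "records": []}

def list_tools() -> list[dict]:
    """The agent-side autodiscovery step boils down to listing the registry."""
    return [{"name": n, "description": t["description"]} for n, t in TOOLS.items()]
```

The agent never hard-codes tool names: it calls `list_tools()` (over SSE or streamable-HTTP in the real setup) and lets the model pick which tool to invoke.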
## Table of Contents
- Prerequisites
- Folder layout
- Salesforce Connected‑App setup
- Environment variables
- Quick start
- Testing the server
- Troubleshooting
- Deploying to production
- License
## Prerequisites
| Tool / account | Why you need it |
|---|---|
| Python 3.11+ | Language features & wheels for FastMCP / OpenAI Agents |
| Salesforce org (prod or sandbox) | Where the MCP server will read/write data |
| OpenAI account & API key | The agent runs on GPT‑4o or any function‑calling‑capable model |
| (Optional) uv | Ultra‑fast Python package manager (can replace pip) |
## Folder layout
```
salesforce-mcp-demo/
├── server.py          # FastMCP server (localhost)
├── agent.py           # GPT‑4o agent
├── requirements.txt   # Python deps
├── .env.example       # copy → .env, add secrets
└── README.md          # this doc
```
## Salesforce Connected‑App setup
1. Setup → Apps → App Manager → New Connected App
2. OAuth settings
   - Enable OAuth 2.0 & choose Client Credentials (or JWT Bearer).
   - Add the `api` scope.
3. Click Save, then copy the Consumer Key & Consumer Secret.
4. Under Manage → Edit Policies
   - Permitted Users → Admin approved users are pre‑authorized
   - Assign a locked‑down Integration User permission set.
🛡️ Least‑privilege matters. Give the Integration User access only to the objects & fields you want the LLM to touch.
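For reference, the Client Credentials grant set up above boils down to a single POST to the token endpoint. A minimal stdlib sketch, assuming the Consumer Key/Secret and token URL are the values you configure in `.env` (the helper names here are illustrative, not part of the repo):

```python
import json
import urllib.parse
import urllib.request

def build_token_request(client_id: str, client_secret: str, token_url: str) -> urllib.request.Request:
    """Build the OAuth 2.0 client-credentials token request for Salesforce."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        token_url,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

def fetch_access_token(client_id: str, client_secret: str, token_url: str) -> str:
    """Exchange the credentials for a short-lived access token (network call)."""
    req = build_token_request(client_id, client_secret, token_url)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The returned `access_token` is then sent as a `Authorization: Bearer …` header on every Salesforce REST call.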
## Environment variables
Create `.env` from the template and fill in the blanks:
```
# ── Salesforce ───────────────────────────────
SF_CLIENT_ID=
SF_CLIENT_SECRET=
SF_TOKEN_URL=https://login.salesforce.com/services/oauth2/token  # sandbox? use https://test.salesforce.com/...
SF_API_VERSION=60.0

# ── OpenAI ───────────────────────────────────
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-4o-mini  # any model that supports function calling
```
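Failing fast on missing configuration saves debugging time later. A sketch of how the server might validate these variables at startup — the variable names come from the template above, but the `load_config` helper itself is an assumption, not code from the repo:

```python
import os

# Settings that have no sensible default and must be supplied
REQUIRED = ("SF_CLIENT_ID", "SF_CLIENT_SECRET", "SF_TOKEN_URL", "OPENAI_API_KEY")

def load_config() -> dict:
    """Read settings from the environment, raising early if any required one is absent."""
    missing = [name for name in REQUIRED if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing env vars: {', '.join(missing)}")
    cfg = {name: os.environ[name] for name in REQUIRED}
    # Optional settings fall back to the defaults shown in the template
    cfg["SF_API_VERSION"] = os.environ.get("SF_API_VERSION", "60.0")
    cfg["OPENAI_MODEL"] = os.environ.get("OPENAI_MODEL", "gpt-4o-mini")
    return cfg
```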
## Quick start
```shell
# 1 · clone & enter folder
$ git clone https://github.com/your-org/salesforce-mcp-demo.git
$ cd salesforce-mcp-demo

# 2 · create / activate virtual-env
$ python -m venv .venv && source .venv/bin/activate   # PowerShell: .\.venv\Scripts\activate

# 3 · install deps (pip or uv)
$ pip install -r requirements.txt
# or: uv pip install -r requirements.txt

# 4 · copy env file & add secrets
$ cp .env.example .env && nano .env

# 5 · start the agent (this also starts the MCP server as a subprocess)
$ python agent.py
```
If everything is wired up you’ll see lines like:
```
=== Agent reply ===
Here are some Leads:
1. **Bertha Boxer** - Company: Farmers Coop. of Florida - Status: Working - Contacted
2. **Phyllis Cotton** - Company: Abbott Insurance - Status: Open - Not Contacted
3. **Jeff Glimpse** - Company: Jackson Controls - Status: Open - Not Contacted
4. **Mike Braund** - Company: Metropolitan Health Services - Status: Open - Not Contacted
5. **Patricia Feager** - Company: International Shipping Co. - Status: Working - Contacted
```
## Deploying to production
- Dockerise (see the commented `Dockerfile` in the repo) and push to Render / Fly.io / Cloud Run.
- Front with an API gateway (NGINX, AWS ALB, Cloud Armor) for TLS, rate‑limits, WAF.
- Rotate tokens — switch to OAuth 2.0 JWT Bearer or the client‑credentials flow with short‑lived tokens.
- Observability — log every `tool.invoke` to Salesforce Event Monitoring or an external SIEM.
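One lightweight way to get that audit trail is to wrap each tool function in a logging decorator before it is registered. A sketch — `get_leads` is a hypothetical tool, and shipping the log records to Event Monitoring or a SIEM is left to a real logging handler:

```python
import functools
import logging

logger = logging.getLogger("mcp.audit")

def audited(fn):
    """Log every tool invocation (name + arguments) before delegating to the tool."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        logger.info("tool.invoke name=%s args=%r kwargs=%r", fn.__name__, args, kwargs)
        return fn(*args, **kwargs)
    return wrapper

@audited
def get_leads(limit: int = 5) -> list:
    # Placeholder for a real Salesforce REST call
    return [{"Name": "Bertha Boxer"}][:limit]
```

Because `functools.wraps` preserves the tool's name and docstring, the decorated function still registers and autodiscovers exactly like the undecorated one.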
## Model Context Protocol details
## License
MIT © 2025 Gianluca Tessitore / Atlantic Technologies