How to Deploy an MCP App to Production
Deploy your sunpeak MCP App to production.
TL;DR: Run sunpeak build to compile your app, then sunpeak start to run the production server. Export an auth() function from src/server.ts to authenticate requests. Point ChatGPT or Claude at your /mcp endpoint. Any Node.js 20+ host works.
Building an MCP App locally with sunpeak dev is fast. Deploying it is a two-command operation — sunpeak build then sunpeak start — but there are details worth getting right before you point real AI hosts at your server: authentication, environment variables, streaming proxy config, and connecting each host. This guide covers all of it.
Step 1: Build for Production
Run sunpeak build from your project root:
sunpeak build
This does three things:
- Compiles each resource in src/resources/ into a self-contained HTML file at dist/{name}/{name}.html, with a metadata sidecar at dist/{name}/{name}.json. The HTML has all JavaScript and CSS inlined — no external script tags, no CDN dependencies. This is what AI hosts load into iframes when your tool is called. The JSON contains the resource URI (with a cache-bust timestamp), title, description, and any _meta config (CSP, permissions).
- Compiles tool handlers from src/tools/ into Node.js ESM modules at dist/tools/{name}.js.
- Compiles src/server.ts (if it exists) into dist/server.js. This is where your auth() function and server config live.
If the build fails, fix the errors before proceeding. A failed build means either a TypeScript error in your resource components or a misconfigured tool file. Both will cause runtime errors in production if not caught here.
Check that dist/ was created with your resources, tools, and server entry:
dist/
├── contact/
│ ├── contact.html ← self-contained resource bundle
│ └── contact.json ← resource metadata (URI, title, _meta)
├── tools/
│ └── show-contact.js ← compiled tool handler
└── server.js ← compiled auth + server config
Step 2: Configure Authentication
Create src/server.ts if you don’t have one. This file is your server entry point — it runs on every MCP request before any tool handler is called.
Export an auth() function that validates the incoming request and returns an AuthInfo object, or null to reject with a 401:
import type { IncomingMessage } from 'node:http';
import type { AuthInfo } from 'sunpeak/mcp';
export async function auth(req: IncomingMessage): Promise<AuthInfo | null> {
const token = req.headers.authorization?.replace('Bearer ', '');
if (!token) {
// Require authentication — reject unauthenticated requests
return null;
}
// Validate the token against your auth provider or database
const user = await verifyToken(token);
if (!user) return null;
return {
token,
clientId: user.id,
scopes: user.scopes,
};
}
export const server = { name: 'My App', version: '1.0.0' };
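The verifyToken helper in the example above is left to you. As one illustration, here is a minimal sketch that verifies an HS256 JWT by hand with node:crypto. In practice you would more likely use a library such as jose or call your auth provider; JWT_SECRET is an assumed environment variable, and the claim names are examples:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

interface User {
  id: string;
  scopes: string[];
}

// Hypothetical helper: verify an HS256 JWT using only node:crypto.
// Swap this for a library (jose, jsonwebtoken) or an auth-provider call.
async function verifyToken(token: string): Promise<User | null> {
  const secret = process.env.JWT_SECRET;
  if (!secret) return null;

  const parts = token.split('.');
  if (parts.length !== 3) return null;
  const [header, payload, signature] = parts;

  // Recompute the signature over "header.payload" and compare in constant time
  const expected = createHmac('sha256', secret)
    .update(`${header}.${payload}`)
    .digest('base64url');
  const a = Buffer.from(signature);
  const b = Buffer.from(expected);
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;

  try {
    const claims = JSON.parse(Buffer.from(payload, 'base64url').toString());
    // Reject expired tokens (exp is seconds since epoch)
    if (typeof claims.exp === 'number' && claims.exp < Date.now() / 1000) return null;
    if (typeof claims.sub !== 'string') return null;
    return { id: claims.sub, scopes: Array.isArray(claims.scopes) ? claims.scopes : [] };
  } catch {
    return null; // payload was not valid base64url JSON
  }
}
```

Whatever you replace this with, keep the constant-time comparison and the null-on-any-failure shape: auth() treats null as a 401.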
The AuthInfo you return is passed to every tool handler as extra.authInfo:
export default async function (args: Args, extra: ToolHandlerExtra) {
const userId = extra.authInfo?.clientId;
// Use userId to scope database queries to the authenticated user
const data = await db.getDataForUser(userId);
return { structuredContent: data };
}
If you don’t need authentication — for example, a public tool that returns generic data — you can return an AuthInfo with an empty token and skip the validation:
export async function auth(req: IncomingMessage): Promise<AuthInfo | null> {
return { token: '', clientId: 'anonymous', scopes: [] };
}
Step 3: Set Environment Variables
Your tool handlers read configuration from process.env. Set these on your server before starting the process. Never commit secrets to your repository.
For a local production test before deploying:
DATABASE_URL=postgres://... API_KEY=sk-... sunpeak start
In production, set environment variables through your hosting platform:
- Docker / docker-compose: environment: block in docker-compose.yml, or the -e KEY=VALUE flag
- systemd: Environment=KEY=VALUE in the service unit file
- Fly.io: fly secrets set DATABASE_URL=postgres://...
- Railway: Project settings → Variables
- AWS ECS: Task definition environment variables
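For the Docker route, a minimal docker-compose.yml sketch might look like this (the service name and variable values are placeholders):

```yaml
services:
  sunpeak-app:
    build: .
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgres://...
      - API_KEY=${API_KEY}  # resolved from the shell environment or an .env file
```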
Read them in tool handlers:
export default async function (args: Args, extra: ToolHandlerExtra) {
const db = new Client(process.env.DATABASE_URL);
// ...
}
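Because handlers only read these variables at request time, a missing one surfaces as a mid-request failure. A small guard run at the top of src/server.ts fails fast at boot instead (the variable names here are examples):

```typescript
// Throw at startup if any required environment variable is missing,
// so misconfiguration surfaces at boot rather than mid-request.
function assertEnv(keys: readonly string[]): void {
  const missing = keys.filter((key) => !process.env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
}

// Example: call once at module load in src/server.ts
// assertEnv(['DATABASE_URL', 'API_KEY']);
```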
Step 4: Start the Production Server
sunpeak start
The MCP server starts on port 8000 by default. Set PORT to change it:
PORT=3000 sunpeak start
Your MCP endpoint is at http://your-server:PORT/mcp. This is the URL you’ll give to ChatGPT and Claude.
The server handles the full MCP protocol from this endpoint:
- Tool and resource manifests — the list of tools and resources your app exposes, read by the host on connection
- Tool calls — the host sends tool call requests, the server validates inputs against your Zod schemas, runs your handler, and returns the result
- Resource HTML — the pre-built HTML bundles from dist/, served to host iframes when a tool returns structured content, or pre-fetched prior to tool calls
You can verify the server is running with the health endpoint:
curl http://localhost:8000/health
Step 5: Reverse Proxy and TLS
AI hosts require your MCP endpoint to use HTTPS. In production, terminate TLS at a reverse proxy (nginx, Caddy, Cloudflare Tunnel) and forward traffic to sunpeak start.
The MCP protocol uses Streamable HTTP for communication, which includes streaming responses. Make sure your proxy does not buffer these. In nginx:
location /mcp {
proxy_pass http://localhost:8000;
proxy_buffering off;
proxy_cache off;
proxy_set_header Connection '';
proxy_http_version 1.1;
chunked_transfer_encoding on;
}
With Caddy, response buffering is off by default — no extra config needed.
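For reference, the equivalent Caddyfile can be as short as this (the domain is a placeholder); Caddy also provisions the TLS certificate automatically:

```
sunpeak-app.example.com {
    reverse_proxy localhost:8000
}
```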
With Cloudflare, make sure the route is set to “No Transform” to avoid response buffering.
Step 6: Connect to ChatGPT
In ChatGPT, go to User > Settings > Apps & Connectors > Create. Enter your server URL with the /mcp path:
https://sunpeak-app.example.com/mcp
ChatGPT fetches your tool manifest from /mcp and registers your tools. The next time a user asks something that triggers one of your tools, ChatGPT calls your tool handler and renders the resource HTML in an iframe inside the conversation.
If your server requires bearer token authentication, enter the token in the Authentication section of the connector setup form. ChatGPT sends it as Authorization: Bearer <token> on every request.
Step 7: Connect to Claude
In Claude, go to Settings > Connectors > Add custom connector. Enter the same server URL with the /mcp path:
https://sunpeak-app.example.com/mcp
Claude discovers your tools and resources automatically. If your server requires authentication, configure the bearer token or OAuth in the connector settings.
Keeping the Server Running
Use a process manager to keep sunpeak start running after deploys and restarts.
pm2:
pm2 start "sunpeak start" --name sunpeak-app
pm2 save
pm2 startup
systemd:
[Unit]
Description=My MCP App
After=network.target
[Service]
WorkingDirectory=/opt/sunpeak-app
ExecStart=/usr/local/bin/sunpeak start
Restart=always
Environment=PORT=8000
Environment=DATABASE_URL=postgres://...
[Install]
WantedBy=multi-user.target
Docker:
FROM node:20-alpine
WORKDIR /app
COPY package.json pnpm-lock.yaml ./
RUN npm install -g pnpm && pnpm install --frozen-lockfile
COPY . .
RUN pnpm exec sunpeak build
EXPOSE 8000
CMD ["pnpm", "exec", "sunpeak", "start"]
Deploying on Fly.io
Fly.io is a good fit for MCP Apps: global regions, automatic TLS, and straightforward Node.js deployments.
Create a fly.toml:
app = "my-mcp-app"
primary_region = "ord"
[build]
[http_service]
internal_port = 8000
force_https = true
auto_stop_machines = "stop"
auto_start_machines = true
[[vm]]
size = "shared-cpu-1x"
memory = "256mb"
Add a Dockerfile using the example above, then deploy:
fly launch
fly secrets set DATABASE_URL=postgres://...
fly deploy
Your MCP endpoint will be at https://my-mcp-app.fly.dev/mcp.
CI/CD: Build Before Deploy
Add a build step to your CI/CD pipeline before deploying. GitHub Actions example:
name: Deploy
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- uses: pnpm/action-setup@v4
with:
version: 10
- uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'pnpm'
- run: pnpm install
- run: pnpm test
- run: pnpm test:e2e
- run: pnpm exec sunpeak build
# Deploy dist/ and server files to your host
- name: Deploy to Fly.io
run: fly deploy
env:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
The unit tests (pnpm test) and e2e tests (pnpm test:e2e) run in the inspector against both the ChatGPT and Claude hosts before the build. If they pass, you know your resources render correctly in both environments before you ship. See the complete testing guide for how to set these up.
Smoke Testing After Deploy
After deploying, verify the server is healthy:
curl https://sunpeak-app.example.com/health
If it returns a response, your server is running and reachable. Open ChatGPT or Claude, trigger one of your tools, and confirm the resource renders correctly.
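For a deeper check than /health, you can exercise the MCP endpoint itself with the initialize handshake hosts perform on connect. A sketch of such a script follows; the endpoint URL and token are placeholders, and the protocolVersion string is one published MCP revision, so match whatever your server supports:

```typescript
// Build the JSON-RPC initialize request that MCP hosts send on connect.
function buildInitializeRequest(id = 1) {
  return {
    jsonrpc: '2.0' as const,
    id,
    method: 'initialize',
    params: {
      protocolVersion: '2025-03-26',
      capabilities: {},
      clientInfo: { name: 'smoke-test', version: '1.0.0' },
    },
  };
}

// POST the handshake to the deployed endpoint and report the HTTP status.
async function smokeTest(endpoint: string, token?: string): Promise<number> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Streamable HTTP servers may answer with JSON or an SSE stream
      Accept: 'application/json, text/event-stream',
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify(buildInitializeRequest()),
  });
  return res.status;
}

// Usage: await smokeTest('https://sunpeak-app.example.com/mcp', process.env.MCP_TOKEN)
```

A 401 here confirms your auth() rejection path works; a 200-range status with a valid token confirms the endpoint is reachable end to end.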
Get Started
pnpm add -g sunpeak && sunpeak new
Further Reading
- MCP App Tutorial - build your first resource and tool from scratch.
- Complete Guide to Testing MCP Apps - set up Vitest and Playwright before you ship.
- MCP App Error Handling - handle loading, error, and cancelled states in production.
- Claude Connector OAuth Authentication - add OAuth to your MCP App for Claude.
- Deployment guide
- CLI reference: sunpeak build
- CLI reference: sunpeak start
- Server authentication
Frequently Asked Questions
What commands do I run to deploy an MCP App?
Run "sunpeak build" to compile your resources and tools, then "sunpeak start" to launch a production MCP server. The server listens on port 8000 by default and exposes your MCP endpoint at /mcp. Point any MCP-compatible host (ChatGPT, Claude) to that URL to connect.
What does sunpeak build do?
sunpeak build compiles each resource in src/resources/ into a self-contained HTML bundle with all JavaScript and CSS inlined, plus a JSON metadata sidecar with the resource URI and config. It compiles tool handlers from src/tools/ into Node.js ESM modules and compiles src/server.ts (if present) into dist/server.js. The output goes into dist/. All steps must succeed before you can run sunpeak start.
How do I authenticate users in an MCP App?
Create src/server.ts and export an async auth() function. The function receives every incoming HTTP request and returns an AuthInfo object (token, clientId, scopes) to allow it, or null to reject with a 401. The AuthInfo you return is available as extra.authInfo inside every tool handler.
What port does sunpeak start use?
sunpeak start listens on port 8000 by default. Set the PORT environment variable to override it. Your MCP endpoint is at http://your-host:PORT/mcp.
Do I need a special server to host an MCP App?
No. sunpeak start runs a standard Node.js HTTP server. You can host it on any platform that runs Node.js 20+: a VPS (DigitalOcean, Hetzner, Linode), a container on Fly.io or Railway, a serverless platform like AWS Lambda behind an API Gateway, or a managed Node.js service. The only requirement is that the /mcp endpoint is publicly reachable by the AI host (ChatGPT or Claude).
How do I connect a deployed MCP App to ChatGPT?
In ChatGPT, go to User > Settings > Apps & Connectors > Create. Enter your server URL with the /mcp path (e.g., https://sunpeak-app.example.com/mcp). ChatGPT will fetch the tool and resource manifests and connect. For OAuth, configure the auth settings in the connector setup form.
How do I connect a deployed MCP App to Claude?
In Claude, go to Settings > Connectors > Add custom connector. Enter your server URL with the /mcp path. Claude will connect and discover your tools and resources automatically. If your server requires authentication, configure the bearer token or OAuth in the connector settings.
How do I pass environment variables to my MCP App in production?
Set environment variables on your server before running sunpeak start. In a systemd service or Docker container, add them to the environment block. On Fly.io, use fly secrets set KEY=VALUE. On Railway, add them in the project settings. Read them in your tool handlers with process.env.KEY.
Can I run sunpeak start behind a reverse proxy like nginx?
Yes. Point nginx or any reverse proxy to http://localhost:8000 and terminate TLS there. The MCP protocol uses Streamable HTTP for communication, so make sure your proxy does not buffer responses: set proxy_buffering off in nginx, or use the appropriate setting for your proxy.