# Awesome-omni-skill MCP Server Architecture
This skill should be used when the user asks to "create an MCP server", "set up MCP server", "build ChatGPT app backend", "MCP transport type", "configure MCP endpoint", "server setup for Apps SDK", or needs guidance on MCP server architecture, transport protocols, or SDK setup for the OpenAI Apps SDK.
```bash
# Clone the repository
git clone https://github.com/diegosouzapw/awesome-omni-skill

# Or install just this skill into ~/.claude/skills
T=$(mktemp -d) && git clone --depth=1 https://github.com/diegosouzapw/awesome-omni-skill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/backend/mcp-server-architecture" ~/.claude/skills/diegosouzapw-awesome-omni-skill-mcp-server-architecture && rm -rf "$T"
```
# MCP Server Architecture for OpenAI Apps SDK

## Overview
MCP (Model Context Protocol) servers form the backend for ChatGPT apps built with the OpenAI Apps SDK. The server exposes tools that ChatGPT can invoke, handles authentication, and returns structured data that powers both model responses and widget UIs.
## Core Architecture
An MCP server for the Apps SDK implements three essential capabilities:
- List tools - Advertise available tools with JSON Schema contracts
- Call tools - Execute tool logic and return structured responses
- Return widgets - Provide UI templates via resource URIs and `_meta` fields
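These capabilities map onto JSON-RPC methods defined by the MCP specification (`tools/list`, `tools/call`, `resources/read`). A minimal sketch of the request shapes a client sends over the wire; the payload values are illustrative:

```python
import json

# tools/list: the client asks the server to advertise its tools.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# tools/call: the client invokes a tool by name with JSON arguments.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "example_tool",
        "arguments": {"query": "hello"},
    },
}

# resources/read: the client fetches a widget template by its ui:// URI.
read_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "resources/read",
    "params": {"uri": "ui://widget/main.html"},
}

print(json.dumps(call_request, indent=2))
```

The SDKs below generate and handle these messages for you; you rarely construct them by hand.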
### Data Flow
User prompt → ChatGPT calls MCP tool → Server executes logic → Returns structuredContent + _meta → ChatGPT renders widget + narrates
## Transport Types
The Apps SDK supports two transport protocols:
### Streamable HTTP (Recommended)
Primary transport for production deployments. Use for publicly accessible servers.
Python (FastMCP):
```python
from mcp.server.fastmcp import FastMCP

# host/port are FastMCP settings; run() selects the transport
mcp = FastMCP("my-server", host="0.0.0.0", port=8000)

@mcp.tool()
def my_tool(param: str) -> str:
    return f"Result: {param}"

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```
TypeScript:
```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";

const server = new Server(
  { name: "my-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

const transport = new StreamableHTTPServerTransport({
  sessionIdGenerator: () => crypto.randomUUID(),
});

await server.connect(transport);
```
### Server-Sent Events (SSE)
Alternative transport for event-streaming requirements.
Python:
```python
# host/port are FastMCP constructor settings; run() selects the transport
mcp = FastMCP("my-server", host="0.0.0.0", port=8000)
mcp.run(transport="sse")
```
TypeScript:
```typescript
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

// `response` is the Node HTTP response object for the SSE stream
const transport = new SSEServerTransport("/mcp", response);
```
## SDK Setup

### Python Setup
Install the MCP Python SDK:
```bash
pip install mcp

# Or with FastAPI support
pip install "mcp[fastapi]"
```
Minimal server structure:
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("server-name", port=8000)

@mcp.tool()
def example_tool(query: str) -> dict:
    """Tool description for the model."""
    return {"result": query}

@mcp.resource("ui://widget/main.html")
def get_widget() -> str:
    return "<html>...</html>"

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```
### TypeScript Setup
Install the MCP TypeScript SDK:
```bash
npm install @modelcontextprotocol/sdk zod
```
Minimal server structure:
```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "server-name", version: "1.0.0" },
  { capabilities: { tools: {}, resources: {} } }
);

server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "example_tool",
      description: "Tool description",
      inputSchema: {
        type: "object",
        properties: { query: { type: "string" } },
      },
    },
  ],
}));

server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "example_tool") {
    return { content: [{ type: "text", text: "Result" }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});
```
## Response Structure
Tool responses include three layers:
| Field | Visibility | Purpose |
|---|---|---|
| `structuredContent` | Model + Widget | Concise JSON the model reads for narration |
| `content` | Model + Widget | Text/image content for display |
| `_meta` | Widget only | Rich data exclusively for UI rendering |
Example response:
```python
return {
    "structuredContent": {"status": "success", "count": 42},
    "content": [{"type": "text", "text": "Found 42 items"}],
    "_meta": {
        "items": [...],  # Full data for widget
        "openai/outputTemplate": "ui://widget/list.html",
    },
}
```
## Server Configuration Best Practices

### Port and Host
- Use port 8000 by default for local development
- Bind to `0.0.0.0` for container deployments
- Bind to `127.0.0.1` for local-only access
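One way to follow these defaults is to read the bind address from the environment, falling back to local-development values. A small sketch; the `MCP_HOST`/`MCP_PORT` variable names are illustrative, not part of the SDK:

```python
import os

# Fall back to local-development defaults when unset;
# set MCP_HOST=0.0.0.0 in container deployments.
host = os.environ.get("MCP_HOST", "127.0.0.1")
port = int(os.environ.get("MCP_PORT", "8000"))

print(f"Binding to {host}:{port}")
```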
### HTTPS Requirements
ChatGPT requires HTTPS for all production MCP servers. Use ngrok during development:
```bash
ngrok http 8000
```
### Environment Variables
Store sensitive configuration in environment variables:
```python
import os

API_KEY = os.environ.get("API_KEY")
DATABASE_URL = os.environ.get("DATABASE_URL")
```
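Because a missing secret usually means the server cannot function at all, it can help to validate required variables once at startup and fail fast with a clear message. A sketch, assuming the same illustrative variable names:

```python
import os

REQUIRED_VARS = ("API_KEY", "DATABASE_URL")

def load_config() -> dict:
    """Read required settings, raising a clear error if any are missing."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Calling `load_config()` before `mcp.run(...)` surfaces configuration problems at deploy time rather than on the first tool call.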
### Error Handling
Return structured errors the model can understand:
```python
@mcp.tool()
def safe_tool(param: str) -> dict:
    try:
        result = process(param)  # application-specific logic
        return {"success": True, "data": result}
    except ValueError as e:
        return {"success": False, "error": str(e)}
```
## Project Structure
Recommended directory layout for MCP server projects:
```
my-mcp-server/
├── server.py          # or server.ts
├── tools/
│   ├── __init__.py
│   └── my_tool.py
├── widgets/
│   └── main.html
├── requirements.txt   # or package.json
└── .env.example
```
## Additional Resources

### Reference Files
For detailed SDK documentation and patterns:
- `references/python-sdk.md` - Python SDK detailed reference
- `references/typescript-sdk.md` - TypeScript SDK detailed reference
- `references/transport-comparison.md` - Transport protocol comparison
### Example Files
Working server examples in `examples/`:
- `examples/minimal-server.py` - Minimal Python MCP server
- `examples/minimal-server.ts` - Minimal TypeScript MCP server
### Official Documentation
- Apps SDK Docs: https://developers.openai.com/apps-sdk/
- MCP Specification: https://modelcontextprotocol.io/specification/
- Python SDK: https://github.com/modelcontextprotocol/python-sdk
- TypeScript SDK: https://github.com/modelcontextprotocol/typescript-sdk