Runtime
Read runtime-specific guides, examples, and API references
ModelFetch provides runtime-specific packages that handle the tedious platform differences so you can focus on building your MCP server's capabilities. Every package exposes the same API, so the same server code runs on any supported runtime, as the sketches below illustrate.
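As a rough sketch of that shared shape, you might define the server once with the official MCP TypeScript SDK and hand it to a runtime package's handler. The file names, the `echo` tool, and the exact `handle` signature here are illustrative assumptions; see each runtime's guide for the authoritative API.

```ts
// server.ts — runtime-agnostic server definition (illustrative)
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({
  name: "my-mcp-server",
  version: "1.0.0",
});

// A minimal tool that echoes its input back to the caller.
server.tool("echo", { message: z.string() }, async ({ message }) => ({
  content: [{ type: "text", text: message }],
}));

export default server;
```

Running it on Node.js is then a matter of passing that server to the Node package's handler:

```ts
// server.node.ts — Node.js entry point (default export name assumed to be `handle`)
import handle from "@modelfetch/node";
import server from "./server";

// Serves the MCP server over HTTP using the runtime's defaults.
handle(server);
```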
Available Runtimes
- Node.js - Run simple MCP servers with Node.js
- Next.js - Run flexible MCP servers with Next.js
- Bun - Run lightning-fast MCP servers with Bun
- Deno - Run secure MCP servers with Deno
- AWS Lambda - Deploy MCP servers to AWS Lambda
- Vercel - Deploy MCP servers to Vercel
- Cloudflare - Deploy MCP servers to Cloudflare
- Netlify - Deploy MCP servers to Netlify
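For the deployment targets, only the entry point changes. As a hedged example, a Cloudflare Workers entry might look like the following, assuming the `@modelfetch/cloudflare` package mirrors the default `handle` export used above; check the Cloudflare guide for the exact export shape.

```ts
// server.cloudflare.ts — Cloudflare Workers entry point (package name and
// export shape assumed from the pattern above; verify against the guide)
import handle from "@modelfetch/cloudflare";
import server from "./server";

// Workers expect a default export; `handle` is assumed to wrap the MCP
// server in a fetch-compatible handler.
export default handle(server);
```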
Future Runtimes
We plan to support these additional runtimes:
- Azure Functions - Deploy to Microsoft's serverless platform
- Fastly Compute - Deploy to Fastly's edge compute platform
- Supabase Functions - Run alongside your Supabase backend
- Ali Function Compute - Deploy to Alibaba Cloud's serverless service
- Service Worker - Run in browser service workers