Our stack

Railway for persistent backends.

When the workload needs a long-running process, persistent connections, or a container that stays alive, Railway hosts it.

railway.com
How we use Railway

Railway handles the backend workloads that don't fit the serverless model. WebSocket servers, background job processors, real-time communication services, and AI agent runtimes all need processes that stay alive between requests. Railway gives us container-based hosting with automatic deploys from git, built-in databases, and straightforward scaling. We use it alongside Vercel: the frontend and API routes deploy on Vercel, and long-running backend services deploy on Railway. Each platform handles what it does best.

Not everything is serverless. Railway runs the things that need to stay on.
Why Railway

Railway is the simplest path from a Dockerfile to a running service. No Kubernetes configuration, no infrastructure-as-code boilerplate, no multi-step CI pipeline. Push to git, it deploys. Need a Postgres database? Click a button. Need Redis? Same. For the backend services that don't fit into serverless functions (WebSocket servers, background workers, AI agent processes), Railway provides the persistent compute without the operational overhead of managing EC2 instances or container orchestration.

Where we use it

Real-Time Services

WebSocket servers and Socket.io backends for live chat, notifications, and collaborative features.

AI Agent Runtimes

Long-running AI agent processes that maintain state across conversations and need persistent connections.

Background Workers

Job queues, scheduled tasks, and data processing pipelines that run continuously outside the request-response cycle.
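To make the background-worker use case concrete, here is a minimal sketch of the pattern: a process that pulls jobs from a queue and handles them outside any request-response cycle. The `Job` shape, `enqueue`, and `drain` names are illustrative assumptions for this sketch, not a real Railway or queueing API; a production worker would read from a durable queue like Redis or Postgres and loop indefinitely.

```typescript
// Illustrative sketch of a worker loop (hypothetical names, in-memory queue).
type Job = { id: number; payload: string };

const queue: Job[] = [];
const results: string[] = [];

function enqueue(job: Job): void {
  queue.push(job);
}

// Handle one job; a real worker might resize an image or send an email here.
function processJob(job: Job): void {
  results.push(`processed ${job.id}: ${job.payload}`);
}

// Drain the queue; a long-running worker on Railway would instead
// poll forever, sleeping between empty checks.
function drain(): void {
  while (queue.length > 0) {
    const job = queue.shift();
    if (job) processJob(job);
  }
}

enqueue({ id: 1, payload: "resize-image" });
enqueue({ id: 2, payload: "send-email" });
drain();
```

Because this loop never exits in its real form, it needs a host that keeps the process alive, which is exactly what a Railway service provides and a serverless function does not.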

Get started

Let's talk about your next build.