
Docker vs Vercel-Only: Deployment Strategies in SaaS Boilerplates

StarterPick Team

Tags: docker · vercel · deployment · boilerplate · 2026

TL;DR

Vercel-only for most indie SaaS: zero ops, instant deploys, great DX. Docker when you need persistent processes (background workers, WebSocket servers), cost control at scale, or multi-service deployments. Most teams start on Vercel and introduce Docker containers via Railway or Render when complexity demands it.

What "Vercel-only" Actually Means

Vercel deploys Next.js as serverless functions — each route/page is an isolated function invocation:

User request → Vercel Edge Network → Serverless Function → Response
                                    (spins up for each request)

Pros:

  • Zero server management
  • Scales to zero (no idle cost)
  • Automatic SSL, CDN, domains
  • Git push = deploy (zero config)

Cons:

  • No persistent state (functions die between requests)
  • No background processes
  • Execution timeouts — functions are capped at seconds to a few minutes, depending on plan and configuration
  • Cold starts (50-500ms for first request after idle)
  • Serverless pricing can exceed container pricing at high volume
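The request/response model above maps directly onto a route handler. A minimal sketch (hypothetical health-check route, using the standard `Request`/`Response` APIs that Next.js App Router handlers receive):

```typescript
// Sketch of a Next.js App Router route handler as it runs on Vercel:
// each request may get an isolated invocation, so module-level state
// cannot be trusted to survive between calls.
let invocations = 0; // resets on every cold start — illustration only

export async function GET(_req: Request): Promise<Response> {
  invocations++; // may read 1 on every request after a cold start
  return Response.json({ ok: true, invocations });
}
```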

What Requires Docker/Containers

Some workloads don't fit the serverless model:

Background Job Queues

// This can't run on Vercel — needs a persistent process
import { Queue, Worker } from 'bullmq';
import { Redis } from 'ioredis';

// BullMQ workers require maxRetriesPerRequest: null on the connection
const redis = new Redis(process.env.REDIS_URL!, { maxRetriesPerRequest: null });
const emailQueue = new Queue('emails', { connection: redis });

// Worker — runs continuously, not as a request handler
const worker = new Worker(
  'emails',
  async (job) => {
    const { userId, template, data } = job.data;
    await sendEmail(userId, template, data);
  },
  { connection: redis, concurrency: 10 }
);

console.log('Worker listening...');

BullMQ workers need a persistent Node.js process. Vercel can only enqueue jobs (via an API route); the worker must run in a container (Railway, Render, Fly.io, EC2).

WebSocket Servers

// WebSocket server — persistent connection, not request/response
import { WebSocket, WebSocketServer } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws) => {
  ws.on('message', (message) => {
    // Broadcast to all connected clients
    wss.clients.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    });
  });
});

Native WebSockets need a persistent server. Vercel's serverless functions can't hold long-lived connections, so for production real-time apps a dedicated container (or a managed service such as Pusher or Ably) is the reliable choice.

Long-Running Processes

// PDF generation — might take 60-120 seconds for complex documents
// Too long for a serverless request handler
// Solution: queue the job, process it in a background container

// API route — queues the job (returns in < 5s on Vercel)
export async function POST(req: Request) {
  const { reportId } = await req.json();
  await pdfQueue.add('generate', { reportId });
  return Response.json({ status: 'queued', reportId });
}

// Worker container — generates the PDF (no timeout)
new Worker('pdfs', async (job) => {
  const pdf = await generateComplexReport(job.data.reportId);
  await uploadToS3(pdf, job.data.reportId);
  await notifyUser(job.data.reportId);
}, { connection: redis });
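One wrinkle the queue-and-worker split introduces: queues redeliver jobs after crashes or timeouts, so handlers should be idempotent. A minimal sketch (hypothetical `handleJob` helper; an in-memory Set stands in for a Redis marker, which is where this state would live in production so retries on other workers also see it):

```typescript
// Sketch: idempotent job handling. The Set stands in for a
// SETNX-style "already processed" marker in Redis.
const processedJobs = new Set<string>();

async function handleJob(
  jobId: string,
  work: () => Promise<void>
): Promise<boolean> {
  if (processedJobs.has(jobId)) return false; // retry of a finished job — skip
  await work();
  processedJobs.add(jobId); // only marked done if work() didn't throw
  return true;
}
```

If `work()` throws, the job is never marked done, so the queue is free to retry it.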

Docker Setup for SaaS Boilerplates

A typical SaaS Docker setup:

# Dockerfile — multi-stage build
FROM node:20-alpine AS base
WORKDIR /app
COPY package*.json ./

FROM base AS deps
RUN npm ci --omit=dev

FROM base AS builder
RUN npm ci
COPY . .
RUN npm run build

FROM base AS runner
ENV NODE_ENV=production
COPY --from=deps /app/node_modules ./node_modules
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/public ./public
COPY --from=builder /app/package.json ./package.json

EXPOSE 3000
CMD ["npm", "start"]
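One easy improvement alongside this Dockerfile: a `.dockerignore`, so the builder stage's `COPY . .` doesn't drag host artifacts like `node_modules` or `.next` into the image. A sketch — adjust to your repo:

```
# .dockerignore — keep host artifacts out of the build context
node_modules
.next
.git
.env*
Dockerfile
docker-compose.yml
```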

# docker-compose.yml — local development
services:
  web:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgresql://postgres:password@db:5432/myapp
      - REDIS_URL=redis://redis:6379
    depends_on:
      - db
      - redis

  worker:
    build: .
    command: node worker/index.js
    environment:
      - DATABASE_URL=postgresql://postgres:password@db:5432/myapp
      - REDIS_URL=redis://redis:6379
    depends_on:
      - db
      - redis

  db:
    image: postgres:16
    environment:
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=myapp
    volumes:
      - postgres_data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine

volumes:
  postgres_data:

The Hybrid Pattern: Vercel + Container

Most production indie SaaS uses a hybrid approach:

┌─────────────────────────────────────────────────────┐
│                   Vercel                             │
│  Next.js App (SSR + API routes for user-facing)     │
│  - Auth, dashboard, settings, checkout              │
│  - Stripe webhooks (fast, < 5s response)            │
└────────────────────────┬────────────────────────────┘
                         │ Enqueue jobs
                         ▼
┌─────────────────────────────────────────────────────┐
│              Railway / Render                        │
│  Background Workers (persistent containers)         │
│  - PDF generation                                   │
│  - Email sending queue                              │
│  - Data processing jobs                             │
│  - Scheduled tasks (cron)                           │
└────────────────────────────────────────────────────┘

// On Vercel — API route triggers background job
export async function POST(req: Request) {
  const data = await req.json();

  // Enqueue via Redis (Upstash on Vercel, or Railway Redis)
  await triggerBackgroundJob('process-user-data', data);

  // Return immediately — job runs in container
  return Response.json({ status: 'processing' });
}
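`triggerBackgroundJob` above is a placeholder. One possible shape for it, sketched with an in-memory array standing in for the Redis list so the example is self-contained — a real implementation would `LPUSH` to Redis (or call BullMQ's `queue.add`):

```typescript
// Hypothetical enqueue helper. The array stands in for a Redis list;
// the worker container would pop and process these serialized jobs.
type Job = { name: string; payload: unknown; enqueuedAt: number };

const queueBackend: string[] = []; // stand-in for Redis

async function triggerBackgroundJob(
  name: string,
  payload: unknown
): Promise<void> {
  const job: Job = { name, payload, enqueuedAt: Date.now() };
  queueBackend.push(JSON.stringify(job)); // real code: redis.lpush(...)
}
```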

Boilerplate Docker Support

| Boilerplate | Dockerfile included | Docker Compose | Worker setup |
| ----------- | ------------------- | -------------- | ------------ |
| T3 Stack    | ❌ (community)      |                |              |
| ShipFast    |                     |                |              |
| Supastarter |                     |                |              |
| Makerkit    |                     |                |              |
| Epic Stack  |                     |                |              |
| Bedrock     |                     |                |              |

When to Choose What

Stay Vercel-only when:

  • No background jobs (or small jobs < 30s)
  • No WebSockets needed
  • Team has zero DevOps experience
  • Early stage, validating product

Add Docker containers (Railway/Render) when:

  • Need background job processing
  • Need WebSocket server
  • Jobs routinely exceed 30s
  • Want predictable container pricing vs serverless

Go full Docker/Kubernetes when:

  • Enterprise clients require on-premise or VPC deployment
  • Compliance requires data residency control
  • Traffic is high enough that serverless pricing exceeds container pricing
  • Need custom runtime environments

Find boilerplates by deployment strategy on StarterPick.

