# Deployment
This guide covers deploying BrewHoard to production, including environment configuration, platform-specific setup, and monitoring.
## Deployment Options
BrewHoard can be deployed to various platforms:
| Platform | Best For | Considerations |
|---|---|---|
| Vercel | Simplest deployment | Native SvelteKit support |
| Cloudflare Pages | Edge performance | Workers adapter needed |
| Railway | Full-stack with DB | Managed PostgreSQL |
| Docker | Self-hosted | Full control |
| Node.js Server | Traditional hosting | Manual setup |
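
Each platform pairs with a SvelteKit adapter. As a rough guide, install only the adapter for your target (package names below are the official SvelteKit adapters):

```bash
npm i -D @sveltejs/adapter-vercel      # Vercel
npm i -D @sveltejs/adapter-cloudflare  # Cloudflare Pages
npm i -D @sveltejs/adapter-node        # Docker, Railway, Node.js server
```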
## Environment Variables
### Required Variables
```bash
# Database
DATABASE_URL="postgres://user:pass@host:5432/brewhoard"
# Session security (generate with: openssl rand -hex 32)
SESSION_SECRET="your-64-character-hex-secret"
# Stripe payments
STRIPE_SECRET_KEY="sk_live_..."
STRIPE_PUBLISHABLE_KEY="pk_live_..."
STRIPE_WEBHOOK_SECRET="whsec_..."
# Application
PUBLIC_APP_URL="https://brewhoard.com"
NODE_ENV="production"
```

### Optional Variables
```bash
# Vision API (for beer scanner)
VISION_API_KEY="your-vision-api-key"
# Email (SMTP)
SMTP_HOST="smtp.sendgrid.net"
SMTP_PORT="587"
SMTP_USER="apikey"
SMTP_PASS="your-api-key"
SMTP_FROM="noreply@brewhoard.com"
# File storage (S3-compatible)
S3_BUCKET="brewhoard-uploads"
S3_REGION="us-east-1"
S3_ACCESS_KEY="AKIA..."
S3_SECRET_KEY="..."
S3_ENDPOINT="https://s3.amazonaws.com"
# Redis (for sessions/caching)
REDIS_URL="redis://localhost:6379"
# Analytics
PLAUSIBLE_DOMAIN="brewhoard.com"
```
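
It can help to fail fast when a required variable is missing rather than erroring on the first request. A minimal sketch, assuming a hypothetical `src/lib/server/env.js` module imported at startup:

```javascript
// src/lib/server/env.js (illustrative; adjust the list to your deployment)
const REQUIRED = [
  'DATABASE_URL',
  'SESSION_SECRET',
  'STRIPE_SECRET_KEY',
  'STRIPE_WEBHOOK_SECRET',
  'PUBLIC_APP_URL'
];

// Throw at boot if anything is missing
const missing = REQUIRED.filter((name) => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
}
```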
## Vercel Deployment

The simplest way to deploy BrewHoard:
### 1. Connect Repository
```bash
# Install Vercel CLI
npm i -g vercel
# Deploy
vercel
```

### 2. Configure Environment
In Vercel dashboard → Settings → Environment Variables:
```text
DATABASE_URL = postgres://...
SESSION_SECRET = ...
STRIPE_SECRET_KEY = sk_live_...
```

### 3. Add Vercel Postgres (Optional)
```bash
# After adding Vercel Postgres to the project, pull its env vars locally
vercel env pull
```

### 4. vercel.json Configuration
```json
{
  "framework": "sveltekit",
  "regions": ["iad1"],
  "functions": {
    "src/routes/api/**/*.js": {
      "maxDuration": 30
    }
  }
}
```

## Docker Deployment
### Dockerfile
```dockerfile
# Build stage
FROM node:20-alpine AS builder
WORKDIR /app
# Install dependencies
COPY package*.json ./
RUN npm ci
# Copy source and build
COPY . .
RUN npm run build
# Production stage
FROM node:20-alpine AS runner
WORKDIR /app
# Copy built application
COPY --from=builder /app/build build/
COPY --from=builder /app/package.json .
COPY --from=builder /app/node_modules node_modules/
# Create non-root user
RUN addgroup -g 1001 nodejs && \
    adduser -S -u 1001 -G nodejs svelte
USER svelte
EXPOSE 3000
ENV NODE_ENV=production
ENV PORT=3000
CMD ["node", "build"]
```

### docker-compose.yml
```yaml
version: '3.8'

services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgres://brewhoard:password@db:5432/brewhoard
      - SESSION_SECRET=${SESSION_SECRET}
      - STRIPE_SECRET_KEY=${STRIPE_SECRET_KEY}
    depends_on:
      - db
      - redis
    restart: unless-stopped

  db:
    image: postgres:15-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=brewhoard
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=brewhoard
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:
```
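
Compose resolves `${SESSION_SECRET}` and `${STRIPE_SECRET_KEY}` from the shell environment or from a `.env` file next to `docker-compose.yml`. For example (values are placeholders):

```bash
# .env (never commit this file)
SESSION_SECRET=replace-with-64-char-hex
STRIPE_SECRET_KEY=sk_live_replace_me
```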
### Deploy with Docker

```bash
# Build and start
docker-compose up -d --build
# Run migrations
docker-compose exec app npm run migrate
# View logs
docker-compose logs -f app
```

## Database Setup
### Production PostgreSQL
```bash
# Create database
createdb brewhoard_prod
# Run migrations
DATABASE_URL="postgres://..." npm run migrate
```

### Connection Pooling
For production, use connection pooling:
```javascript
// src/lib/server/db.js
import postgres from 'postgres';

const sql = postgres(process.env.DATABASE_URL, {
  max: 20,             // Max connections
  idle_timeout: 20,    // Close idle connections after 20s
  connect_timeout: 10, // Connection timeout
  ssl: process.env.NODE_ENV === 'production' ? 'require' : false,
  prepare: false       // Required for some poolers
});

export default sql;
```

### Backup Strategy
```bash
#!/bin/bash
# Daily backup script
DATE=$(date +%Y%m%d)
pg_dump "$DATABASE_URL" | gzip > /backups/brewhoard_$DATE.sql.gz

# Keep last 30 days
find /backups -name "brewhoard_*.sql.gz" -mtime +30 -delete
```
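
To actually run this daily, schedule the script with cron (paths below are illustrative):

```bash
# crontab -e
0 3 * * * /opt/brewhoard/backup.sh >> /var/log/brewhoard-backup.log 2>&1
```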
## Build Configuration

### svelte.config.js for Production
```javascript
import adapter from '@sveltejs/adapter-node';
// Or: import adapter from '@sveltejs/adapter-vercel';

/** @type {import('@sveltejs/kit').Config} */
const config = {
  kit: {
    adapter: adapter({
      // Node adapter options
      out: 'build',
      precompress: true,
      envPrefix: ''
    }),

    // CSP headers
    csp: {
      mode: 'auto',
      directives: {
        'script-src': ['self'],
        'style-src': ['self', 'unsafe-inline'],
        'img-src': ['self', 'data:', 'https://cdn.brewhoard.com'],
        'connect-src': ['self', 'https://api.stripe.com']
      }
    }
  }
};

export default config;
```

### Production Build
```bash
# Build for production
npm run build
# Preview production build locally
npm run preview
```
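
Note that `npm run preview` serves the build through Vite for local inspection. With adapter-node, the deployable artifact is the `build/` directory, which runs directly under Node (this is what the Docker `CMD` above does); the adapter-node docs recommend setting `ORIGIN` so the app knows its public URL (needed for correct absolute URLs and form actions):

```bash
# Run the adapter-node output directly
ORIGIN=https://brewhoard.com PORT=3000 node build
```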
## Stripe Webhook Setup

### Configure Webhook Endpoint
In Stripe Dashboard → Developers → Webhooks:
- Add endpoint: `https://brewhoard.com/api/v1/payments/webhook`
- Select events: `payment_intent.succeeded`, `payment_intent.payment_failed`, `charge.refunded`
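
The handler can be exercised locally before go-live with the Stripe CLI, which prints a `whsec_...` signing secret to use as `STRIPE_WEBHOOK_SECRET` during testing (port 5173 assumes the default SvelteKit dev server):

```bash
# Forward webhook events to the local handler
stripe listen --forward-to localhost:5173/api/v1/payments/webhook

# In another terminal, fire a test event
stripe trigger payment_intent.succeeded
```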
### Webhook Handler
```javascript
// src/routes/api/v1/payments/webhook/+server.js
import Stripe from 'stripe';

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY);

export async function POST({ request }) {
  const payload = await request.text();
  const sig = request.headers.get('stripe-signature');

  let event;
  try {
    event = stripe.webhooks.constructEvent(
      payload,
      sig,
      process.env.STRIPE_WEBHOOK_SECRET
    );
  } catch (err) {
    console.error('Webhook signature verification failed');
    return new Response('Invalid signature', { status: 400 });
  }

  switch (event.type) {
    case 'payment_intent.succeeded':
      await handlePaymentSuccess(event.data.object);
      break;
    case 'payment_intent.payment_failed':
      await handlePaymentFailure(event.data.object);
      break;
  }

  return new Response('OK');
}
```

## PWA Configuration
### Service Worker
```javascript
// src/service-worker.js
import { build, files, version } from '$service-worker';

const CACHE = `cache-${version}`;
const ASSETS = [...build, ...files];

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) => cache.addAll(ASSETS))
  );
});

self.addEventListener('activate', (event) => {
  event.waitUntil(
    caches.keys().then((keys) =>
      Promise.all(
        keys.filter((key) => key !== CACHE).map((key) => caches.delete(key))
      )
    )
  );
});

self.addEventListener('fetch', (event) => {
  if (event.request.method !== 'GET') return;

  event.respondWith(
    caches.match(event.request).then((cached) => {
      return cached || fetch(event.request);
    })
  );
});
```

### Web Manifest
```json
// static/manifest.json
{
  "name": "BrewHoard",
  "short_name": "BrewHoard",
  "description": "Manage your beer collection",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#f59e0b",
  "icons": [
    {
      "src": "/pwa-192x192.png",
      "sizes": "192x192",
      "type": "image/png"
    },
    {
      "src": "/pwa-512x512.png",
      "sizes": "512x512",
      "type": "image/png"
    }
  ]
}
```
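
The manifest only takes effect once it is referenced from the app shell. Assuming the default SvelteKit `src/app.html`, add something like:

```html
<!-- src/app.html -->
<link rel="manifest" href="/manifest.json" />
<meta name="theme-color" content="#f59e0b" />
```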
## Security Checklist

### Before Go-Live
- All secrets in environment variables (not in code)
- `SESSION_SECRET` is unique and secure (64+ chars)
- HTTPS enforced (redirect HTTP)
- Database SSL enabled
- CORS configured correctly
- Rate limiting enabled (see the sketch after this list)
- CSP headers configured
- Input validation on all endpoints
- SQL injection protection (parameterized queries)
- XSS protection (content sanitization)
- File upload restrictions (type, size)
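
Rate limiting is listed above but not shown elsewhere in this guide. A minimal fixed-window sketch in `hooks.server.js` (in-memory, so only suitable for a single Node process; a Redis-backed limiter is the usual choice for multi-instance deployments):

```javascript
// Sketch only: fixed-window, per-IP limiter for API routes
const WINDOW_MS = 60_000;  // 1-minute window
const MAX_REQUESTS = 100;  // per IP per window
const hits = new Map();    // ip -> { count, windowStart }

export async function handle({ event, resolve }) {
  if (event.url.pathname.startsWith('/api/')) {
    const ip = event.getClientAddress();
    const now = Date.now();
    const entry = hits.get(ip);

    if (!entry || now - entry.windowStart > WINDOW_MS) {
      hits.set(ip, { count: 1, windowStart: now });
    } else if (++entry.count > MAX_REQUESTS) {
      return new Response('Too many requests', { status: 429 });
    }
  }
  return resolve(event);
}
```

This can be composed with the security-headers `handle` below using `sequence()` from `@sveltejs/kit/hooks`.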
### Security Headers
```javascript
// src/hooks.server.js
export async function handle({ event, resolve }) {
  const response = await resolve(event);

  // Security headers
  response.headers.set('X-Frame-Options', 'DENY');
  response.headers.set('X-Content-Type-Options', 'nosniff');
  response.headers.set('Referrer-Policy', 'strict-origin-when-cross-origin');
  response.headers.set('Permissions-Policy', 'camera=(self), microphone=()');

  return response;
}
```

## Monitoring
### Health Check Endpoint
```javascript
// src/routes/api/v1/health/+server.js
import { json } from '@sveltejs/kit';
import sql from '$lib/server/db.js';

export async function GET() {
  const checks = {
    status: 'ok',
    timestamp: new Date().toISOString(),
    services: {}
  };

  // Check database
  try {
    await sql`SELECT 1`;
    checks.services.database = 'ok';
  } catch {
    checks.services.database = 'error';
    checks.status = 'degraded';
  }

  return json(checks, {
    status: checks.status === 'ok' ? 200 : 503
  });
}
```
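
The endpoint returns 200 when healthy and 503 when a dependency check fails, so an external uptime monitor can simply poll it:

```bash
# -f makes curl exit non-zero on 5xx responses, which suits alerting scripts
curl -fsS https://brewhoard.com/api/v1/health
```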
### Error Logging

```javascript
// src/hooks.server.js
import { dev } from '$app/environment';

export async function handleError({ error, event }) {
  const errorId = crypto.randomUUID();

  // Log error details
  console.error({
    id: errorId,
    url: event.url.pathname,
    method: event.request.method,
    error: error.message,
    stack: error.stack
  });

  // In production, send to error tracking service
  if (!dev) {
    // await sendToSentry(error, { id: errorId, event });
  }

  return {
    message: 'An unexpected error occurred',
    id: errorId
  };
}
```

## Performance Optimization
### Caching Headers
```javascript
// src/routes/api/v1/beers/+server.js
import { json } from '@sveltejs/kit';

export async function GET({ setHeaders }) {
  const beers = await getBeers();

  // Cache for 5 minutes
  setHeaders({
    'Cache-Control': 'public, max-age=300'
  });

  return json({ data: beers });
}
```

### Database Query Optimization
```sql
-- Add indexes for common queries
CREATE INDEX CONCURRENTLY idx_collection_user_updated
ON user_collection(user_id, updated_at DESC);
-- Analyze query performance
EXPLAIN ANALYZE SELECT ...
```

## Rollback Procedure
If deployment fails:
```bash
# Vercel: Instant rollback
vercel rollback
# Docker: Rollback to previous image
docker-compose down
docker tag brewhoard:current brewhoard:failed
docker tag brewhoard:previous brewhoard:current
docker-compose up -d
# Database: Restore from backup
gunzip -c /backups/brewhoard_YYYYMMDD.sql.gz | psql brewhoard_prod
```

## Next Steps
- Production Checklist - Complete go-live checklist
- PWA Configuration - Progressive Web App setup
- Monitoring - Application monitoring