Inspiration
Short links are everywhere—campaigns, onboarding, and internal tools all need safe, trackable URLs without relying on a third-party service forever. We wanted something that felt hackathon-sized but production-shaped: a clear REST API, real persistence, and enough observability to answer “is it up?” and “is it slow?” under load. The MLH PE Hackathon brief gave us a concrete target: users, shortened URLs, redirects, and event-style analytics on top of PostgreSQL.
What it does
Our URL Shortener API lets you create users, shorten HTTP(S) URLs with generated short codes, and resolve them with 302 redirects (including the MLH-style GET /urls//redirect path). It supports bulk user import from CSV, filtering and listing of URLs and events, and POST /events for explicit analytics. Successful redirects and key lifecycle actions can be reflected as events (for example created, redirect, updated) with JSON details aligned to the seed dataset. We expose a health endpoint, structured logging, and Prometheus metrics; with Docker Compose you get Nginx in front, two app replicas, Redis caching, PostgreSQL, and a Prometheus / Grafana / Alertmanager stack—with optional Discord notifications when alerts fire.
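The redirect semantics above can be sketched in a few lines. This is an illustrative model, not the project's actual implementation: the function names (`generate_code`, `resolve`), the 7-character base62 scheme, and the in-memory store are all assumptions—the real service persists URLs in PostgreSQL and serves redirects via Flask.

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits  # base62, an assumed alphabet

def generate_code(length: int = 7) -> str:
    """Generate a random base62 short code (hypothetical scheme)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def resolve(store: dict, code: str):
    """Model the HTTP semantics the API describes: 302 with a Location
    for active links, 410 Gone for inactive links, 404 for unknown codes."""
    entry = store.get(code)
    if entry is None:
        return 404, None
    if not entry["is_active"]:
        return 410, None  # inactive links are Gone, not redirected
    return 302, entry["original_url"]

store = {
    "abc1234": {"original_url": "https://example.com", "is_active": True},
    "dead000": {"original_url": "https://example.com/old", "is_active": False},
}
print(resolve(store, "abc1234"))  # (302, 'https://example.com')
print(resolve(store, "dead000"))  # (410, None)
```

The key product detail is that the inactive check happens before the redirect, so a disabled link never leaks its target URL.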
How we built it
We used Python 3.13, Flask, and Peewee against PostgreSQL, packaged with uv and runnable via run.py or Docker Compose. The API is organized with Flask blueprints, Peewee models for User, Url, and Event, and careful validation (for example blocking dangerous URL schemes on shorten). We added Redis-backed caching where it helps hot paths, Nginx as a reverse proxy, and Locust for load tests and baseline notes. Observability uses a private Prometheus registry for the app process so metrics stay correct in multi-worker or test scenarios, plus Grafana dashboards and Alertmanager rules for latency, errors, and availability.
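The scheme check on shorten can be sketched with the standard library alone. The allow-list, the function name `is_safe_url`, and the requirement of a non-empty host are assumptions about how the validation works, not the project's exact code:

```python
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https"}  # assumed allow-list: HTTP(S) URLs only

def is_safe_url(url: str) -> bool:
    """Reject dangerous schemes (javascript:, data:, file:, ...) before
    shortening, and require a non-empty host component."""
    parsed = urlparse(url.strip())
    return parsed.scheme.lower() in ALLOWED_SCHEMES and bool(parsed.netloc)

print(is_safe_url("https://example.com/page"))  # True
print(is_safe_url("javascript:alert(1)"))       # False
print(is_safe_url("data:text/html,<h1>x</h1>")) # False
```

An allow-list beats a deny-list here: new dangerous schemes are rejected by default instead of requiring the list to be updated.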
Challenges we ran into
We hit the usual grader vs local differences: empty databases without tables, route paths the tests expect (no /api prefix), and redirect behavior (302, correct Location, inactive URLs as 410). Prometheus duplicate metric registration and process metrics tripped CI until we isolated metrics on a dedicated registry. Alertmanager does not substitute shell-style ${VAR} in YAML the way we first assumed, so wiring Discord required writing the webhook URL from the environment at container start. Matching hidden checks meant aligning event details with the seed CSV shape (e.g. original_url vs other key names) and emitting lifecycle events where the spec implied them.
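Since Alertmanager will not expand ${VAR} in its YAML, a tiny render step at container start does the substitution before the process launches. Python's string.Template happens to use the same ${VAR} syntax; the template fragment, receiver name, and webhook value below are illustrative assumptions, not our actual config:

```python
from string import Template

# Illustrative Alertmanager config fragment: Alertmanager itself would ship
# the literal text ${DISCORD_WEBHOOK_URL}, so we render the real value in
# from the environment before starting the process.
TEMPLATE = """\
receivers:
  - name: discord
    webhook_configs:
      - url: '${DISCORD_WEBHOOK_URL}'
"""

def render_config(template: str, env: dict) -> str:
    # Template.substitute raises KeyError on a missing variable, which
    # fails fast instead of shipping a literal ${...} to Alertmanager.
    return Template(template).substitute(env)

rendered = render_config(TEMPLATE, {"DISCORD_WEBHOOK_URL": "https://discord.example/hook"})
print(rendered)
```

In the container, the same idea runs in the entrypoint with the real environment, writing the rendered file to Alertmanager's config path before exec-ing the daemon.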
Accomplishments that we're proud of
We are proud of an API that is test-backed, seed-loadable, and compose-friendly end to end: from curl and pytest to a full stack with metrics and alerts. Getting redirect analytics right—recording events on successful redirects without breaking redirects or leaking events on inactive links—felt like a solid “product” detail. The monitoring story (metrics + dashboards + alert routing) goes beyond a demo JSON API and matches how we’d actually run a small service.
What we learned
We learned how much operational detail hides in a “simple” CRUD app: id generation, pagination, CSV bulk import, and strict HTTP semantics for redirects. Peewee + PostgreSQL stayed pleasant for rapid iteration, while Prometheus and Alertmanager taught us to read docs carefully for config semantics and runtime env injection. Load testing with Locust made tradeoffs visible (caching, connection pooling, duplicate app instances) in a way unit tests alone do not.
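The CSV bulk import mentioned above is essentially a DictReader loop with per-row validation. The column names (`name`, `email`) and the skip-on-missing-field policy are assumptions about the seed file's shape; the real service inserts rows via Peewee, while this sketch just collects them:

```python
import csv
import io

def import_users(csv_text: str) -> list[dict]:
    """Parse a user CSV into row dicts (assumed columns: name, email),
    skipping rows that are missing a required field."""
    reader = csv.DictReader(io.StringIO(csv_text))
    users = []
    for row in reader:
        if row.get("name") and row.get("email"):
            users.append({"name": row["name"].strip(),
                          "email": row["email"].strip().lower()})
    return users

sample = "name,email\nAda,ADA@example.com\n,missing@example.com\n"
print(import_users(sample))  # [{'name': 'Ada', 'email': 'ada@example.com'}]
```

Normalizing emails on the way in (strip + lowercase) keeps duplicate detection simple at the database layer.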
What’s next for URL Shortener
Short term: auth (or signed tokens) for creating and managing links, rate limiting at the edge, and custom short codes with collision handling and abuse controls. Medium term: async or queue-based analytics if event volume grows, read replicas or connection tuning for Postgres under sustained load, and SLO-based alerting tied to real user journeys. Longer term: a small admin UI, link expiration policies, and optional geo or referrer metadata in events—still privacy-conscious and opt-in where it matters.