Chord-less
Inspiration
Having grown up playing instruments like the piano, violin, and viola, we've always loved exploring new ways to make music. But jumping into completely different instruments often comes with a steep barrier to entry. Between researching the right gear, buying the actual instrument, and going through the initial learning curve, it's hard to justify the commitment just to see if you even enjoy it.
That's why we built Chord-less. We wanted to create a fun, easy, and completely accessible way to experience what playing guitar actually feels like, without the high stakes or the financial commitment. Everything runs on devices most people already own: you swing your phone to strum, with dynamic haptic vibrations simulating the feeling of hitting individual strings, and shape chords with your hand in front of your webcam, so Chord-less captures the real physical movements and tactile feedback of playing. With built-in features that let you listen to demos, follow a live tutorial mode to learn our simplified hand gestures, and record your own jam sessions, it gives anyone a fun way to make music. Basically, it lets you try the guitar: no strings attached (literally).
What It Does
Chord-less is a multi-device virtual instrument that transforms your everyday hardware into an interactive guitar practice experience. It seamlessly bridges real-time computer vision from your laptop's webcam with high-speed motion telemetry from an iPhone to replicate the physical mechanics of playing.
Instead of wrestling with the steep physical learning curve of a real fretboard, you use your left hand to form chord hand signs in front of the camera. Our live machine learning pipeline tracks your finger positions and instantly locks in the corresponding guitar chord. At the same time, your right hand physically swings your phone to strum. The app calculates your strum's speed, direction, and rotational depth, firing off dynamic haptic vibrations so you can actually feel the virtual strings as you play.
Everything ties together seamlessly on an animated web dashboard. It uses a custom physics-based audio synthesizer to generate rich, realistic string sounds that react perfectly to your phone's movements. Beyond free-play, the dashboard includes:
- A live tutorial mode to guide you through the chords
- Automated demo playback to showcase the instrument's capabilities
- The ability to record, save, and replay your jam sessions
How We Built It
Architecture
Chord-less is distributed across three core components (the iPhone, the laptop, and the active browser) that all communicate over WebSocket on a local network. Each device captures and handles unique pieces of data:
- The phone detects when and how you strum (using the iPhone's built-in IMU for tilt and acceleration).
- The laptop camera detects what chord your hand is forming.
- The browser synthesizes the actual guitar audio using Karplus-Strong synthesis, an algorithm used for creating realistic plucked-string and drum-like sounds.
The iPhone app is built in Swift, using CoreMotion for accelerometer and gravity data at 60 Hz and CoreHaptics for tactile feedback on each strum. The server is built in Python, with aiohttp for WebSocket handling, MediaPipe for hand landmark detection, and OpenCV/NumPy for frame decoding.
The laptop runs the lightweight Python server to act as the central message hub. It serves the browser dashboard as static files and maintains a single WebSocket endpoint at /ws that all clients connect to. The server inspects each incoming message's type field, updates a small shared state (current chord, last 25 strums, latest phone telemetry), and rebroadcasts to every connected client. To detect hand gestures, the browser sends a webcam snapshot, the server routes it through the MediaPipe hand tracker, and broadcasts the detected chord back out.
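For illustration, here's a minimal sketch of that hub pattern in aiohttp. The message type names and state fields below mirror the description above but are assumptions, not our exact schema:

```python
# Minimal aiohttp WebSocket hub sketch: one /ws endpoint, a small shared
# state, and a rebroadcast to every connected client. Field names are
# illustrative, not the actual Chord-less message schema.
import json
from aiohttp import web, WSMsgType

clients = set()
state = {"chord": None, "strums": [], "telemetry": None}

async def ws_handler(request):
    ws = web.WebSocketResponse()
    await ws.prepare(request)
    clients.add(ws)
    try:
        async for msg in ws:
            if msg.type != WSMsgType.TEXT:
                continue
            data = json.loads(msg.data)
            # Dispatch on the message's type field and update shared state.
            if data.get("type") == "chord":
                state["chord"] = data["name"]
            elif data.get("type") == "strum":
                state["strums"] = (state["strums"] + [data])[-25:]  # keep last 25
            elif data.get("type") == "telemetry":
                state["telemetry"] = data
            # Rebroadcast to every connected client (phone, browser, etc.).
            for client in list(clients):
                if not client.closed:
                    await client.send_json(data)
    finally:
        clients.discard(ws)
    return ws

app = web.Application()
app.router.add_get("/ws", ws_handler)
app.router.add_static("/", "./dashboard")  # serve the dashboard as static files

if __name__ == "__main__":
    web.run_app(app, port=8080)
```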
iPhone IMU Integration
The iPhone acts as the strum controller, which can be held like a guitar pick. It uses the iPhone's IMU (inertial measurement unit) to detect strum gestures with high fidelity.
The StrumDetector class reads CMDeviceMotion at 60 Hz and extracts two independent signals:
Strum triggers are detected from the acceleration vector. We compute the planar force magnitude across the X and Z axes (the axes that correspond to a lateral sweeping motion when the phone is held upright). Velocity is normalized from the force magnitude and capped at 1.0, giving a natural dynamic range where soft wrist flicks produce quiet strums and aggressive swings produce loud ones.
Rotation tracking runs continuously and independently of strum events. The phone's tilt angle (how far it's rotated from vertical) controls strum depth: how many strings are hit. We compute this from the gravity vector (see the sketch after this list):
- At 0° (phone upright), only one string is struck.
- At 90° (phone horizontal), all six strings ring out.
- This creates an expressive physical mapping: tilting your wrist wider literally opens up the chord.
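The real detector lives in Swift on top of CoreMotion; here's a rough Python sketch of the same math, with the trigger thresholds as assumed values:

```python
# Illustrative sketch (the real StrumDetector is Swift/CoreMotion).
# Maps raw accelerometer + gravity samples to strum velocity and depth.
import math

STRUM_THRESHOLD = 0.8   # assumed planar-force trigger threshold (g)
MAX_FORCE = 3.0         # assumed force that maps to full velocity

def strum_velocity(accel_x, accel_z):
    """Planar force magnitude across X/Z, normalized and capped at 1.0."""
    force = math.hypot(accel_x, accel_z)
    if force < STRUM_THRESHOLD:
        return None  # no strum triggered
    return min(force / MAX_FORCE, 1.0)

def strum_depth(gravity_x, gravity_y, gravity_z):
    """Tilt from vertical (0 deg = upright, 90 deg = horizontal) -> 1..6 strings."""
    # gravity_y is about -1 when the phone is held upright; the tilt angle
    # grows toward 90 degrees as the phone rotates toward horizontal.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, -gravity_y))))
    fraction = min(tilt / 90.0, 1.0)
    return 1 + round(fraction * 5)  # 1 string at 0 deg, all 6 at 90 deg
```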
Haptic Feedback
Every strum triggers a burst of CHHapticTransient events through CoreHaptics, with one tap per string hit, spaced proportionally to strum speed. A fast, full-chord downstroke produces a rapid-fire buzz of six taps across ~90ms; a gentle single-string touch is a single sharp click. This gives the player physical confirmation of both the intensity and breadth of each strum without looking at a screen.
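The scheduling itself is simple arithmetic. A hedged sketch, written in Python for readability (the actual code fires CHHapticTransient events in Swift, and the constants here are assumptions):

```python
# Illustrative haptic tap schedule (the real implementation drives
# CHHapticTransient events via CoreHaptics in Swift).
def haptic_schedule(num_strings, velocity, full_sweep_ms=90.0):
    """One tap per string hit; faster strums pack the taps closer together."""
    # A full-velocity six-string sweep spans ~90 ms; slower strums stretch it.
    sweep = (full_sweep_ms / 1000.0) / max(velocity, 0.1)
    gap = sweep / max(num_strings - 1, 1)
    return [(i * gap, velocity) for i in range(num_strings)]  # (time s, intensity)

print(haptic_schedule(6, 1.0))  # fast full-chord downstroke: six taps in ~90 ms
print(haptic_schedule(1, 0.3))  # gentle single-string touch: one soft tap
```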
Individual string plucks (triggered by tapping on the string visualization) fire a single maximum-intensity transient.
Computer Vision
Chord detection uses MediaPipe's Hand Landmarker model running on the laptop. The pipeline accepts frames from the browser's webcam (streamed as base64 JPEG over WebSocket) and classifies a five-finger state into one of eight guitar chords. Frames are sent at 150ms intervals, giving roughly 6-7 classifications per second, which is typically enough for chord changes during performance.
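A simplified sketch of the classification step, leaving out the smoothing described later. The finger-state templates below are invented for illustration and don't reflect our actual gesture set:

```python
# Illustrative: reduce MediaPipe's 21 hand landmarks to a five-finger
# extended/curled state, then match it against chord templates.
FINGER_TIPS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips
FINGER_PIPS = [2, 6, 10, 14, 18]   # the joint below each tip

CHORD_TEMPLATES = {
    # (thumb, index, middle, ring, pinky): 1 = extended, 0 = curled
    (0, 1, 0, 0, 0): "G",
    (0, 1, 1, 0, 0): "C",
    (0, 1, 1, 1, 0): "D",
    # ...five more templates for the remaining chords
}

def finger_state(landmarks):
    """landmarks: 21 (x, y) points; y grows downward in image coordinates."""
    state = []
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        # A finger counts as extended if its tip sits above its lower joint.
        state.append(1 if landmarks[tip][1] < landmarks[pip][1] else 0)
    return tuple(state)

def classify_chord(landmarks):
    return CHORD_TEMPLATES.get(finger_state(landmarks))  # None if no match
```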
Audio Playback (Karplus-Strong Algorithm)
Audio is generated entirely in the browser using the Web Audio API with a custom implementation of Karplus-Strong synthesis, a physically-modeled string synthesis technique that produces remarkably realistic plucked-string tones from a simple delay line.
The algorithm (sketched in code after this list) works by:
- Filling a circular buffer with random noise (simulating the initial energy of a pluck).
- Repeatedly averaging adjacent samples with a damping coefficient as the buffer loops.
- This gradually filters out high frequencies, mimicking the way a real string's harmonics decay over time.
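A minimal NumPy version of that loop (the production synth runs in JavaScript on the Web Audio API; the parameters here are illustrative):

```python
# Minimal Karplus-Strong sketch in NumPy (the real synth runs in the browser
# via the Web Audio API). Produces one plucked-string tone as float samples.
import numpy as np

def pluck(frequency, duration=1.0, sample_rate=44100, damping=0.996):
    period = int(sample_rate / frequency)       # delay-line length sets pitch
    buf = np.random.uniform(-1, 1, period)      # burst of noise = the pluck
    out = np.empty(int(sample_rate * duration))
    for i in range(len(out)):
        out[i] = buf[i % period]
        # Average adjacent samples, scaled by the damping coefficient:
        # this low-pass filters the loop so high harmonics decay first.
        buf[i % period] = damping * 0.5 * (buf[i % period] + buf[(i + 1) % period])
    return out

low_e = pluck(82.41)  # low E string at ~82 Hz
```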
When a strum event arrives, the engine doesn't just play all strings simultaneously. It selects which strings to play based on the rotation value (strum depth) and sequences them with a small inter-string delay that decreases with velocity, mimicking the physical lag of a pick sweeping across strings.
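Roughly, the sequencing logic looks like this (the delay constants and the use of open-string frequencies are simplifying assumptions):

```python
# Illustrative strum sequencing: pick which strings ring based on depth,
# and stagger their onsets based on velocity. Constants are assumed.
STANDARD_TUNING = [82.41, 110.0, 146.83, 196.0, 246.94, 329.63]  # E A D G B e

def schedule_strum(depth, velocity, direction="down", base_delay_ms=25.0):
    """depth: 1..6 strings to hit; faster strums -> tighter inter-string lag."""
    strings = STANDARD_TUNING[:depth]
    if direction == "up":
        strings = strings[::-1]  # upstrokes sweep high-to-low
    delay = base_delay_ms * (1.0 - 0.7 * velocity)  # ms between string onsets
    return [(i * delay / 1000.0, freq) for i, freq in enumerate(strings)]

# e.g. a fast four-string downstroke:
for start, freq in schedule_strum(depth=4, velocity=0.9):
    print(f"start {start * 1000:.1f} ms -> {freq} Hz")
```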
Challenges We Ran Into
Taming Jittery Computer Vision
MediaPipe hand-tracking is fast but incredibly noisy. This caused our chord detections to flicker wildly. We solved this by writing a custom temporal hysteresis loop: a finger's state only registers as changed if the detection holds true for several consecutive frames. We also built a "fuzzy" nearest-neighbor matching algorithm so that if a finger is slightly occluded, the app still locks onto the correct intended chord.
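Stripped down, the hysteresis looks something like this (the frame threshold is an assumed value):

```python
# Illustrative temporal hysteresis: a finger's state only flips after the
# new reading holds for several consecutive frames.
HOLD_FRAMES = 4  # assumed threshold

class DebouncedFinger:
    def __init__(self):
        self.state = 0        # committed state: 0 = curled, 1 = extended
        self.candidate = 0    # most recent raw reading
        self.streak = 0       # how many frames the candidate has held

    def update(self, raw):
        if raw == self.state:
            self.streak = 0           # nothing to change
        elif raw == self.candidate:
            self.streak += 1
            if self.streak >= HOLD_FRAMES:
                self.state = raw      # reading held long enough: commit it
                self.streak = 0
        else:
            self.candidate = raw      # a new candidate resets the streak
            self.streak = 1
        return self.state
```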
Synthesizing Procedural Audio
We didn't want to take the easy route of playing back bulky .mp3 samples. Instead, we built a mathematical synthesizer in JavaScript. Using the Karplus-Strong string algorithm, we procedurally generate the sound of a pick striking a string and feed it through finely tuned decay loops, so every strum is rendered live.
Mapping IMU Physics to Dynamic Haptics
We didn't just want to detect a "shake." We used the phone's gravity vectors to calculate rotational tilt, determining exactly how "deep" your pick dug into the strings. Then we synced this math with iOS CoreHaptics, firing off rapid bursts of concentrated vibrations spaced just milliseconds apart, so that as you swing, you physically feel the "pick" dragging across individual virtual strings.
Accomplishments That We're Proud Of
We're incredibly proud of how naturally Chord-less bridges the gap between hardware and software to create a genuinely playable instrument. Instead of just a binary "on/off" trigger, we successfully turned raw phone telemetry into highly expressive motion. The app calculates strum direction, measures velocity, and factors in depth-sensitive rotation, allowing users to rake across all six virtual strings or gently pick just two.
On the machine learning side, we shipped a highly practical gesture-to-chord pipeline. By implementing smart smoothing and fuzzy logic on top of the raw MediaPipe hand landmarks, our computer vision is remarkably resilient to shifting lighting and jitter, making the detection actually usable rather than just technically functional.
Connecting all of this required us to stitch together real-time data from an iPhone, a webcam, and a Python server, all feeding into a unified browser experience. We achieved a near-instantaneous loop that feels incredibly cohesive. Finally, rather than stopping at a single tech demo, we built a polished, end-to-end product stack featuring:
- A live webcam overlay
- A structured tutorial mode
- A robust session recorder/player that outputs custom .chordcastreplay files
Future Plans
Chord-less was built with a modular, multi-device architecture, allowing for exciting expansions. Future plans include:
- Per-Finger Fretboard Tracking: One major future improvement would be moving beyond a single hand gesture per chord and instead tracking each individual finger to simulate true fret placement. Rather than treating the left hand as a simplified chord selector, the system could map finger positions to specific strings and fret locations, much closer to how a real guitar is actually played.
- "Guitar Hero"-Style Rhythm Game: Intersecting structured tutorials, demo playback, and real-time phone/webcam telemetry to create a gamified, scrolling "beatmap" for scoring and combos based on precise chord and strum timing.
- Virtual Pedalboards and Amp Modeling: Chain Web Audio API nodes (like WaveShapers and Convolvers) to the custom-built acoustic guitar synthesizer to add digital effects (distortion, delay, reverb), transforming the tone into an electric guitar with a Marshall stack sound.
- Live Looping Stations & Multiplayer Jamming: Utilize the existing SessionRecorder and WebSocket hub to enable live looping (record a rhythm track, loop it, and overdub solos) and support remote multiplayer jamming with multiple phones and webcams connected to a single laptop.
Built With
- aiohttp
- corehaptics
- coremotion
- css3
- html5
- javascript
- mediapipe
- opencv
- python
- swift
- websockets