Inspiration
We were inspired by how tedious it is for video editors to "scrub" through timelines hunting for the perfect beat to cut on. While many new AI tools try to solve this by generating the entire video for you, we felt this robbed creators of their artistic agency. We wanted to build a tool that acts as a digital metronome: it hands editors every single beat while leaving them free to choose how to cut with them.
What it does
Synchro is an audio analysis tool that instantly visualizes the rhythmic structure of any song. Users can upload an audio file to see a precise, zoomable waveform with beat markers overlaid in real time. It features a "Live Preview" mode that demonstrates the sync accuracy by editing a stock video to the beat. It also allows creators to export a .csv marker file compatible with popular editing software such as Adobe Premiere Pro and Final Cut Pro.
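As a rough illustration of the marker export, here is a minimal Python sketch. The two-column layout (marker name, start time in seconds) is an assumption for illustration; the exact columns an NLE's marker import expects vary by workflow and are not specified here.

```python
import csv

def export_markers_csv(beat_times, path):
    """Write detected beat timestamps to a CSV marker file.

    NOTE: the column layout below is an illustrative assumption,
    not Synchro's actual export schema.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Marker Name", "Start (seconds)"])
        for i, t in enumerate(beat_times, start=1):
            writer.writerow([f"Beat {i}", f"{t:.3f}"])
```

A call like `export_markers_csv([0.5, 1.0, 1.5], "markers.csv")` would then produce one labeled row per beat.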
How we built it
We built Synchro using React (Vite + TypeScript) with Tailwind CSS for the frontend, and Python for the backend. The core logic relies on Librosa, a Python library that analyzes the audio's spectral flux to detect onset strength and calculate BPM. On the frontend, we used Wavesurfer.js to render the interactive waveform and implemented a custom algorithm to sync the video player to the detected beats in real time.
Challenges we ran into
The biggest technical hurdle was implementing the "Live Preview" feature without causing playback stutter. We initially had issues where the video player couldn't keep up with the rapid audioprocess events from the waveform, causing it to miss cuts or lag behind the beat. We solved this by anticipating the next cut window and checking whether the playback head had passed a threshold, which smoothed the transitions.
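The actual implementation lives in the TypeScript frontend, but the look-ahead idea can be sketched language-agnostically. The `BeatScheduler` class and 50 ms window below are hypothetical names and values for illustration: instead of waiting to land exactly on a beat, a cut fires as soon as the playhead enters a small window before it.

```python
def should_cut(playhead: float, next_beat: float, lookahead: float = 0.05) -> bool:
    """Fire the cut once the playhead enters the look-ahead window
    before the next beat, rather than exactly on the beat."""
    return playhead >= next_beat - lookahead

class BeatScheduler:
    """Hypothetical sketch of the anticipation logic described above."""

    def __init__(self, beat_times, lookahead: float = 0.05):
        self.beats = list(beat_times)
        self.lookahead = lookahead
        self.index = 0  # next beat not yet cut

    def on_time_update(self, playhead: float):
        """Called on each audio clock tick; returns beats now due for a cut."""
        due = []
        while self.index < len(self.beats) and should_cut(
            playhead, self.beats[self.index], self.lookahead
        ):
            due.append(self.beats[self.index])
            self.index += 1
        return due
```

Because the scheduler only ever advances an index, a late or coarse clock tick collects every missed beat at once instead of dropping cuts.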
Accomplishments that we're proud of
We're proud of the precision of the snap-to-beat detection algorithm as well as the visual design of the UI. We are also proud that we managed to implement a fully functional export feature in such a short time frame.
What we learned
We learned a lot about digital signal processing and how Fourier transforms are used to isolate beat onsets from raw audio data. We also gained experience in connecting a Python backend and a React frontend.
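The core idea we picked up, half-wave-rectified spectral flux, can be shown directly with NumPy: take the FFT magnitude of successive windowed frames, keep only the frame-to-frame increases, and sum across frequency bins. Peaks in the result mark onsets. This is a from-scratch sketch of the concept, not Librosa's actual implementation.

```python
import numpy as np

def spectral_flux(y: np.ndarray, n_fft: int = 2048, hop: int = 512) -> np.ndarray:
    """Half-wave-rectified spectral flux of a mono signal.

    Each output value sums the positive frame-to-frame changes in FFT
    magnitude across all frequency bins; peaks correspond to onsets.
    """
    n_frames = 1 + (len(y) - n_fft) // hop
    window = np.hanning(n_fft)
    frames = np.stack(
        [y[i * hop : i * hop + n_fft] * window for i in range(n_frames)]
    )
    mags = np.abs(np.fft.rfft(frames, axis=1))
    # Keep only energy increases: rises, not decays, signal new onsets
    diff = np.diff(mags, axis=0)
    return np.maximum(diff, 0.0).sum(axis=1)
```

Running this on one second of silence followed by one second of a sine tone produces a sharp flux peak at the frames where the tone begins.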
What's next for Synchro
The immediate next step is to build a direct plugin extension for professional editing software such as Adobe Premiere Pro and DaVinci Resolve so editors can fetch beats without leaving their timeline. We also plan to implement downbeat detection to distinguish the first beat of each measure (the "one") from the rest, allowing for even more musical editing suggestions.
Built With
- librosa
- numpy
- python
- react
- tailwindcss
- typescript
- vite
- wavesurfer.js