Inspiration

The project draws inspiration from the fusion of art, music, and technology exemplified by Jean-Michel Jarre's innovative laser harp, which merges traditional musical elements with cutting-edge engineering. This is my third hackathon, and I wanted to build something hardware-based.

What it does

The project uses computer vision to track red objects on a screen and trigger corresponding musical notes. It leverages OpenCV for real-time object detection, Pygame for sound playback, and a dynamic red detection algorithm to adapt to varying lighting. The object tracker assigns each red object to a note, playing it when detected. Data is logged and saved in a CSV file for playback via a GUI. The system uses a hand-tracking feature for volume control and integrates an interactive interface for sound playback options. It automates the detection, sound playback, and recording process, offering a complete interactive experience.
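
To illustrate the core idea, here is a tiny, self-contained sketch (not the project's actual code): a detected red object's x-position picks a note, the note is "played" (printed here), and the event is logged with a timestamp for later replay. The note names, frame width, and stand-in detections are all assumptions for the demo.

```python
import time

NOTES = ["C4", "D4", "E4", "F4", "G4"]   # assumed note set
FRAME_WIDTH = 1280                        # assumed camera resolution

def note_for(x):
    """Split the frame into equal horizontal bands, one band per note."""
    return NOTES[min(x * len(NOTES) // FRAME_WIDTH, len(NOTES) - 1)]

# Stand-in detections (x, y) for the demo; the real system gets these from OpenCV.
detections = [(150, 300), (640, 310), (1100, 295)]

log = []
for x, _y in detections:
    note = note_for(x)
    log.append((note, time.time()))       # logged rows later go into the CSV
    print(f"play {note} for object at x={x}")
```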

How we built it

The hardware is a box of scraps from B&Q plus a glue gun, tape, and lasers. The code is written in Python.

Object Detection: Used OpenCV with the HSV colour space and CLAHE (adaptive histogram equalisation) for red object tracking, ensuring it worked in varying lighting conditions.
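
The snippet below is a rough sketch of that detection step, assuming a standard BGR webcam frame; the HSV thresholds and minimum contour area are illustrative values, not the tuned ones used on the night.

```python
import cv2

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

def find_red_points(frame_bgr):
    """Return the centre (x, y) of each sufficiently large red blob."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v = clahe.apply(v)                       # locally equalise brightness
    hsv = cv2.merge([h, s, v])
    # Red wraps around the hue axis, so combine a low and a high range.
    mask = cv2.bitwise_or(
        cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)),
        cv2.inRange(hsv, (170, 120, 120), (180, 255, 255)),
    )
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if cv2.contourArea(c) > 20:          # skip tiny specks of noise
            x, y, w, hh = cv2.boundingRect(c)
            points.append((x + w // 2, y + hh // 2))
    return points
```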

Sound Integration: Integrated Pygame to play musical notes based on detected red objects, mapping each detected object's position to a specific note.
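
A minimal sketch of that Pygame side, assuming one pre-recorded WAV per note; the file names here are placeholders, not the project's actual assets:

```python
import pygame

pygame.mixer.init()

# Hypothetical note-to-sample mapping; the real project maps detected
# object positions to entries like these.
NOTE_SOUNDS = {
    "C4": pygame.mixer.Sound("c4.wav"),
    "D4": pygame.mixer.Sound("d4.wav"),
    "E4": pygame.mixer.Sound("e4.wav"),
}

def play_note(name, volume=1.0):
    sound = NOTE_SOUNDS[name]
    sound.set_volume(volume)   # 0.0 = silent, 1.0 = full volume
    sound.play()               # non-blocking: mixes with other channels
```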

Gesture Control: Leveraged MediaPipe for hand tracking to control volume – closer hand = quieter, farther = louder.
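
A sketch of how that could look with MediaPipe Hands, using the apparent size of the hand in the frame as a rough proxy for distance (bigger hand = closer = quieter). The landmark indices are MediaPipe's, but the span-to-volume scaling is an assumption:

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)

def volume_from_hand(frame_bgr):
    """Return a volume in [0, 1], or None if no hand is visible."""
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    lm = results.multi_hand_landmarks[0].landmark
    # Wrist (0) to middle-finger base (9): a larger span means the hand is closer.
    span = ((lm[0].x - lm[9].x) ** 2 + (lm[0].y - lm[9].y) ** 2) ** 0.5
    # Closer hand (large span) -> quieter; farther hand (small span) -> louder.
    return 1.0 - min(max((span - 0.05) / 0.25, 0.0), 1.0)
```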

Recording & Playback: Implemented CSV recording to log notes and timestamps, allowing users to replay their sessions.
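
A sketch of the recording format, assuming one CSV row per note event with a Unix timestamp; the real column layout may differ:

```python
import csv
import time

def record_session(events, path="session.csv"):
    """events: list of (note_name, unix_timestamp) tuples."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["note", "timestamp"])
        writer.writerows(events)

def replay_session(path, play_note):
    """Replay a recorded session, preserving the gaps between notes."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    previous = float(rows[0]["timestamp"]) if rows else 0.0
    for row in rows:
        now = float(row["timestamp"])
        time.sleep(max(now - previous, 0))   # wait out the original gap
        previous = now
        play_note(row["note"])               # e.g. the Pygame helper above
```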

Real-Time Performance: Ensured smooth, lag-free performance using multithreading to handle object detection, sound playback, and hand tracking simultaneously.
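
The threading layout might look roughly like this: detection and hand tracking each run on their own daemon thread and feed the main thread, which handles playback. The function arguments are placeholders standing in for the components sketched above:

```python
import queue
import threading

note_queue = queue.Queue()
shared = {"volume": 1.0}        # written by the gesture thread, read by playback

def detection_worker(get_frame, find_red_points, note_for):
    while True:
        for x, _y in find_red_points(get_frame()):
            note_queue.put(note_for(x))

def gesture_worker(get_frame, volume_from_hand):
    while True:
        level = volume_from_hand(get_frame())
        if level is not None:
            shared["volume"] = level

def run(get_frame, find_red_points, note_for, volume_from_hand, play_note):
    threading.Thread(target=detection_worker,
                     args=(get_frame, find_red_points, note_for), daemon=True).start()
    threading.Thread(target=gesture_worker,
                     args=(get_frame, volume_from_hand), daemon=True).start()
    while True:                  # main thread: pull detected notes and play them
        play_note(note_queue.get(), shared["volume"])
```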

Challenges we ran into

The camera took four minutes to start, which made testing difficult.

Dynamic Lighting for Laser Detection: Getting the right settings to detect the laser accurately in changing lighting conditions took a lot of time and experimentation. Fine-tuning the HSV range and using adaptive histogram equalisation helped, but getting consistent results across different environments was still a challenge.

Accomplishments that we're proud of

Real-Time Sound Synchronisation: I successfully synchronised red object detection with musical note playback in real time, providing a seamless interactive experience.

Gesture Volume Control: The hand-tracking feature, which controls the volume based on hand position, turned out to be intuitive and effective.

What we learned

Multithreading & Performance Optimisation: Handling multiple processes like detection, audio playback, and gesture recognition at the same time taught us a lot about multithreading and real-time performance optimisation.

Fine-Tuning Detection Algorithms: The importance of tuning detection algorithms to work under different lighting conditions became clear, requiring both hardware and software adjustments.

What's next for Laser Music Machine

MORE INSTRUMENTS, OF COURSE!

Built With

Python, OpenCV, Pygame, MediaPipe