Inspiration
Goosiance was inspired by the desire to deepen viewer engagement in movies and visual entertainment. As technology evolves, immersive experiences have become a priority, and we saw an opportunity to combine emotion and visual technology. We wanted to take it a step further by allowing viewers not just to watch but to feel the emotional tone of the scenes, enhancing the overall experience.
What it does
Goosiance is an LED ambient lighting system that dynamically adjusts lighting based on the emotional tone of a scene in real time. It implements an algorithm that analyzes movies and translates their emotional content into corresponding lighting effects, immersing the viewer in the atmosphere of the movie, show, or video.
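As a rough illustration of the core idea, a detected emotion label can be mapped to an RGB color for the LED strip. The labels, palette, and intensity scaling below are hypothetical, not Goosiance's actual mapping:

```python
# Illustrative emotion → RGB palette (not Goosiance's actual values).
EMOTION_COLORS = {
    "joy": (255, 200, 0),       # warm yellow
    "fear": (80, 0, 120),       # dark violet
    "sadness": (0, 60, 180),    # deep blue
    "anger": (200, 20, 20),     # red
    "neutral": (255, 255, 255), # plain white
}

def emotion_to_rgb(label, intensity=1.0):
    """Scale the base color for `label` by a 0..1 intensity score;
    unknown labels fall back to neutral white."""
    base = EMOTION_COLORS.get(label, EMOTION_COLORS["neutral"])
    clamped = max(0.0, min(1.0, intensity))
    return tuple(int(c * clamped) for c in base)
```

A scene scored as mildly angry, for example, would yield a dimmed red rather than a full-brightness one.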
How we built it
We built Goosiance using a combination of emotion-detection algorithms and embedded LED technology. The system captures emotional cues from the content, interprets them using AI-based analysis, and then triggers LED lights to change according to the emotional tone. The embedded system is designed for seamless integration with home theaters and other visual setups, ensuring the lighting is synchronized with the media being displayed.
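The backend-to-LED handoff described above could work by encoding each color frame as a compact serial command for the Arduino firmware to parse. The wire format ("C r,g,b\n"), baud rate, and port name below are assumptions for illustration, not Goosiance's actual protocol:

```python
def make_led_command(rgb):
    """Encode an RGB frame as a newline-terminated ASCII serial command,
    clamping each channel to 0..255. Format 'C r,g,b\\n' is hypothetical."""
    r, g, b = (max(0, min(255, c)) for c in rgb)
    return f"C {r},{g},{b}\n".encode("ascii")

# On the host, the command could be written to the board with pyserial, e.g.:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 115200)  # port/baud are assumptions
#   port.write(make_led_command((255, 120, 0)))
```

Keeping the protocol to one short line per frame makes the Arduino-side parser trivial and helps the lighting stay synchronized with the video.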
Challenges we ran into
One of the main challenges we faced was accurately detecting and interpreting emotions from visual content in real time. Ensuring that the LED system responded quickly and precisely to changes in the scene without lag was another hurdle. Additionally, linking the front-end and back-end components proved incredibly difficult.
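One common way to keep lighting responsive without flickering on rapid scene changes is to low-pass filter successive color frames. This exponential-smoothing sketch is a generic illustration of that idea, not Goosiance's actual implementation; the smoothing factor is an assumed value:

```python
def smooth(prev, target, alpha=0.3):
    """Move each channel of `prev` a fraction `alpha` of the way toward
    `target`; alpha is a hypothetical tuning parameter (0 = frozen,
    1 = jump immediately to the new color)."""
    return tuple(round(p + alpha * (t - p)) for p, t in zip(prev, target))
```

Running each new frame through such a filter trades a small amount of latency for far steadier transitions between emotional tones.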
Accomplishments that we're proud of
We’re proud of successfully creating a system that delivers a seamless immersive experience. The LED lights change in perfect harmony with the emotional shifts in the scenes, adding a new layer of depth to the viewing experience. A major accomplishment was the successful integration of both front-end and back-end work along with hardware components, ensuring everything works cohesively and efficiently.
What's next for Goosiance
Looking forward, we plan to expand Goosiance by refining the emotion detection algorithms for even more nuanced responses. We also aim to explore integrating additional sensory feedback, such as sound or motion, to enhance the immersive experience further. Additionally, we’re working on making the system even more customizable, allowing users to fine-tune their experiences for specific types of content or personal preferences.
Built With
- arduino
- c++
- flask
- next.js
- python
- tailwindcss
- typescript