Inspiration

Our inspiration came from bonsai trees and plant behavior. Plants don’t communicate with words—they communicate through subtle physical changes like wilting, blooming, growing, or swaying. Those visual signals reflect the plant’s health and environment.

We connected this idea to human body language. During interviews, presentations, and meetings, people also communicate a lot without speaking—through posture, facial expressions, eye contact, and hand movements.

What it does

BONS.AI is a web/desktop application that helps people improve their presentation and interview skills by analyzing body language in real time. Using your webcam, the system tracks:

  • Facial expressions
  • Posture
  • Eye direction (looking down vs. attentive)
  • Hand movement

The app provides sentiment and engagement feedback and visualizes it through a plant avatar that reacts to your behavior:

  • Good posture → strong, upright stem
  • Smiling → flowers bloom
  • Confidence/engagement → plant grows
  • Slouching or looking down → plant wilts or droops
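The behavior-to-avatar mapping above could be expressed as a small pure function. This is a hypothetical sketch, not the actual implementation: the signal names, 0–1 score ranges, and thresholds are all illustrative assumptions.

```python
# Hypothetical sketch of the signal-to-plant mapping described above.
# Signal names, score ranges (0..1), and thresholds are assumptions.

def plant_state(posture: float, smile: float, engagement: float,
                looking_down: bool) -> dict:
    """Map normalized body-language signals to plant avatar traits."""
    return {
        "stem": "upright" if posture >= 0.6 else "drooping",
        "flowers": smile >= 0.5,                 # smiling -> flowers bloom
        "growth": engagement,                    # engagement drives growth
        "wilting": looking_down or posture < 0.4,
    }

# Good posture plus a smile yields an upright, blooming plant.
print(plant_state(posture=0.9, smile=0.8, engagement=0.7, looking_down=False))
```

Keeping the mapping a pure function of the current signals makes the avatar easy to test and to re-render for any moment in a recorded session.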

Users can also review past sessions through a timeline of their plant’s growth, showing moments where posture or attentiveness dropped. The goal is to help people practice interviews, presentations, or meetings and improve their nonverbal communication skills.
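The timeline review could be backed by a simple per-second log of scores that is scanned for drops. This is a hedged sketch under assumed data shapes: the field names (`t`, `posture`, `attention`) and the 0.4 threshold are illustrative, not taken from the project.

```python
# Hypothetical sketch of the session timeline described above: scan a
# per-second log of scores and flag moments where a signal dropped.
# Field names and the 0.4 threshold are illustrative assumptions.

def flag_drops(timeline: list[dict], key: str, threshold: float = 0.4) -> list[int]:
    """Return timestamps (seconds) where the given score fell below threshold."""
    return [frame["t"] for frame in timeline if frame[key] < threshold]

session = [
    {"t": 0, "posture": 0.9, "attention": 0.8},
    {"t": 1, "posture": 0.3, "attention": 0.7},  # slouching moment
    {"t": 2, "posture": 0.8, "attention": 0.2},  # looked away
]
print(flag_drops(session, "posture"))  # -> [1]
```

The flagged timestamps would then mark the points on the plant-growth timeline where posture or attentiveness dipped.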

How we built it

We designed the interface in Figma, then brought the design into Figma Make to prototype the webcam tracking and the interactive plant behavior.

Challenges we ran into

One of the main challenges we faced was understanding the capabilities and limitations of the tools we were using, especially Figma Make and webcam tracking technology. Since Figma is primarily a design and prototyping tool, it took time to figure out how far we could push it when simulating interactive elements like the real-time plant visualization reacting to body language.

Accomplishments that we're proud of

One of our biggest accomplishments is creating a system that turns abstract body language feedback into something visual and intuitive.

Instead of charts or numbers, users see a living plant that reacts to them in real time, which makes feedback easier to understand and more engaging. We're especially proud of this given that our team of two completed it in about six hours.

What we learned

Through building BONS.AI, we learned that nonverbal communication plays a huge role in how people present themselves.

We also learned how powerful visual metaphors can be for feedback. A plant wilting or blooming communicates improvement or disengagement more naturally than numbers or scores.

Technically, we learned how to combine computer vision, behavioral analysis, and interactive design to create a system that feels both analytical and expressive.

What's next for BONS.AI

Next, we want to expand BONS.AI by adding more features, including:

  • More detailed plant ecosystems that evolve as users improve their communication skills
  • Advanced meeting analytics that highlight specific moments where posture or engagement changed
  • Better integrations with video platforms to allow seamless practice during real or simulated meetings
  • Personalized coaching suggestions based on patterns in body language

Built With

  • figma