In the last half-century, humans have come up with so many ways to interact with technology and each other, online and in person, that we take for granted the amount of control we have over how the world is presented to us. From chatrooms to VR conferences, from our cell phones to our cars, it takes more than two hands to count the number of times we interface with our technology, or with people by way of technology, in a day. Yet one thing the technological and cultural revolution has barely touched is how we interact with the spaces around us. Buildings, rooms, and hallways are just means to an end: safe transit from point A to point B, shelter, places for gathering. There is no reason we shouldn't be able to interact with our surroundings the same way we interact with our phones, and no reason that facilitating this can't gain us just as much as the smartphone did when it became the norm.

Jetstream transforms an event space into an interactive experience. Sitting in the middle of the room, it uses computer vision to find the largest group of people and launches paper airplanes at them at random intervals. Just by looking at the accumulation of paper airplanes after an event, one can track the movement of groups over its course. The airplane color can even be changed as the event goes on, giving a quantitative look into crowd dynamics and producing valuable data in a nonintrusive way.

This also creates an innovative way to disseminate information. Imagine you're at an art exhibit and you get hit with a paper airplane. If there's some sort of writing on it, there's no way you don't pick it up and read it. These are just a few of the countless possibilities. Jetstream and products like it have the potential to transform the way we go about our lives by letting us interact with the spaces around us.
What it does
Ideally: The Jetson board rotates a platform holding a webcam and a paper-plane cannon. It takes a few photos as it scans the room, then runs object detection on each one. It counts all detected objects matching a search term; in our case the term is "person", so each photo gives a head count for one section of the room. The Jetson then picks the section with the most people and launches a paper airplane in that direction.
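The ideal scan-and-aim loop can be sketched in a few lines. Everything here is illustrative rather than our actual hardware code: the detections are hypothetical (label, confidence) pairs standing in for the model's output, and the section angles are made-up placeholders.

```python
# Sketch of the ideal behavior: scan several sections of the room, count
# "person" detections in each, and aim at the densest one. The angles and
# the detection format are assumptions for illustration.

SECTION_ANGLES = [0, 60, 120, 180, 240, 300]  # degrees, one photo per section

def best_aim(section_detections, search_term="person", min_conf=0.5):
    """Return the platform angle pointing at the section with the most
    detected objects matching the search term."""
    counts = [
        sum(1 for label, conf in dets if label == search_term and conf >= min_conf)
        for dets in section_detections
    ]
    best = max(range(len(counts)), key=counts.__getitem__)
    return SECTION_ANGLES[best]
```

Ties go to the first section scanned, which is fine for this purpose: any crowded section is a good target.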
Actually: The Jetson takes a photo and counts the number of people in it. It can take in any search term and count the number of those objects found in the image. Separately, two wheels are mounted on a rotating platform and can propel a paper airplane forward a little bit. The whole thing is situated in a cool metal structure that would be an ideal centerpiece for any room.
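What the software actually does reduces to counting the objects in one photo that match a search term. A minimal sketch, assuming the detector returns (label, confidence) pairs; the threshold value is an assumption, not tuned on our hardware:

```python
# Count detections in a single photo that match the search term.
# The (label, confidence) pair format and the 0.5 threshold are
# illustrative assumptions, not the real model's interface.

def count_objects(detections, search_term, min_conf=0.5):
    """Return how many detections match the term above a confidence floor."""
    return sum(1 for label, conf in detections
               if label == search_term and conf >= min_conf)
```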
How we built it
Blood, sweat, and tears.
Challenges we ran into
The connection to the Jetson and the image processing went fairly smoothly. We ran into a lot of issues with the motors and the airplane mechanism. The only wheels we could find were treaded rather than smooth, and there were no fasteners left to attach things with. We got the wheels to spin, but there was too much friction to send the airplane flying - it just kind of sinks to the ground, a little sadly.
In addition, we ran into a fair number of issues with the servo used to rotate the platform. Our first servo was burned out, and then we weren't signaling its replacement properly. Eventually we ran out of time to fix these issues.
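For anyone hitting the same signaling problem: a hobby servo expects a roughly 50 Hz PWM signal whose pulse width encodes the target angle. A minimal sketch of the angle-to-duty-cycle math, assuming the common 1.0-2.0 ms pulse range (exact limits vary per servo and weren't measured on our hardware):

```python
# Map a servo angle to a PWM duty cycle. The 50 Hz period and the
# 1.0-2.0 ms pulse endpoints are common hobby-servo defaults, assumed
# here for illustration rather than taken from our specific servo.

SERVO_PERIOD_MS = 20.0   # 50 Hz signal
MIN_PULSE_MS = 1.0       # ~0 degrees
MAX_PULSE_MS = 2.0       # ~180 degrees

def duty_cycle_for_angle(angle_deg):
    """Return the duty-cycle percentage that commands the given angle."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to servo range
    pulse = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle_deg / 180.0
    return 100.0 * pulse / SERVO_PERIOD_MS
```

Driving the pin with a plain high/low signal, or with PWM at the wrong frequency, leaves the servo twitching or stalled, which matches what we saw.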
Accomplishments that we're proud of
We did get a running system on the Jetson that takes photos and analyzes them with a pre-trained object-detection model, taking advantage of the Jetson's GPU to keep inference fast. It then takes a desired search term and counts the number of occurrences in the photo.
We're also really proud of the mechanical aspects of our system. Although there were issues in combining everything, we had a solid container and stand for our setup, as well as platforms for rotation and a block to angle the airplane thrower.
What we learned
We all learned a whole lot from this project. Our team consisted of a Mech-E, an EE, and a CS major, but each of us worked across our disciplines to tackle issues we encountered along the way. We learned about computer vision, motor control, and the differences between (and importance of) various power supplies. We also learned about mechanical design, friction, and integrating mechanical and software components.
What's next for JetStream
Fold the paper airplanes.