When we study physics, we use many instruments to quantify our observations about the nature and properties of matter and energy. One such instrument is the ultrasonic sensor, which gauges the distance from the sensor to an object. Unfortunately, ready-to-use ultrasonic sensors for measuring distance, velocity, and acceleration are expensive and not readily available in classrooms. Even in a well-equipped high school physics class, there may be only 4 sensors to share among 31 students!
To solve this dilemma, we decided to create VisionMotion. It serves as an alternative to ultrasonic sensors, instead using the camera on your phone to do the same job: find distance, velocity, and acceleration. We had also recently taken an interest in experimenting with OpenCV, and this was the perfect opportunity.
What it does
The VisionMotion app was built to measure the kinematic motion of objects over time. After prompting for user input (the largest dimension of the measured object), the application processes camera frames and uses that known length as a pixel-to-real-world ratio to calculate and output downloadable displacement, velocity, and acceleration graphs along with spreadsheets.
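The core of that ratio idea can be sketched in a few lines of Java. This is an illustrative sketch, not VisionMotion's actual code: the class and method names (`PixelScale`, `metersPerPixel`, `toMeters`) are ours, and the numbers are made up for the example.

```java
// Illustrative sketch of the pixel-to-real-world conversion the app relies on.
public class PixelScale {
    /** Known real-world length of the object (m) divided by its length in pixels. */
    static double metersPerPixel(double knownLengthMeters, double lengthInPixels) {
        return knownLengthMeters / lengthInPixels;
    }

    /** Convert a pixel displacement between two frames into meters. */
    static double toMeters(double pixelDisplacement, double scale) {
        return pixelDisplacement * scale;
    }

    public static void main(String[] args) {
        // A 0.5 m object spanning 256 px gives the scale in m/px.
        double scale = metersPerPixel(0.5, 256.0);
        // An object that moves 512 px between frames has moved 1 m.
        System.out.println(toMeters(512.0, scale)); // prints 1.0
    }
}
```

Because the user supplies the object's real size once, every later pixel measurement converts to meters with a single multiplication.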
The flow of the app is as follows:
1 - Open app and see landing page
2 - Enter the length of the object to track
3 - Calibrate the camera to detect the object
4 - Begin collecting data
5 - Display data on graphs
6 - Download data as CSV
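The final step of the flow, exporting as CSV, can be sketched like this. The column names and the `buildCsv` helper are assumptions for illustration, not the app's actual export code.

```java
// Illustrative sketch of the CSV export step (step 6 above).
public class CsvExport {
    /** Build a CSV string with one row per sample: time, position, velocity, acceleration. */
    static String buildCsv(double[] t, double[] x, double[] v, double[] a) {
        StringBuilder sb = new StringBuilder("t (s),x (m),v (m/s),a (m/s^2)\n");
        for (int i = 0; i < t.length; i++) {
            sb.append(t[i]).append(',').append(x[i]).append(',')
              .append(v[i]).append(',').append(a[i]).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String csv = buildCsv(new double[]{0.0, 0.5},
                              new double[]{0.0, 1.0},
                              new double[]{2.0, 2.0},
                              new double[]{0.0, 0.0});
        System.out.print(csv);
    }
}
```

A plain-text CSV keeps the data importable into LoggerPro, Excel, or Google Sheets without any extra tooling.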
How we built it
VisionMotion was built in Android Studio, programmed in Java, and relies on OpenCV for computer-vision image processing and OpenGL for drawing graphs of the data.
OpenCV takes in camera frames at a certain FPS in the form of a matrix (`Mat`). We then manipulated the matrices using the OpenCV API to find the object, filter out noise, and encircle the object. It’s a complicated process, but we were able to tackle it by dividing it into smaller tasks.
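Those smaller tasks (threshold, denoise, locate) can be sketched on a plain grayscale array in pure Java. This toy version only stands in for the real OpenCV calls (`Core.inRange`, `Imgproc.erode`, `Imgproc.minEnclosingCircle`); all names here are illustrative, and a real frame would be an OpenCV `Mat` rather than a `double[][]`.

```java
import java.util.ArrayList;
import java.util.List;

// Toy version of the per-frame pipeline: threshold -> erode -> enclosing circle.
public class TrackSketch {
    /** Step 1: keep only pixels whose intensity falls in the target range. */
    static boolean[][] threshold(double[][] frame, double lo, double hi) {
        int h = frame.length, w = frame[0].length;
        boolean[][] mask = new boolean[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                mask[y][x] = frame[y][x] >= lo && frame[y][x] <= hi;
        return mask;
    }

    /** Step 2: erode with a 3x3 kernel so isolated noise pixels disappear. */
    static boolean[][] erode(boolean[][] mask) {
        int h = mask.length, w = mask[0].length;
        boolean[][] out = new boolean[h][w];
        for (int y = 1; y < h - 1; y++)
            for (int x = 1; x < w - 1; x++) {
                boolean all = true;
                for (int dy = -1; dy <= 1 && all; dy++)
                    for (int dx = -1; dx <= 1 && all; dx++)
                        all = mask[y + dy][x + dx];
                out[y][x] = all;
            }
        return out;
    }

    /** Step 3: circle the surviving blob: its centroid plus the max distance to it. */
    static double[] enclosingCircle(boolean[][] mask) {
        List<int[]> pts = new ArrayList<>();
        for (int y = 0; y < mask.length; y++)
            for (int x = 0; x < mask[0].length; x++)
                if (mask[y][x]) pts.add(new int[]{x, y});
        double cx = 0, cy = 0;
        for (int[] p : pts) { cx += p[0]; cy += p[1]; }
        cx /= pts.size(); cy /= pts.size();
        double r = 0;
        for (int[] p : pts)
            r = Math.max(r, Math.hypot(p[0] - cx, p[1] - cy));
        return new double[]{cx, cy, r};
    }

    public static void main(String[] args) {
        double[][] f = new double[7][7];
        for (int y = 2; y <= 4; y++)
            for (int x = 2; x <= 4; x++) f[y][x] = 1.0; // bright 3x3 object
        f[5][1] = 1.0; // stray noise pixel
        double[] c = enclosingCircle(erode(threshold(f, 0.5, 1.5)));
        System.out.println(c[0] + "," + c[1] + "," + c[2]); // blob centre survives, noise does not
    }
}
```

The object's circle centre, tracked frame to frame, is the pixel position that the ratio from the user's input converts into real-world displacement.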
Challenges we ran into
One of the major problems we ran into was the inconsistency of the camera’s frames-per-second rate. When trying to graph the position and differentiate it to get velocity and acceleration, we could not use the collected values directly, since the delay between frames was sometimes as long as 0.1 s and as short as 0.05 s. We overcame this challenge by interpolating our data and using a curve of best fit, though this was hard to implement because it relied on fairly advanced mathematics.
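The fix can be sketched with a simplified linear-interpolation version: resample the irregular position samples onto a uniform time grid, then differentiate. The real app fits a curve of best fit rather than straight lines, and the class and method names here are illustrative.

```java
// Sketch of the resample-then-differentiate fix for irregular frame timing.
public class Resample {
    /** Linearly interpolate positions taken at irregular times onto a uniform dt grid. */
    static double[] resample(double[] t, double[] x, double dt, int n) {
        double[] out = new double[n];
        int j = 0;
        for (int i = 0; i < n; i++) {
            double ti = t[0] + i * dt;
            while (j < t.length - 2 && t[j + 1] < ti) j++; // advance to bracketing interval
            double frac = (ti - t[j]) / (t[j + 1] - t[j]);
            out[i] = x[j] + frac * (x[j + 1] - x[j]);
        }
        return out;
    }

    /** Central-difference derivative on the now-uniform samples. */
    static double[] derivative(double[] x, double dt) {
        double[] d = new double[x.length];
        for (int i = 1; i < x.length - 1; i++)
            d[i] = (x[i + 1] - x[i - 1]) / (2 * dt);
        d[0] = (x[1] - x[0]) / dt;                              // one-sided at the edges
        d[x.length - 1] = (x[x.length - 1] - x[x.length - 2]) / dt;
        return d;
    }

    public static void main(String[] args) {
        double[] t = {0.0, 0.1, 0.15, 0.25}; // irregular timestamps (s)
        double[] x = {0.0, 0.2, 0.3, 0.5};   // positions (m) of an object moving at 2 m/s
        double[] u = resample(t, x, 0.05, 5);
        double[] v = derivative(u, 0.05);
        System.out.println(v[2]); // ≈ 2.0 m/s
    }
}
```

Applying `derivative` once to resampled positions gives velocity, and applying it again gives acceleration, which is why the uniform grid matters: finite differences over unequal intervals would amplify the timing noise.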
Accomplishments that we're proud of
Our group is proud of completing a demo program for our idea. We are primarily proud of it because of its potential to improve learning environments around the world. With the capability and accessibility of the phone, VisionMotion can be used in place of standard physics tools such as LoggerPro and ultrasonic sensors. With this phone app, users around the world, regardless of geographic region or finances, will be able to enhance their learning environment.
What we learned
This was our first time working with Android Studio and OpenCV, and we learned a lot about using Activities and manipulating matrices while getting a working application up and running.
Our group also learned many skills this weekend, from time management to teamwork and collaboration. The exhilarating experience at THacks 2 exposed the four of us to an unfamiliar environment where we were constantly improvising, learning, and improving.
What's next for VisionMotion
The possibilities for VisionMotion are endless; our goal is to replace standard high school physics lab equipment with the built-in capabilities of the standard smartphone. Our first improvements in mind are measurement accuracy and human-factor accommodations such as ergonomic button placement and time optimization. Thereafter, we aim to measure values outside the realm of kinematics, such as light waves, sound, and dynamics.