About the Project
Inspiration
Creating an easily accessible virtual learning environment to teach STEM concepts to young students.
Students around the world cannot get the learning tools they need: financial constraints and learning difficulties make physical, expensive learning tools hard for many parents and children to obtain. This perpetuates wealth and racial inequality in the United States and around the world. We wanted to use VR to help bridge this gap by giving everyone access to the best possible learning experience.
What it does
Uses hand-tracking to manipulate objects in augmented reality to teach STEM concepts.
EduVR is an app that creates a virtual learning environment for teaching STEM concepts to young students. The app would contain multiple stations, each covering a different subject (such as chemistry, physics, math, and electronics), that let users interact with objects generated in augmented reality. The app uses computer-vision hand-tracking to move and manipulate those objects. In the demo's electronics station, for example, the user can assemble a valid circuit from basic components.
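The "valid circuit" check for a station like this can be modeled as a small graph problem. The sketch below is purely illustrative, assuming components are edges between numbered terminals and that "valid" means the battery's terminals are joined by a closed loop containing at least one load (so a bare wire loop counts as a short, not a circuit); the component names and validity rule are our assumptions, not the app's actual implementation.

```python
from collections import defaultdict

LOADS = {"bulb", "resistor"}  # components that do useful work

def is_valid_circuit(components):
    """components: list of (kind, terminal_a, terminal_b) tuples,
    e.g. ("battery", 0, 1). Valid = a path of other components
    links the battery's terminals and includes at least one load."""
    battery = next((c for c in components if c[0] == "battery"), None)
    if battery is None:
        return False
    graph = defaultdict(list)  # terminal -> [(neighbor, kind)]
    for comp in components:
        if comp is battery:
            continue
        kind, a, b = comp
        graph[a].append((b, kind))
        graph[b].append((a, kind))
    # DFS over (terminal, have-we-passed-a-load) states.
    _, start, goal = battery
    stack = [(start, False)]
    visited = set()
    while stack:
        node, seen_load = stack.pop()
        if (node, seen_load) in visited:
            continue
        visited.add((node, seen_load))
        if node == goal and seen_load:
            return True
        for nbr, kind in graph[node]:
            stack.append((nbr, seen_load or kind in LOADS))
    return False
```

A battery-wire-bulb-wire loop passes, while a bare battery-wire loop (a short circuit) or an open chain fails.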
How we built it
Utilizing OpenCV and Python, integrated with Unity, to accomplish hand-tracking through single-POV computer vision.
We started by experimenting with MR technology, trying to understand the Unity interface and using libraries such as MediaPipe in Python and ARKit in C#. After homing in on the best library for our use case, we developed scripts that interface Python with Unity's C#, combining Python's rapid development cycle with Unity's graphics. Then, after much debugging and testing, we focused on the hand-tracking technology and a game demo that would introduce young kids to STEM concepts.
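A common pattern for this kind of Python-to-Unity bridge is to have the Python script push hand-landmark coordinates to a Unity `UdpClient` listener over a local UDP socket. The sketch below shows that serialization side; the port number and JSON message shape are our illustrative choices, not necessarily what the team used, and the MediaPipe call is shown only as a comment.

```python
import json
import socket

def pack_landmarks(landmarks):
    """Flatten 21 (x, y, z) hand landmarks into a compact JSON
    message a Unity-side listener can parse. Coordinates are
    MediaPipe-style normalized values in [0, 1]."""
    flat = [round(v, 4) for point in landmarks for v in point]
    return json.dumps({"hand": flat}).encode("utf-8")

def send_landmarks(landmarks, host="127.0.0.1", port=5065):
    """Fire-and-forget UDP send; Unity listens on the same port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(pack_landmarks(landmarks), (host, port))
    sock.close()

# In the real loop, `landmarks` would come from MediaPipe, roughly:
#   results = hands.process(rgb_frame)  # mp.solutions.hands.Hands()
#   landmarks = [(lm.x, lm.y, lm.z)
#                for lm in results.multi_hand_landmarks[0].landmark]
```

UDP suits this use case because a dropped frame of landmarks is harmless: the next frame arrives ~33 ms later anyway.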
Challenges we ran into
Multiple pivots during ideation. Mobile AR libraries for high-resolution hand tracking were difficult to implement (ARCore limitations and ARKit constraints).
Initially, we tried to build an ARKit/ARCore solution that performed all the processing locally on the phone. We ran into many challenges learning Unity, since most of our team had no prior Unity experience and the learning curve was steep for a short hackathon.
We pivoted to a webcam solution hosted on a computer, which was more tenable, and tried many ways of streaming or otherwise hosting the app from a phone to get that native mixed-reality experience. But there are genuinely no simple answers to streaming, analyzing, and returning high-definition video in real time. We attempted a website that collected the user's camera feed and periodically uploaded chunks of video to a backend that resolved intent with OpenCV, but the latency was far too high to be feasible or intuitive.
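A rough back-of-the-envelope budget shows why per-chunk uploads break down for hand tracking. All the stage timings below are illustrative estimates we chose for the sketch, not measurements from the project:

```python
def round_trip_ms(capture=33, encode=30, upload=120,
                  inference=25, download=40, render=16):
    """Sum of per-frame stages (milliseconds) for a
    phone -> backend -> phone pipeline. Every default here is an
    assumed ballpark figure, not a measured value."""
    return capture + encode + upload + inference + download + render

# Under these assumptions the loop takes 264 ms per frame, i.e.
# under 4 fps of effective feedback, versus the ~30 fps (about
# 33 ms per frame) that a responsive hand-tracking UI needs.
budget = round_trip_ms()
```

The network legs dominate, which is why keeping capture and inference on the same machine (the webcam solution) was the workable path.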
Accomplishments that we're proud of
Easy access to high-resolution hand tracking with just a single camera. Providing an experience comparable to full headsets using only a device found in most households. Python and Unity integration.
Having had little to no AR/VR experience, and working with tools like Unity and languages like C#, we are very proud to have learned so much about the AR/VR development lifecycle and the different tools that can be used to achieve compelling effects for users. We realized great potential and worked hard down several avenues to solve different kinds of problems. We are proud to have built a working solution for virtual puzzle- and problem-solving that, in a perfect world, would require zero setup and very few resources.
We are especially proud of the hand-tracking aspect of our app: we worked hard to make it stand out by not relying on the expensive controllers typically used with VR headsets. Accessibility was really important to us, and our mission of bringing important learning experiences to children all around the world could not be realized without something that could eventually run on a mobile phone, without the need for things like Oculus controllers.
What we learned
Gained insights into the challenges of creating dynamic virtual models and systems for interaction.
Although experienced with software methods and some programming languages, our biggest learning curve was cross-platform development. On one hand, juggling numerous platforms (iOS, Android, Windows, and macOS) and Python libraries that aren't supported in Unity's C# forced us as a team to collaborate well and properly delegate tasks. Project management thus became a large part of our progression, as each team member needed to keep in mind the bigger picture of how the team was moving forward as a collective. On the other hand, mobile-app development proved difficult, since processing power and Python interpreters are not readily available on devices of that scale. Working on mobile development in parallel with Python/desktop development proved unmanageable in the time we had, but it facilitated some pretty fruitful brainstorming and deep dives into streaming methodologies, code-porting techniques, and even some compression methods. Overall, it definitely taught us a lot about effective teamwork, time allocation, and multi-user code bases, and about having fun with it! (XR is pretty cool stuff.)
What's next for EduVR
Increasing mobile integration and improving the efficiency of hand tracking. Scaling the app to a whole curriculum of interactive concepts and incorporating non-STEM stations.
The next step for EduVR is mobile development. We spent a good bit of time fleshing out prototypes and potential solutions for porting our hand-tracking technology to a mobile app. With more time and resources, we hope to make this a reality, focusing on a more accessible way to create and use MR applications in a world where MR/VR/XR/AR experiences are becoming foundational, and will be revolutionary in education.