Welcome to Sensei Hub
Education is a vital part of empowering refugee children to survive crises and build better lives.
Iman completes her test from today's lesson
Teacher Tine captures Iman's test in the Sensei Hub app
Sensei Hub app records the result accurately and logs it against her student profile
All the students of the class have profiles
All their results are also captured and archived
When the teacher is within wifi range, the app prompts them to archive to the SenseiHub.online repository
The teacher can log in and see the archive of students they have taught to date
This archive keeps clear and up-to-date records of each student's progress
Inspiration
2016 was a year of so much shake-up; we all need to do what we can to give back and bring our skills to the global issues around us. A global crisis like refugee displacement is undeniable, and something we wanted to put our energy into for this hackathon. Education is crucial in empowering and enabling people to make their way in life. When we learnt first-hand from the Techfugee team that the teachers of refugee children were struggling to centralise, track and measure their transitory students' progress, we wanted to tackle this near and present problem.
What it does
Sensei Hub provides a simple mobile capture app that photographs and records student test papers. Computer vision (powered by IBM Watson) reads the test paper results and instantly records them against the student and test IDs, all from the one photograph. A facial recognition feature is also available for when a student doesn't have their student ID. The data is stored locally on the teacher's own mobile device until they are within wifi range, when an upload to the SenseiHub.online repository can be completed and the test data securely archived.
How we built it
We split the team into mobile and backend and started building out the app. The mobile app was built with native Android code, using open source libraries wherever possible. We used Retrofit for communicating with the API and Gson for parsing the responses into POJOs. We took the simple approach to image capture by using the native camera app. After capture we uploaded the images to S3 for easy access from the API. Image recognition was done on the backend, which calls the IBM Watson API for image analysis. We preloaded Watson with training sets of correct/wrong images so the AI had something to play with.
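As a rough sketch of the backend grading step, here is how a Watson-style visual classifier response might be reduced to a single result. The `correct`/`wrong` class names mirror our training labels, but the JSON shape used here is an assumption for illustration, not the real Watson payload.

```python
def grade_from_classification(response):
    """Pick the highest-confidence class from a Watson-style classifier response.

    Assumes a response shaped like:
    {"images": [{"classifiers": [{"classes": [{"class": ..., "score": ...}, ...]}]}]}
    (an illustrative approximation of a visual-classifier payload).
    """
    classes = response["images"][0]["classifiers"][0]["classes"]
    best = max(classes, key=lambda c: c["score"])
    return best["class"], best["score"]
```

The returned label and score would then be written against the student and test IDs extracted from the same photograph.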
Challenges we ran into
- we hit some tech challenges with IBM Watson, where we had to retry the API calls several times after a few timeouts
- we also had some difficulties merging all the code together
- facial recognition proved to be less accurate than expected
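The retry workaround for the Watson timeouts can be sketched like this; the helper name, attempt count and delay values are illustrative, not what we actually shipped.

```python
import time


def call_with_retries(fn, attempts=3, delay=1.0):
    """Retry a flaky API call with a simple linear backoff (illustrative sketch)."""
    for attempt in range(attempts):
        try:
            return fn()
        except TimeoutError:
            # Give up only once the final attempt has also timed out.
            if attempt == attempts - 1:
                raise
            time.sleep(delay * (attempt + 1))
```

Wrapping each Watson request in a helper like this kept the demo flowing even when individual calls timed out.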
Accomplishments that we're proud of
The inclusion of facial recognition adds another layer of identity support, especially if students have been moving around and might be jumping in and out of different classes, with different teachers, in different locations. We would love to contribute to being able to sync up these displaced kids - so that their education can follow them wherever they are!
What we learned
- we learned to work as a team,
- to work hard and have fun through an all-nighter,
- and gained some new tech insights into IBM Watson, Amazon Alexa and Cisco.
What's next for Sensei Hub
We are excited to support the implementation of this mobile app and online repository for use in the field.