Inspiration

Technology is often seen as a barrier that keeps children from going out to the playground. Our intention is to flip that: use technology to draw kids into interacting with their environment and to help them learn along the way. Our project lets kids view the world through the kaleidoscope of our app.

What it does

The app enables kids to:

  • Learn everyday objects by capturing them with the camera.
  • Play within an outdoor area defined by their parents.
  • Name the pictures shown to them out loud.

How we built it

  • Apple's built-in Core ML framework detects the objects kids scan (a rough sketch follows this list).
  • ARKit and Apple's location APIs navigate kids to destinations set by their parents.
  • Speech recognition transcribes the kids' spoken answers to text (also sketched below).
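
A minimal sketch of the object-detection step, assuming an image-classification model such as MobileNetV2 is bundled with the app (the model and function names here are placeholders, not our exact implementation):

```swift
import CoreML
import Vision

// Classify a captured camera frame with a bundled Core ML model via Vision.
// "MobileNetV2" stands in for whatever .mlmodel the app actually ships with.
func classifyObject(in pixelBuffer: CVPixelBuffer, completion: @escaping (String?) -> Void) {
    guard let model = try? VNCoreMLModel(for: MobileNetV2().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Hand back the label of the top classification, if any.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

And the speech-to-text step, roughly, using Apple's Speech framework (assuming microphone and speech-recognition permissions have already been granted):

```swift
import Speech

// Transcribe a short recording of the child naming what they see.
func transcribe(audioURL: URL, completion: @escaping (String?) -> Void) {
    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    _ = SFSpeechRecognizer()?.recognitionTask(with: request) { result, _ in
        guard let result = result, result.isFinal else { return }
        completion(result.bestTranscription.formattedString)
    }
}
```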

Challenges we ran into

  • The AR concepts took a lot of time to understand.
  • Getting the Euler angle values right for placing objects was confusing.

Accomplishments that we're proud of

  • Built a minimum viable product in less than 24 hours.
  • Used open-source SDKs instead of stitching together various third-party SDKs.
  • Didn't sleep the entire night and still rocking :P
  • One of our team members downed 3 Red Bulls, 2 Monsters, 2 Cokes, and 6 bottles of water to sustain the sleepless night.

What we learned

  • ARKit concepts.
  • The Business Model Canvas.

What's next for Kaleidoscope

  • Create a web app for parents to track their kids' progress and locate them using Estimote beacons.
  • Add features that encourage kids to interact with their surroundings, e.g. asking them to scan objects around their room.
