Inspiration

I was inspired by the emotional attachment people were able to form with Tamagotchi pets and wanted to recreate a modern version of those pets in augmented reality. Unlike most AR apps, Airpets accesses the AR camera feed to give the pet "sight".

What it does

You can:

  • Feed the pet batteries to charge him
  • Make him dance
  • If a bag appears in the camera feed, he will compliment it and wear one of his own
  • Play telepathic fetch with him
  • Customize his hats
  • Have him follow you around in the real world
  • Have conversations with him

Nuances:

  • Dynamic lighting and shaders adapt to reflect real-world lighting
  • He always turns to look up at you
  • Upon login, the pet runs up to you, excited to see you again
  • Various achievements can be unlocked, e.g. charging the robot for the first time
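A minimal sketch of how the real-world lighting adaptation could be wired up using ARKit's per-frame light estimation (the `sceneView` and `ambientLightNode` names here are hypothetical, not taken from the actual project):

```swift
import ARKit
import SceneKit

// Hypothetical sketch: each frame, read ARKit's light estimate and
// scale the scene's ambient light so the pet matches the room.
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let frame = sceneView.session.currentFrame,
              let estimate = frame.lightEstimate else { return }

        // ambientIntensity is roughly 1000 lumens in a well-lit scene.
        ambientLightNode.light?.intensity = estimate.ambientIntensity
        // Match the color temperature (in kelvin) of the real-world light.
        ambientLightNode.light?.temperature = estimate.ambientColorTemperature
    }
}
```

With physically based materials on the pet's model, this is enough for the robot to dim and warm up along with the room's lighting.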

How I built it

I used Swift, Objective-C, Photoshop, Mixamo, ARKit 3, and TensorFlow Lite.

Accomplishments that I'm proud of

Using a YOLO object-detection model to let the pet see real-world objects is the feature I'm most proud of. I love that I could add something that made the pet more realistic ... more like an actual pet in the real world. To my knowledge, it's an AR feature unique to this app.
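A rough sketch of how a YOLO model could run on the AR camera feed via the TensorFlow Lite Swift API (the `PetVision` class and the preprocessing step are hypothetical; the real app's pipeline may differ):

```swift
import TensorFlowLite

// Hypothetical sketch: run a YOLO TensorFlow Lite model on camera
// frames so the pet can "see" objects such as bags.
final class PetVision {
    private let interpreter: Interpreter

    init(modelPath: String) throws {
        interpreter = try Interpreter(modelPath: modelPath)
        try interpreter.allocateTensors()
    }

    // `rgbData` is assumed to be an ARKit camera frame already resized
    // and converted to the model's expected input (e.g. 416x416 RGB).
    func detect(rgbData: Data) throws -> Data {
        try interpreter.copy(rgbData, toInputAt: 0)
        try interpreter.invoke()
        // Raw YOLO output: bounding boxes plus class scores, still to be
        // decoded and filtered by confidence and non-max suppression.
        return try interpreter.output(at: 0).data
    }
}
```

Once the output is decoded, a detection whose class label is "handbag" (a standard COCO class) could trigger the pet's compliment-and-wear-a-bag behavior.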

What's Next

In the future, multiple people will be able to spawn their pets in the same AR space. Both owners will see both pets, and the pets will be able to interact with each other.

You will also be able to choose between different pet avatars and customize them so everyone can have a unique pet.

The pet will react to a wider variety of objects, e.g. sitting in real-world chairs or starting conversations based on what he sees, such as spotting an Xbox controller and asking about your favourite video games.

An AI chatbot will make the robot's conversation more dynamic and human. I want it to sound like you're talking to a real person, with the robot's responses reflecting whatever you say to it.
