Inspiration

One of our group members, Richie, has always wanted total creative freedom: the ability to draw on and immerse himself in the world around him. That idea never became reality, because he was always constrained by a fixed screen or canvas size that kept him from expressing himself fully. Traditional paintings and drawings are also hard to share with other people, since they must be physically displayed or hung, which makes them cumbersome to transport and puts them at risk of damage. DrawScape is Richie's concept of total creative freedom.

What it does

DrawScape lets you create whatever you want, no matter how much space you have (e.g., busy tourist sites) and no matter the location (e.g., places where you normally aren't allowed to draw). The Android app lets you hold your hand in front of the camera and draw in 3D with your fingertips. You can also collaborate on art: with your permission, other people can see and work on your creations.

How we built it

We built the application as a native Android augmented reality platform focused on real-time hand-tracked AR drawing, cloud-connected user profiles, and a responsive, modern mobile UI.

Development began with the Android/Kotlin stack to ensure direct access to device hardware, ARCore APIs, and optimized mobile performance. The application was structured using a modular Jetpack Compose architecture, allowing us to build reactive, state-driven screens for authentication, AR drawing, and user profile management while maintaining clean separation of concerns.
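As a rough illustration of that state-driven approach (the names and state shape here are hypothetical, not our exact code), a Compose screen can render purely as a function of a ViewModel's state flow:

```kotlin
// Hypothetical sketch of a state-driven Compose screen; names are illustrative.
import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import kotlinx.coroutines.flow.StateFlow

sealed interface AuthUiState {
    object SignedOut : AuthUiState
    data class SignedIn(val displayName: String) : AuthUiState
}

// Minimal contract the screen depends on (assumed for this sketch).
interface AuthViewModel {
    val uiState: StateFlow<AuthUiState>
    fun signIn()
}

@Composable
fun AuthScreen(viewModel: AuthViewModel) {
    // The UI is recomposed automatically whenever the state flow emits.
    val state by viewModel.uiState.collectAsState()
    Column {
        when (val s = state) {
            is AuthUiState.SignedOut -> Button(onClick = viewModel::signIn) { Text("Sign in") }
            is AuthUiState.SignedIn -> Text("Welcome, ${s.displayName}")
        }
    }
}
```

Keeping each screen a pure function of state like this is what let us separate the authentication, AR drawing, and profile screens cleanly.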

For augmented reality rendering, we integrated ARCore through SceneView/ARSceneView. This provided environmental tracking, camera pose estimation, and 3D scene management necessary for placing and rendering virtual drawing content in real-world space.
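To give a flavor of the camera-pose side of this (a hedged sketch using only the plain ARCore API; our actual rendering goes through SceneView and is omitted, and the 0.5 m drawing distance is an illustrative assumption):

```kotlin
// Hedged sketch: reading the tracked camera pose from ARCore each frame and
// deriving a world-space point in front of the camera as a stroke vertex.
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun onDrawFrame(session: Session, onPoint: (FloatArray) -> Unit) {
    val frame: Frame = session.update()  // latest camera image + pose estimate
    val camera = frame.camera
    if (camera.trackingState == TrackingState.TRACKING) {
        // Transform a point 0.5 m in front of the camera into world coordinates.
        val point = camera.pose.transformPoint(floatArrayOf(0f, 0f, -0.5f))
        onPoint(point)  // hand off to the renderer as a stroke vertex
    }
}
```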

To enable gesture-based drawing, we incorporated Google MediaPipe Tasks Vision for real-time hand tracking. Camera frames are processed locally on-device, where MediaPipe detects hand landmarks and translates finger movement into spatial drawing input within the AR scene. This pipeline allows users to draw in 3D space using natural hand gestures without requiring external hardware.
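A condensed sketch of that pipeline follows (the model asset name and callback wiring are assumptions for illustration, not our exact code; landmark index 8 is the index fingertip in MediaPipe's hand model):

```kotlin
// Hedged sketch: on-device hand tracking with MediaPipe Tasks Vision.
import android.content.Context
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.handlandmarker.HandLandmarker

fun buildHandLandmarker(context: Context, onTip: (x: Float, y: Float) -> Unit): HandLandmarker {
    val options = HandLandmarker.HandLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder().setModelAssetPath("hand_landmarker.task").build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)
        .setResultListener { result, _ ->
            // Landmark 8 = index fingertip; coordinates are normalized [0, 1].
            result.landmarks().firstOrNull()?.get(8)?.let { tip ->
                onTip(tip.x(), tip.y())
            }
        }
        .build()
    return HandLandmarker.createFromOptions(context, options)
}

// Per camera frame, feed the converted image into the landmarker:
//   landmarker.detectAsync(mpImage, frameTimestampMs)
```

The fingertip coordinates are then unprojected into the AR scene to become drawing input.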

User authentication and cloud data storage were implemented with Firebase. Firebase Authentication manages secure login and account creation, while Firestore stores user profile data, drawing metadata, and application state in the cloud for persistence across sessions.
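As a minimal sketch of that persistence path (the collection and field names are illustrative, not our actual schema):

```kotlin
// Hedged sketch: persisting drawing metadata for the signed-in user.
import com.google.firebase.auth.FirebaseAuth
import com.google.firebase.firestore.FirebaseFirestore

fun saveDrawingMetadata(title: String, strokeCount: Int) {
    // Only persist for an authenticated user.
    val uid = FirebaseAuth.getInstance().currentUser?.uid ?: return
    FirebaseFirestore.getInstance()
        .collection("users").document(uid)
        .collection("drawings").document()  // auto-generated drawing ID
        .set(
            mapOf(
                "title" to title,
                "strokes" to strokeCount,
                "createdAt" to System.currentTimeMillis(),
            )
        )
}
```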

Dependency injection was handled using Hilt to manage repositories, services, and application-level dependencies cleanly across the codebase. Kotlin Coroutines were used throughout the project to support asynchronous operations such as Firebase requests, camera processing, and ML inference without blocking the UI thread.
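The two pieces fit together roughly like this (a hedged sketch; the repository interface is illustrative, not our real one):

```kotlin
// Hedged sketch: a Hilt-injected ViewModel doing async work off the main thread.
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import dagger.hilt.android.lifecycle.HiltViewModel
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.launch
import javax.inject.Inject

interface ProfileRepository {
    suspend fun loadDisplayName(uid: String): String
}

@HiltViewModel
class ProfileViewModel @Inject constructor(
    private val repo: ProfileRepository,  // provided by a Hilt module
) : ViewModel() {
    private val _name = MutableStateFlow("")
    val name: StateFlow<String> = _name

    fun refresh(uid: String) {
        // The suspend call runs in a coroutine, never blocking the UI thread.
        viewModelScope.launch { _name.value = repo.loadDisplayName(uid) }
    }
}
```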

The application was developed and tested in Android Studio using Gradle for dependency/build management. Iterative testing was performed on Android emulators and physical ARCore-compatible devices to validate AR tracking accuracy, hand-tracking responsiveness, and UI behavior under real-world conditions.

Overall, the system combines native Android development, real-time computer vision, augmented reality rendering, and cloud services into a unified mobile application designed for responsive and immersive AR interaction.

Challenges we ran into

  • AR and camera integration: we spent at least 90% of the hackathon just getting these features to run without crashing the application. Configuring the dependencies is a nightmare we would not wish on anybody.
  • Learning Kotlin and Android development: neither Hanze nor I (Jadon) had ever worked on an Android application before, which made the experience incredibly stressful, especially since we had to learn entirely new frameworks in a language we had never seen.
  • Building the main application's user interface (UI) was incredibly difficult, as we had never used Jetpack Compose before.
  • We also hit challenges designing the landing page. We initially felt confident recreating historical landmarks by hand in three.js, but the results looked low quality. After a night of trial and error experimenting with lighting and other designs, we switched to CC0 GLB files as the landmark models, since they were much higher quality than anything we could build ourselves.

Accomplishments that we're proud of

This was the team's first time doing Android development and integration, and debugging was one of our main issues, consuming hours of our time. With practice we became noticeably more proficient, which made us more efficient and ultimately got us to a successful build.

What we learned

  • How to integrate AR and camera functionality in a mobile app, and how to build Android apps with Kotlin and the Android Studio IDE.
  • How to use three.js for the landing website, which transports viewers to different locations around the world to show that DrawScape is for everybody.
  • How to successfully scope and pitch a product, and how collaboration turns a diverse set of ideas into something that benefits everyone.

What's next for DrawScape AR

DrawScape AR's next steps are to increase engagement, expand our target audience, and branch into other genres. Our main target audiences are:

  • Travellers and tourists who love writing or leaving mementos on landmarks.
  • Street artists who enjoy creating graffiti in places where it would otherwise be illegal, such as historical sites.
  • Companies that want to place advertisements in popular destinations like Times Square, where physical billboards would be far more expensive.
  • Organizers of special events, who can use the app for special effects, augmented experiences, and games; for example, local artists who can't afford lighting rigs could use DrawScape's augmented-reality 3D effects instead.

DrawScape also wants to expand into genres like music: rather than images, new artists trying to build a following could, for a fee, pin their songs and name tags onto streets, so passersby can hear the music as they walk past and scan the marker to follow the artist on services like Spotify.
