Inspiration

Our project was inspired by our shared curiosity about climate change and a strong interest in augmented reality (AR) technology. We aimed to bridge our knowledge gaps around environmental sustainability by creating an interactive tool that visualizes how the physical spaces around us can be put to better educational and environmental use. The goal was to make learning about, and contributing to, environmental sustainability engaging and accessible through AR.

What it does

Our project overlays digital representations of vegetation onto the real world using augmented reality. When users view their surroundings through the AR interface, they see 3D models of trees and plants positioned in their actual physical space. The application also displays detailed information about the environmental benefits of cultivating each tree or plant shown, helping users understand the positive impact different plants can have on their surroundings.

How we built it

The Interface

We used XREAL glasses and the Unity editor to develop an Android application capable of rendering 3D digital assets. The interface lets users interact with and explore these models within their real-world environment.

The Text-to-Texture Pipeline

We chained two generative models to transform text prompts into digital assets. First, OpenAI's DALL·E 3 converts a text description into an image. Then, TripoSR (an open-source image-to-3D model available on Hugging Face) reconstructs that image into a textured 3D mesh suitable for placement in AR space. This pipeline keeps the digital representations both accurate and visually appealing.
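The two-stage pipeline can be sketched roughly as below. This is a minimal illustration, not our exact implementation: the `client` object is assumed to be an OpenAI API client, the prompt wording is invented for the example, and the `image_to_textured_mesh` step stands in for invoking a TripoSR checkout, whose exact flags depend on the version used.

```python
# Sketch of the text-to-texture pipeline: DALL-E 3 (text -> image),
# then TripoSR (image -> textured 3D mesh). Hypothetical glue code.
import base64
import subprocess


def build_image_prompt(plant_name: str) -> str:
    """Compose the text prompt sent to the image model.

    A plain background and a centered subject make the later
    image-to-3D reconstruction step more reliable.
    """
    return (
        f"A single {plant_name}, centered, on a plain white background, "
        "soft studio lighting, full view of the whole plant"
    )


def generate_plant_image(client, plant_name: str) -> bytes:
    """Call DALL-E 3 via the OpenAI Images API and return raw PNG bytes."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=build_image_prompt(plant_name),
        size="1024x1024",
        response_format="b64_json",
    )
    return base64.b64decode(response.data[0].b64_json)


def image_to_textured_mesh(image_path: str, output_dir: str) -> str:
    """Placeholder for the TripoSR step (image -> textured mesh).

    In practice this shells out to TripoSR's run script; flags vary
    by checkout, so treat this as a sketch, not a working command.
    """
    subprocess.run(
        ["python", "run.py", image_path, "--output-dir", output_dir],
        check=True,
    )
    return f"{output_dir}/mesh.obj"
```

Keeping the prompt template in its own function made it easy to iterate on wording that produced reconstruction-friendly images.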

The Information Pipeline

We integrated OpenAI's GPT-4 model to process text prompts and generate relevant information about each tree or plant. This information includes the environmental benefits of planting the vegetation, tailored to the specific location and context provided in the prompt. This feature educates users on how particular plants can improve their environment.
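A minimal sketch of this step is shown below, using the OpenAI chat completions API. The system prompt, temperature, and prompt wording are assumptions chosen for illustration rather than our exact configuration.

```python
# Sketch of the information pipeline: GPT-4 generates location-aware
# environmental-benefit text for a given plant. Prompt wording is hypothetical.

def build_benefits_prompt(plant_name: str, location: str) -> str:
    """Compose the user prompt asking GPT-4 for tailored benefit info."""
    return (
        f"Explain the environmental benefits of planting a {plant_name} "
        f"in {location}. Cover carbon capture, shade and cooling, and "
        "local biodiversity, in 3-4 short sentences for an AR overlay."
    )


def get_plant_benefits(client, plant_name: str, location: str) -> str:
    """Query GPT-4 and return the generated benefit text."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You are a concise environmental science guide."},
            {"role": "user",
             "content": build_benefits_prompt(plant_name, location)},
        ],
        temperature=0.3,  # low temperature keeps the facts consistent
    )
    return response.choices[0].message.content
```

Passing the user's location into the prompt is what tailors the output to the specific context mentioned above.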

Challenges we ran into

We encountered several challenges throughout the development process. One significant hurdle was selecting and configuring different cloud services to host our pipelines. Additionally, integrating the NRSDK with Unity to create a seamless AR experience proved complex. We also faced latency issues due to the resource-intensive nature of the models we used, which impacted the responsiveness of the pipelines.

Accomplishments that we're proud of

We are proud of successfully getting our pipelines to function and improving the interface's performance, even though it is not yet fully optimized. Overcoming the technical challenges and seeing our concept come to life has been a significant achievement.

What we learned

Through this project, we gained valuable insights into deploying machine learning models and implementing plane detection in augmented reality. These skills are crucial for further advancements in AR and AI for Climate Tech.

What's next for GreenSpace

Looking ahead, we aim to develop a comprehensive application that serves as a digital storefront for plants. This platform will allow users to create digital assets in real-time and gain a deeper understanding of how to utilize their environment more effectively. Beyond adding plants, the application will offer suggestions on transforming personal spaces into more efficient systems that reuse resources and reduce waste. Our ultimate goal is to empower users to make informed decisions that benefit both their immediate surroundings and the broader environment.
