Inspiration
Landscaping shouldn't be a guessing game. Every year, homeowners waste millions of gallons of water on lawns and plants that aren't meant to survive in their local environment. According to the Environmental Protection Agency (EPA), up to 70% of residential water use in arid regions goes to outdoor irrigation, representing a massive opportunity for conservation.
We realized there is a perceived trade-off between property value and environmental health: most people think they have to choose one or the other. In reality, sustainable properties reduce utility bills and maintenance costs, saving 25-50% in energy and 10-40% in water.
We built GreenScape to prove you can have both. By combining real-time climate data with AI-driven plant science, we wanted to give everyone the tools to turn their backyard into a sustainable asset. Our inspiration was simple: make it effortless for anyone to build a yard that saves money, saves water, and grows their property value.
What it does
GreenScape is a personal environmental consultant in your pocket. By simply snapping a photo of your yard, the app uses a custom AI pipeline to analyze your specific soil conditions and cross-references them with local climate data and USDA hardiness zones.
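The climate lookup behind this can be as simple as a single forecast request to Open-Meteo, which requires no API key. A minimal sketch of how such a request could be built (the specific daily variables chosen here are illustrative assumptions, not necessarily GreenScape's actual query):

```typescript
// Build an Open-Meteo forecast URL for a given location.
// The daily variables requested (precipitation, max temperature)
// are illustrative; Open-Meteo supports many others.
function buildForecastUrl(latitude: number, longitude: number, days = 7): string {
  const params = new URLSearchParams({
    latitude: latitude.toString(),
    longitude: longitude.toString(),
    daily: "precipitation_sum,temperature_2m_max",
    forecast_days: days.toString(),
  });
  return `https://api.open-meteo.com/v1/forecast?${params}`;
}

// Usage: fetch(buildForecastUrl(40.0, -105.3)).then((r) => r.json())
```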
It doesn't just suggest plants; it builds a data-driven landscape strategy. Users can:
Visualize in AR: Use an immersive Augmented Reality interface to "test-drive" 3D plant models in their actual space.
Track Impact: See real-time metrics for every plant placed, including CO2 sequestration, water savings, and nitrogen-fixing potential.
Estimate Value: View a projected increase in home property value (ranging from 3-20%+) based on the density and quality of the landscaping.
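Under the hood, yard-level impact tracking presumably reduces to summing per-plant figures as plants are placed. A hedged sketch of what that aggregation might look like (the field names and units are our illustrative assumptions, not GreenScape's real data model):

```typescript
// Illustrative per-plant impact record; field names are assumptions.
interface PlantImpact {
  co2KgPerYear: number;            // estimated CO2 sequestered annually
  waterSavedLitersPerYear: number; // savings vs. a turf-grass baseline
  nitrogenFixing: boolean;         // whether the species fixes nitrogen
}

// Sum the placed plants' metrics into yard-level totals.
function aggregateImpact(plants: PlantImpact[]) {
  return plants.reduce(
    (acc, p) => ({
      co2KgPerYear: acc.co2KgPerYear + p.co2KgPerYear,
      waterSavedLitersPerYear: acc.waterSavedLitersPerYear + p.waterSavedLitersPerYear,
      nitrogenFixers: acc.nitrogenFixers + (p.nitrogenFixing ? 1 : 0),
    }),
    { co2KgPerYear: 0, waterSavedLitersPerYear: 0, nitrogenFixers: 0 }
  );
}
```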
How we built it
GreenScape is a full-stack mobile application built with Expo (React Native). The core intelligence lies in a custom RAG (Retrieval-Augmented Generation) pipeline:
- Data Engineering: We scraped open-source plant datasets from Kaggle and generated vector embeddings using text-embedding-ada-002.
- Vector Database: These embeddings are stored in Supabase using pgvector, allowing for semantic search of native, low-water, and soil-friendly species.
- Dynamic Context: Static environmental information about the site is gathered with the Vision API, while the AI pulls real-time climate data and multi-day forecasts from the Open-Meteo API to dynamically prioritize sustainable plant choices.
- The Backend: Logic is handled via Supabase Edge Functions for low-latency AI responses.
- AR Visualization: We integrated ARKit to create an immersive AR experience where users can move, scale, and "multi-plant" an infinite variety of 3D assets.
- Sustainability Score: We implemented a deterministic, scenario-driven weighted-matrix score (0-100), alongside individual metrics for CO2 sequestration, water savings, heat reduction, and nitrogen fixation, to help homeowners assess their climate impact.
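A deterministic weighted-matrix score like this typically reduces to a clamped, weighted sum of normalized metrics scaled to 0-100. A sketch under that assumption (the weights and normalization caps below are invented for illustration, not GreenScape's actual values):

```typescript
// Deterministic weighted score: each metric is normalized to 0..1
// against a "full marks" cap, multiplied by a scenario weight, and
// the weighted sum is scaled to 0-100. Weights and caps are guesses.
type Metrics = { co2: number; waterSaving: number; heatReduction: number; nitrogen: number };

const WEIGHTS: Metrics = { co2: 0.3, waterSaving: 0.4, heatReduction: 0.2, nitrogen: 0.1 };
const CAPS: Metrics = { co2: 100, waterSaving: 5000, heatReduction: 10, nitrogen: 20 };

function sustainabilityScore(m: Metrics): number {
  const clamp01 = (x: number) => Math.min(1, Math.max(0, x));
  const weighted = (Object.keys(WEIGHTS) as (keyof Metrics)[]).reduce(
    (sum, k) => sum + WEIGHTS[k] * clamp01(m[k] / CAPS[k]),
    0
  );
  return Math.round(weighted * 100);
}
```

Because the weights sum to 1 and every metric is clamped, the score is always in 0-100 and the same inputs always produce the same output.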
Challenges we ran into
Our biggest challenge was implementing AR. We initially spent over 10 hours developing for Android to avoid bulky dependencies and lengthy iOS verification processes, only to find that hardware fragmentation made the AR packages unstable. This forced a mid-project pivot to iOS, requiring us to rewrite core code, configure new environment dependencies, and navigate strict iOS permissions. Once the app was running, we manually tuned and optimized our 3D models to ensure they rendered smoothly across different iPhone generations without crashing.
Additionally, we faced the common hurdle of merge conflicts. Since this was our first time tackling mobile development as a team, we had to rapidly master version control and collaborative debugging in a React Native environment to keep the project moving. These challenges pushed us to become more adaptable developers and a tighter team!
Accomplishments that we're proud of
AR and mobile development were new to our team, and we're proud of how seamlessly we implemented them. We're also happy with the technical synergy between our AI backend and the AR frontend, which creates a smooth user flow. Lastly, we're really proud to have engineered a high-performance RAG pipeline that wove together distinct AI components to deliver accurate, environmentally conscious landscaping strategies directly to the user's screen.
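At its core, the retrieval step of that RAG pipeline is a nearest-neighbor search over plant embeddings. In production this ranking is done by pgvector's cosine-distance operator inside Postgres, but the math it ranks by can be sketched locally (a simplified illustration, not the app's actual query path):

```typescript
// Cosine similarity between two embedding vectors; pgvector's
// cosine distance is 1 minus this value.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate plant embeddings against a query embedding and
// return the indices of the top-k most similar rows.
function topK(query: number[], rows: number[][], k: number): number[] {
  return rows
    .map((row, i) => ({ i, sim: cosineSimilarity(query, row) }))
    .sort((x, y) => y.sim - x.sim)
    .slice(0, k)
    .map((r) => r.i);
}
```

The retrieved rows are then passed to the language model as context, which is what makes the generation "retrieval-augmented."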
What we learned
We learned a lot about mobile app development, different AI functionalities, and how augmented reality can be implemented. We also learned a lot about each other, as the four of us were from three different schools and came from very different backgrounds.
What's next for GreenScape
In the future, we hope to optimize the application so it delivers results that are both faster and more accurate. With more time, we would train custom machine learning models to improve soil-classification accuracy and introduce advanced personalization features that let users tailor results to their specific aesthetic and functional preferences.
Built With
- arcore
- arkit
- expo-router
- expo.io
- gpt-5.2
- javascript
- open-meteo-api
- pgvector
- postgresql
- react-native
- sql
- supabase
- tailwind-css
- text-embedding-ada-002
- typescript
- viroreact
- zustand