Inspiration
Inspired by work on firmware and a joke about "AI-everything", like compilers.
What it does
Compiles and debloats C projects by intelligently generating missing dependencies only where needed, with enhanced context and web research abilities.
How we built it
We coupled the Gemini API with MinGW and Brave Search web retrieval in a scalable Docker container. MinGW attempts to compile the file and, upon failure, passes the code and compiler errors to Gemini. Gemini then iteratively gathers additional information from the internet, fixes the code, and recompiles the file. The retrieval system leverages the language abilities of the LLM to create search queries, select relevant websites, visit them, synthesize the results, and use those results to inform better code fixes before recompilation.
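The compile-fix loop above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the function names (`compile_source`, `repair_loop`, `fix_fn`) and the use of Python's `subprocess` to invoke the compiler are assumptions for the sake of the example; in the real system `fix_fn` would call Gemini with web-retrieval context.

```python
import os
import subprocess

def compile_source(path, compiler=("gcc",)):
    """Try to compile one C file; return (success, stderr text).
    In the real pipeline this would invoke MinGW's gcc."""
    out = os.path.splitext(path)[0] + ".out"
    proc = subprocess.run([*compiler, path, "-o", out],
                          capture_output=True, text=True)
    return proc.returncode == 0, proc.stderr

def repair_loop(path, fix_fn, compile_fn=compile_source, max_rounds=5):
    """Iteratively compile; on failure, hand (code, errors) to fix_fn
    (the LLM stand-in), write back its patch, and retry."""
    for _ in range(max_rounds):
        ok, errors = compile_fn(path)
        if ok:
            return True
        with open(path) as f:
            code = f.read()
        with open(path, "w") as f:
            f.write(fix_fn(code, errors))
    ok, _ = compile_fn(path)
    return ok
```

Swapping `compile_fn` and `fix_fn` for stubs makes the loop easy to test without a compiler or an API key, which is also a convenient seam for the asynchronous rewrite mentioned below.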
Challenges we ran into
A major challenge was balancing scalability against the limited time available in the hackathon. Docker usage was a sticking point, as some team members were using Docker for the first time, which caused brief delays. An ideal project would use microservices and asynchronous requests, but we chose single-threaded, synchronous programming due to time constraints. Certain features, like finite-state-machine (FSM) based decision-making for the retrieval system, were delayed in favor of reaching a minimum viable solution as a proof of concept.
Accomplishments that we're proud of
We are proud to have created an AI programming and compiling agent with web retrieval. We are especially happy that it is expandable in many ways, both in processing speed and in project scope.
What we learned
We learned how to set up AI agents and API communication between services.
What's next for buildmate
We would add a vector database for improved local retrieval and for building context on larger projects. We would also convert the code to an asynchronous, microservice-oriented architecture, and improve LLM performance using LoRA fine-tuning.