Inspiration

I got this idea as Hurricane Milton was causing devastation across Florida. The inspiration behind Autonomous AI Society stems from the need for faster, more efficient, and autonomous systems that can make critical decisions during disasters. With sponsors like Fetch.ai, Groq, Deepgram, Hyperbolic, and Vapi providing powerful tools, I envisioned an intelligent system of AI agents handling the full disaster response chain, from analyzing distress calls to dispatching drones and contacting rescue teams. The goal was to build an AI-driven solution that streamlines emergency responses, saves lives, and minimizes risks.

What it does

Autonomous AI Society is a fully autonomous multi-agent system that performs disaster response tasks in the following workflow:

  1. Distress Call Analysis: The system first analyzes distress calls using Deepgram for speech-to-text and Hume AI to score distress levels. Based on the analysis, the agent identifies the most urgent calls and the cities they come from.

  2. Drone Dispatch: The distress analyzer agent communicates with the drone agent (built using Fetch.ai) to dispatch drones to specific locations, assisting with flood and rescue operations.

  3. Human Detection: Drones capture aerial images, which are analyzed by the human detection agent using Hyperbolic's LLaMA Vision model to detect humans in distress. The agent provides a description and coordinates.

  4. Priority-Based Action: The drone results are displayed on a dashboard, ranked by priority using Groq. Higher-priority areas receive faster dispatches, with rankings updated dynamically.

  5. Rescue Call: The final agent, built using Vapi, places an emergency call to the rescue team. It uses instructions generated by Hyperbolic’s text model to give precise directions based on the detected individuals and their location.
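The priority-based dispatch in steps 1 and 4 can be illustrated as a simple ranking of calls by distress score before drones are sent out. This is a minimal sketch with made-up scores standing in for Hume AI's output and Groq's ranking, not the project's actual scoring code:

```python
from dataclasses import dataclass

@dataclass
class DistressCall:
    city: str
    transcript: str
    distress_score: float  # 0.0-1.0, illustrative stand-in for Hume AI's score

def rank_calls(calls):
    """Order calls so drones go to the most urgent city first."""
    return sorted(calls, key=lambda c: c.distress_score, reverse=True)

# Hypothetical example calls with fabricated scores.
calls = [
    DistressCall("Tampa", "water is rising fast", 0.92),
    DistressCall("Orlando", "power is out", 0.41),
    DistressCall("Fort Myers", "roof collapsed, trapped", 0.88),
]

for call in rank_calls(calls):
    print(call.city, call.distress_score)
```

In the real system the scores arrive continuously, so this ranking would be recomputed as new calls are analyzed.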

How I built it

The system consists of five agents, all built using Fetch.ai’s framework, allowing them to interact autonomously and make real-time decisions:

  • Request-sender agent sends the initial requests.
  • Distress analyzer agent uses Hume AI to analyze calls and Groq to generate dramatic messages.
  • Drone agent dispatches drones to designated areas based on the distress score.
  • Human detection agent uses Hyperbolic’s LLaMA Vision to process images and detect humans in danger.
  • Call rescue agent sends audio instructions using Deepgram’s TTS and Vapi for automated phone calls.
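The messages passed between these agents can be pictured as small typed payloads. Below is a simplified sketch using plain dataclasses in place of Fetch.ai's message models; the field names are illustrative, not the project's exact schema:

```python
from dataclasses import dataclass

# Simplified stand-ins for the messages flowing through the agent chain.

@dataclass
class DistressReport:   # distress analyzer -> drone agent
    city: str
    distress_score: float

@dataclass
class DroneImage:       # drone agent -> human detection agent
    lat: float
    lon: float
    image_path: str

@dataclass
class DetectionResult:  # human detection agent -> call rescue agent
    lat: float
    lon: float
    description: str

report = DistressReport(city="Tampa", distress_score=0.92)
print(report)
```

Keeping each hop a small, explicit message like this is what lets the agents run autonomously while still coordinating on a shared workflow.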

Challenges I ran into

  • Simulating drone movement on the Florida map: The lat_lon_to_pixel function converts latitude and longitude coordinates to pixel positions on the screen. The drone starts at the center of Florida, and its movement is calculated with trigonometry: the angle to the target city comes from math.atan2, and the drone steps towards the target using sin and cos. This allows placing cities and the drone accurately on the map.

  • Calibrating the map to the right coordinates: I had to manually experiment with increasing and decreasing the coordinates until cities landed in the right spots on the Florida map.

  • Coordinating AI agents: Getting agents to communicate effectively while working autonomously was a challenge.

  • Handling dynamic priorities: Ensuring real-time analysis and updating the priority of drone dispatch based on Groq's risk assessment was tricky.

  • Integration of multiple APIs: Each sponsor's tools had specific nuances, and integrating all of them smoothly, especially with Fetch.ai, required careful handling.
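The drone simulation described in the first challenge can be sketched as follows. The map bounding box, screen size, and city coordinates here are illustrative values, not the calibrated ones used in the project:

```python
import math

# Approximate bounding box for a Florida map image (illustrative values).
LAT_MAX, LAT_MIN = 31.0, 24.5    # north and south edges
LON_MIN, LON_MAX = -87.6, -80.0  # west and east edges
MAP_W, MAP_H = 800, 600          # screen size in pixels

def lat_lon_to_pixel(lat, lon):
    """Map latitude/longitude to (x, y) screen pixels (y grows downward)."""
    x = (lon - LON_MIN) / (LON_MAX - LON_MIN) * MAP_W
    y = (LAT_MAX - lat) / (LAT_MAX - LAT_MIN) * MAP_H
    return x, y

def step_towards(pos, target, speed=5.0):
    """Move one step towards the target using atan2 for the heading."""
    angle = math.atan2(target[1] - pos[1], target[0] - pos[0])
    return (pos[0] + speed * math.cos(angle),
            pos[1] + speed * math.sin(angle))

# Drone starts near the center of Florida, then heads to a target city.
drone = lat_lon_to_pixel(28.0, -82.0)
tampa = lat_lon_to_pixel(27.95, -82.46)
for _ in range(10):
    drone = step_towards(drone, tampa)
```

The bounding-box constants are exactly what had to be tuned by hand in the calibration step: shifting them moves every city marker on the map at once.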

Accomplishments that I am proud of

  • Successfully built an end-to-end autonomous system where AI agents can make intelligent decisions during a disaster, from distress call analysis to rescue actions.
  • Integrated cutting-edge technologies like Fetch.ai, Groq, Hyperbolic, Deepgram, and Vapi in a single project to create a highly functional and real-time response system.

What I learned

  • AI for disaster response: Building systems that leverage multimodal AI agents can significantly improve response times and decision-making in life-critical scenarios.
  • Cross-platform integration: I learned how to seamlessly integrate various tools, from vision AI to TTS to drone dispatch, using Fetch.ai and sponsor technologies.
  • Working with real-time data: Developing an autonomous system that processes data in real-time provided insights into handling complex workflows.

What's next for Autonomous AI Society

  • Scaling to more disasters: Expanding the system to handle other types of natural disasters like wildfires or earthquakes.
  • Edge deployment: Enabling drones and agents to run on the edge to reduce response times further.
  • Improved human detection: Enhancing human detection with more precise models to handle low-light or difficult visual conditions.
  • Expanded rescue communication: Integrating real-time communication with the victims themselves using Deepgram’s speech technology.
