Inspiration
If the future of warfare is autonomous mass, we need to be able to command and control large numbers of autonomous systems.
What it does
HYDRA takes a natural language command or piece of intel and digests it through its various heads: one identifies target locations, another identifies which drones are viable for the mission, and the final stage plans and executes the mission by communicating with the selected units over the MAVLink protocol.
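The final step above, turning a planner head's output into radio commands, can be sketched with pymavlink. This is a minimal illustration, not HYDRA's actual code: the mission dict shape and the UDP endpoint are assumptions made for the example.

```python
# Sketch: dispatching one planned waypoint to a drone over MAVLink.
# The mission dict keys and the SITL endpoint are illustrative assumptions.
MAV_CMD_NAV_WAYPOINT = 16  # command id from the MAVLink common message set


def waypoint_params(mission):
    """Map a planned waypoint to MAV_CMD_NAV_WAYPOINT's seven parameters."""
    return (
        0.0, 0.0, 0.0, 0.0,  # hold time, accept radius, pass radius, yaw
        mission["lat"], mission["lon"], mission["alt"],
    )


def dispatch(mission, endpoint="udp:127.0.0.1:14550"):
    """Send the waypoint to a (simulated) drone via pymavlink."""
    from pymavlink import mavutil  # deferred so the mapping is testable offline
    master = mavutil.mavlink_connection(endpoint)
    master.wait_heartbeat()  # block until the vehicle announces itself
    master.mav.command_long_send(
        master.target_system, master.target_component,
        MAV_CMD_NAV_WAYPOINT, 0,  # command id, confirmation
        *waypoint_params(mission),
    )
```

With a Gazebo/SITL vehicle listening on the assumed UDP port, `dispatch({"lat": 47.398, "lon": 8.546, "alt": 30.0})` would push a single waypoint command; QGroundControl can observe the same link.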
Challenges we ran into
LLM output data formatting. I had to set very strict guardrails around the desired output of each head to ensure a consistent data pipeline.
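One way such a guardrail can look is a strict parser that rejects any head reply deviating from a fixed JSON schema, so malformed output never enters the pipeline. The key names below are illustrative assumptions, not HYDRA's real schema:

```python
import json

# Sketch of the guardrail idea: a head must return a JSON object with
# exactly these keys, or its output is rejected (and the call retried).
TARGET_HEAD_KEYS = {"target_id", "lat", "lon", "confidence"}


def parse_head_output(raw: str) -> dict:
    """Strictly validate one head's reply; raise on any deviation."""
    data = json.loads(raw)  # must be valid JSON at all
    if not isinstance(data, dict):
        raise ValueError("head must return a JSON object")
    if set(data) != TARGET_HEAD_KEYS:
        raise ValueError(f"unexpected keys: {set(data) ^ TARGET_HEAD_KEYS}")
    return data
```

In practice this can be combined with constraining the model at the API level (e.g. OpenAI's JSON response modes), but a hard parser on the receiving end keeps the data pipe consistent regardless of what the model emits.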
Accomplishments that we're proud of
It is an end-to-end solution: we start with a natural language command, and the final output is the radio communication commands sent to the simulated (or real) drones.
What we learned
Newer LLM models are harder to control since they tend to be more verbose. Since each head accomplishes one specific task, the smaller, older models served me well.
Built With
- gazebo
- mavlink
- openai
- pymavlink
- python
- qgroundcontrol