A training assistant that presents astronauts in training with scenarios and problems to solve, based on real-life situations that may occur on a space exploration mission
Inspiration - We just wanted to have fun, code together, and compete.
What it does - It is an LLM assistant that supports astronaut training by asking questions based on real-world emergency scenarios that can happen during space exploration. It tests astronauts' knowledge and readiness so they will be prepared if they ever find themselves in such a situation.
How we built it - We built the front end with HTML, JavaScript, and CSS to display responses from the back end. The back-end API that handles this communication is written in Node.js. The API talks to a Llama 2 model contained in a microservice that uses Ollama to call the model, and uses LlamaIndex to build and manage indexes of text documents, so the assistant can query those indexed documents and improve its content awareness.
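The flow above can be sketched as a small helper on the Node.js side that assembles a request for the Ollama-hosted Llama 2 model. The endpoint path and field names follow Ollama's `/api/generate` REST API; the prompt wording and function name are illustrative, not taken from the actual project code.

```javascript
// Build the request the back end would send to the Ollama microservice.
// Pure function: it only constructs the payload, so it can be tested
// without a running Ollama instance.
function buildScenarioRequest(scenario) {
  const prompt =
    "You are a spaceflight training assistant. Present the astronaut " +
    "with the emergency scenario below and ask how they would respond.\n" +
    `Scenario: ${scenario}`;
  return {
    url: "http://localhost:11434/api/generate", // Ollama's default local port
    body: { model: "llama2", prompt, stream: false },
  };
}

const req = buildScenarioRequest("Cabin pressure is dropping in the habitat module.");
// The back end would then POST req.body to req.url (e.g. with fetch)
// and forward the model's response to the front end.
```

Sending the payload is then a single `fetch(req.url, { method: "POST", body: JSON.stringify(req.body) })` call, with the JSON response relayed back to the browser.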
Challenges we ran into - A surprising lack of documentation for many of the technologies, and the task of improving an LLM's content awareness.
Accomplishments that we're proud of - We are proud of our front end.
What we learned - That increasing an existing model's content awareness can be a reasonable alternative to training a new model.
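The core idea behind this approach (which LlamaIndex automates) is retrieval: pick the most relevant stored document for a query and prepend it to the prompt, so the model answers with mission-specific context instead of being retrained. A minimal sketch, with hypothetical documents and a deliberately naive word-overlap score:

```javascript
// Hypothetical indexed documents; in the real project LlamaIndex
// manages the indexing and retrieval.
const docs = [
  "Fire response: don oxygen masks, cut power to the affected module.",
  "Depressurization: locate the leak, seal hatches, monitor cabin pressure.",
];

// Naive retrieval: count how many query words appear in each document
// and return the best-scoring one.
function retrieve(query) {
  const words = query.toLowerCase().split(/\W+/).filter(Boolean);
  let best = docs[0];
  let bestScore = -1;
  for (const doc of docs) {
    const lower = doc.toLowerCase();
    const score = words.filter((w) => lower.includes(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = doc;
    }
  }
  return best;
}

// Prepend the retrieved document as context before the user's question.
function augmentPrompt(query) {
  return `Context: ${retrieve(query)}\nQuestion: ${query}`;
}
```

A real index would use embeddings rather than word overlap, but the shape is the same: retrieve, then augment the prompt before calling the model.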
What's next for SpaceDuck - Allow the model to present the user with a question and then provide the user with an answer.