In the age of bots and VR, we wanted to develop a natural interface for Jira - one that gels well with humans.
What it does
Jira bot understands speech and text, and performs tasks like "get me jira ABC-123" or "show comments for jira ABC-123". Jira bot also understands context - for example, if you ask it to "show a jira" without specifying one, it asks you for the missing attributes (the Jira number in this case). It delivers real-time personalised notifications too.
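The slot-filling behavior above can be sketched roughly as follows. This is an illustrative Python sketch, not the project's actual code; the class and method names are assumptions.

```python
import re

# Matches Jira issue keys like "ABC-123".
JIRA_KEY = re.compile(r"\b[A-Za-z]+-\d+\b")

class CommandContext:
    """Tracks a command that is waiting on a missing attribute (slot)."""

    def __init__(self):
        self.pending_intent = None  # intent waiting for a Jira key

    def handle(self, text):
        key = JIRA_KEY.search(text)
        if self.pending_intent and key:
            # The user supplied the missing slot; run the pending command.
            intent, self.pending_intent = self.pending_intent, None
            return f"Running '{intent}' for {key.group().upper()}"
        if "show a jira" in text.lower() and not key:
            # Command recognised but the Jira number is missing: ask for it.
            self.pending_intent = "show_jira"
            return "Which Jira number?"
        if key:
            return f"Running 'show_jira' for {key.group().upper()}"
        return "Sorry, I didn't get that."

ctx = CommandContext()
print(ctx.handle("show a jira"))  # bot asks for the missing Jira number
print(ctx.handle("abc-123"))     # slot is filled, command runs
```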
How we built it
The UI is a Chrome app. The UI and backend talk over a WebSocket. The backend uses a third-party NLP library to understand commands; it keeps a command context so it knows the state of a partially specified command and reacts to that state. Command fulfilment happens through the Jira APIs. There is a hook for adding new command definitions. For real-time notifications we use Redis pub/sub.
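The "hook to add new command definitions" could look something like the registry below - a minimal sketch, assuming a decorator-based registry; the function names and registry shape are our illustration, not the project's actual code. The endpoint paths are the real Jira REST API v2 routes.

```python
# Registry mapping NLP intent names to fulfilment handlers.
COMMANDS = {}

def command(intent):
    """Decorator that registers a handler for a given intent (the hook)."""
    def register(fn):
        COMMANDS[intent] = fn
        return fn
    return register

@command("get_jira")
def get_jira(slots):
    # Jira REST API: GET /rest/api/2/issue/{issueKey}
    return f"GET /rest/api/2/issue/{slots['key']}"

@command("show_comments")
def show_comments(slots):
    # Jira REST API: GET /rest/api/2/issue/{issueKey}/comment
    return f"GET /rest/api/2/issue/{slots['key']}/comment"

def fulfil(intent, slots):
    """Dispatch a parsed command to its registered handler."""
    return COMMANDS[intent](slots)

print(fulfil("get_jira", {"key": "ABC-123"}))
```

Adding a new workflow is then just a matter of writing one more decorated function.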
Challenges we ran into
Tuning natural language into actionable commands (Jira API calls in this case) was challenging; we had to feed the NLP library adequate training data.
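The training data is essentially utterances labelled with intents, in the spirit of the sketch below. The exact format depends on the NLP library used, so treat this as an assumption; the word-overlap scorer just stands in for the real NLP model.

```python
# Illustrative training pairs: (utterance, intent).
TRAINING_DATA = [
    ("get me jira ABC-123", "get_jira"),
    ("open ABC-123", "get_jira"),
    ("show comments for jira ABC-123", "show_comments"),
    ("what are people saying on ABC-123", "show_comments"),
]

def classify(text, data=TRAINING_DATA):
    """Naive word-overlap classifier, standing in for the real NLP library."""
    words = set(text.lower().split())
    best = max(data, key=lambda pair: len(words & set(pair[0].lower().split())))
    return best[1]

print(classify("show comments for ABC-123"))  # show_comments
```

More varied utterances per intent is what "adequate training data" bought us in practice.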
Accomplishments that we're proud of
We wanted to showcase that a voice/text interface is useful for tools like Jira, and we delivered a subset of that vision. We successfully developed a way to convert text and speech into Jira API calls, and our notifications are real-time.
What's next for JIRA Assistant
This bot is a proof of concept; in the future we could add more workflows to it through collaboration :)