Inspiration

Deep neural networks are notoriously difficult to understand, which makes it hard to iterate on a model and to see what is working well and what is working poorly.

What it does

ENNUI allows for fast iteration on neural network architectures. It offers a unique blend of functionality for expert developers and ease of use for those who are still learning. ENNUI visualizes neural network architectures and lets users modify them, so they can construct any (non-recurrent) architecture they please.

ENNUI takes care of input shapes and sizes by automatically flattening and concatenating inputs where necessary. It also provides full access to the underlying Python / Keras implementation. Neural network training is tracked in real time and can be performed both locally and in the cloud.
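The shape bookkeeping behind automatic flattening and concatenation can be sketched in pure Python (an illustration only, not ENNUI's actual code; the helper names are hypothetical):

```python
from functools import reduce

def flattened_size(shape):
    """Number of elements after flattening a tensor of the given
    (batch-less) shape, e.g. (28, 28, 3) -> 2352."""
    return reduce(lambda a, b: a * b, shape, 1)

def concat_shape(shapes):
    """Shape produced by concatenating the inputs along the last axis.
    If the leading dimensions disagree, every input must first be
    flattened to 1-D so that concatenation is always well-defined."""
    if len({s[:-1] for s in shapes}) > 1:
        # Leading dims disagree: flatten each input, then concatenate.
        return (sum(flattened_size(s) for s in shapes),)
    # Leading dims agree: concatenate along the final axis.
    return shapes[0][:-1] + (sum(s[-1] for s in shapes),)
```

For example, two feature maps of shapes (28, 28, 3) and (28, 28, 1) concatenate directly to (28, 28, 4), while a (28, 28, 3) map and a (100,) vector are first flattened and joined into a (2452,) vector.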

How we built it

We wrote a JavaScript frontend with an elegant drag-and-drop interface, and a Python backend that uses the Keras framework to build and train the user's neural network. To get from frontend to backend, we serialize the model to JSON in our own format, which the backend then parses.
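A serialization along these lines might look as follows. The JSON schema here is hypothetical (ENNUI's actual format may differ); the point is that the model travels as a list of layers plus a list of edges, which the backend parses into adjacency lists:

```python
import json

# A hypothetical serialization of a small model, in the spirit of the
# custom JSON format described above (the real ENNUI schema may differ).
graph_json = """
{
  "layers": [
    {"id": 0, "type": "Input",   "params": {"shape": [28, 28, 1]}},
    {"id": 1, "type": "Conv2D",  "params": {"filters": 32, "kernel_size": 3}},
    {"id": 2, "type": "Flatten", "params": {}},
    {"id": 3, "type": "Dense",   "params": {"units": 10}}
  ],
  "edges": [[0, 1], [1, 2], [2, 3]]
}
"""

def parse_graph(text):
    """Parse the JSON into a layer table and adjacency lists keyed by id."""
    doc = json.loads(text)
    layers = {layer["id"]: layer for layer in doc["layers"]}
    children = {lid: [] for lid in layers}
    for src, dst in doc["edges"]:
        children[src].append(dst)
    return layers, children

layers, children = parse_graph(graph_json)
```

Keeping edges separate from layers is what lets the format describe arbitrary branching and merging rather than only sequential stacks.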

Challenges we ran into

Sequential models are simple; models that allow arbitrary branching and merging are less so. Our task was to convert a graph-like structure into functional Python code. This required checks to ensure that the shapes and sizes of various tensors matched, which proved especially challenging when concatenating tensors.
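The core of generating code from such a graph is emitting each layer only after all of its inputs. One standard way to do that is a topological sort (a sketch using Kahn's algorithm, not ENNUI's actual implementation):

```python
from collections import deque

def topological_order(children):
    """Kahn's algorithm: return layer ids ordered so that every layer
    appears after all of its inputs -- the order code generation must
    follow when a model branches and merges."""
    indegree = {node: 0 for node in children}
    for outs in children.values():
        for dst in outs:
            indegree[dst] += 1
    queue = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for dst in children[node]:
            indegree[dst] -= 1
            if indegree[dst] == 0:
                queue.append(dst)
    if len(order) != len(children):
        raise ValueError("cycle detected: not a valid feed-forward model")
    return order
```

For a diamond-shaped graph (one input that branches into two paths which merge again), any valid order starts at the input and ends at the merge layer; the cycle check rejects anything that is not feed-forward.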

A neural network has many parameters. Designing an interface that gives the user intuitive access to all of them was challenging.

Integrating with the cloud was challenging, particularly using Docker and Kubernetes for deployment.

Accomplishments that we're proud of

We built a novel, state-of-the-art development and learning tool for deep neural networks, and we're proud of the amount of quality code we produced in 24 hours.

What's next for Elegant Neural Network User Interface (ENNUI)

We want to add a variety of visualizations, both during and after training. We want the color of each layer to change based on how influential its weights are in the network; this will help developers understand the effects of gradient updates and identify sections of the network that should be expanded or pruned. Fundamentally, we want to change the development style from waiting until a model is fully trained before adjusting parameters or architecture, to watching training and modifying the network architecture as needed. We also want to add support for frameworks other than Keras, such as TensorFlow, PyTorch, and CNTK. Furthermore, we'd love to see our tool used in an educational environment to help students better understand neural networks.
