variableneuralnet

A Neural Network Package based largely on binaryneuralnet (@https://github.com/reecemartin/binaryneuralnet), but with custom numbers of inputs and outputs as well as hidden layers. This means users will be able to create feed-forward neural networks with varying structures, allowing for significantly more possible applications. These networks are a fundamental machine learning construct which can be used for tasks such as pattern recognition.
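To make "varying structures" concrete, here is a small illustrative sketch (not this package's actual API; the function name is made up for illustration) showing how freely chosen layer sizes determine the number of connection weights in a fully connected feed-forward network:

```python
def count_weights(layer_sizes):
    """Number of connection weights in a fully connected feed-forward net.

    layer_sizes lists the neuron count of each layer, input to output.
    Each adjacent pair of layers contributes (size_a * size_b) weights.
    """
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# A network with 4 inputs, hidden layers of 5 and 3 neurons, and 2 outputs:
print(count_weights([4, 5, 3, 2]))  # 4*5 + 5*3 + 3*2 = 41
```

Because the layer sizes are an ordinary list, the same code covers a single-hidden-layer perceptron or a much deeper network without structural changes.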

What it does:

Class Breakdown:

VariableNeuralNetwork Class: Organizes and coordinates the initialization, training, and testing of the network.

Neuron Class: Used to represent the network's Neurons (i.e. vertices in the graph).

InputNeuron Class: Represents Input Neurons.

OutputNeuron Class: Represents Output Neurons.

Weight Class: Used to represent the network's Weights (i.e. edges in the graph).

UserInterface Class: Used to manage User I/O in conjunction with the VariableNeuralNetwork Class.

Training Class: Used for training of the network.

Testing Class: Used for testing, i.e. calculating the output of the network.
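The graph analogy in the class list above can be sketched as follows. This is a minimal illustrative model only; the field names and constructor signatures are assumptions, not the package's actual definitions:

```python
class Neuron:
    """A vertex in the network graph; accumulates weighted input."""
    def __init__(self):
        self.incoming = []   # Weight objects (edges) feeding this neuron
        self.value = 0.0

class InputNeuron(Neuron):
    """An input vertex whose value is set directly from the data."""
    pass

class OutputNeuron(Neuron):
    """An output vertex whose value is read as the network's result."""
    pass

class Weight:
    """A directed edge from one neuron to another, with a strength."""
    def __init__(self, source, target, value):
        self.source, self.target, self.value = source, target, value
        target.incoming.append(self)

# Wire a single input neuron to a single output neuron:
x = InputNeuron()
y = OutputNeuron()
w = Weight(x, y, 0.5)
x.value = 2.0
y.value = sum(e.source.value * e.value for e in y.incoming)
print(y.value)  # 2.0 * 0.5 = 1.0
```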

How it works:
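As a rough guide, the feed-forward calculation such a network performs applies, at each non-input layer, a weighted sum followed by an activation such as the logistic function referenced in the acknowledgements below. A minimal sketch of that standard technique (the weight values here are arbitrary illustrative numbers, not anything this package produces):

```python
import math

def logistic(x):
    """Logistic (sigmoid) activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, layers):
    """Propagate inputs through successive weight matrices.

    layers is a list of weight matrices; layers[k][j][i] is the weight
    from neuron i of one layer to neuron j of the next.
    """
    activations = list(inputs)
    for weights in layers:
        activations = [
            logistic(sum(w * a for w, a in zip(row, activations)))
            for row in weights
        ]
    return activations

# A 2-input, 2-hidden-neuron, 1-output network with illustrative weights:
hidden = [[0.5, -0.5], [1.0, 1.0]]   # 2 hidden neurons, 2 inputs each
output = [[1.0, -1.0]]               # 1 output neuron, 2 hidden inputs
out = feed_forward([1.0, 0.0], [hidden, output])
print(out)
```

Training then adjusts the weights by gradient descent via backpropagation, using the error between these outputs and the desired outputs.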

Development Plan:

Acknowledgements:

As with the binaryneuralnet (@https://github.com/reecemartin/binaryneuralnet) project, this project is largely inspired by and based on information and techniques from Jeff Heaton's videos and lectures on Neural Networks posted on YouTube: https://www.youtube.com/user/HeatonResearch

We would also like to acknowledge Ray Kurzweil's excellent book "How to Create a Mind", which provided inspiration for this project as well as a broad overview of Machine Learning. (Amazon Link: https://www.amazon.ca/How-Create-Mind-Thought-Revealed/dp/0143124048)

For more technical details, to fill in gaps, and to expand on information from the previously mentioned lectures and videos, we used Wikipedia's excellent in-depth pages on Backpropagation (https://en.wikipedia.org/wiki/Backpropagation), Perceptrons (https://en.wikipedia.org/wiki/Perceptron), Multi-Layer Perceptrons (https://en.wikipedia.org/wiki/Multilayer_perceptron), Activation Functions (https://en.wikipedia.org/wiki/Activation_function), the Logistic Curve (https://en.wikipedia.org/wiki/Logistic_function), and Gradient Descent (https://en.wikipedia.org/wiki/Gradient_descent).

Built With

Updates