A neural network uses nodes and edges to simulate how a biological brain "learns". Using gradient descent and linear algebra, and given enough training data, we can build highly accurate models of real-world tasks. The key ingredient is backpropagation, a mechanism that distributes the training error backwards across the nodes of the network so that every weight can be adjusted. Below you can train a 3-layer neural network to recognize digits. Select the size of your training set, the number of training cycles, and the number of nodes in the hidden layer. You can even input your own handwritten digit by drawing on the canvas and clicking "Predict Digit". Try it out!
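To make the mechanics concrete, here is a minimal sketch (in Python with NumPy) of the kind of 3-layer network described above: a forward pass through one hidden layer, and backpropagation distributing the output error back through the weights so both layers can be updated by gradient descent. The layer sizes, learning rate, and class/function names are illustrative assumptions, not the demo's actual implementation.

```python
# Minimal 3-layer network (input -> hidden -> output) trained with gradient
# descent and backpropagation. Sizes and hyperparameters are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ThreeLayerNet:
    def __init__(self, n_input=784, n_hidden=100, n_output=10, lr=0.1):
        rng = np.random.default_rng(0)
        # Small random weights; one weight matrix per layer of edges.
        self.w1 = rng.normal(0.0, n_input ** -0.5, (n_hidden, n_input))
        self.w2 = rng.normal(0.0, n_hidden ** -0.5, (n_output, n_hidden))
        self.lr = lr

    def predict(self, x):
        # Forward pass: each layer is a matrix product followed by a squashing function.
        hidden = sigmoid(self.w1 @ x)
        return sigmoid(self.w2 @ hidden)

    def train(self, x, target):
        # Forward pass, keeping the hidden activations for backpropagation.
        hidden = sigmoid(self.w1 @ x)
        output = sigmoid(self.w2 @ hidden)
        # Backpropagation: the output error is distributed back to the hidden
        # nodes in proportion to the weights connecting them to the output.
        output_error = target - output
        hidden_error = self.w2.T @ output_error
        # Gradient-descent updates (sigmoid derivative is y * (1 - y)).
        self.w2 += self.lr * np.outer(output_error * output * (1 - output), hidden)
        self.w1 += self.lr * np.outer(hidden_error * hidden * (1 - hidden), x)

# Illustrative usage: one training step on a fake "digit" (a 28x28 image
# flattened to 784 pixel values) labelled as the digit 3.
net = ThreeLayerNet()
pixels = np.random.default_rng(1).random(784)
label = np.zeros(10)
label[3] = 1.0
net.train(pixels, label)
print(net.predict(pixels).argmax())
```

In the interactive demo, running more training cycles simply repeats the `train` step over the training set, and "Predict Digit" corresponds to a single `predict` call on your processed drawing.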
[Your processed drawing is shown here.]