Ask HN: What's the 'Hello World' program of neural networks?
MNIST, the handwritten digit dataset. Download TensorFlow and work through the examples; you will find it there.
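For reference, the TensorFlow beginner tutorial boils down to something like this (a minimal sketch assuming the tf.keras API; the layer sizes and epoch count here are arbitrary choices, not anything canonical):

    import tensorflow as tf

    # Load MNIST: 60k training and 10k test images of handwritten digits, 28x28 grayscale.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixel values to [0, 1]

    # A small fully connected classifier: flatten each image, one hidden layer, 10 output logits.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10),
    ])

    model.compile(
        optimizer='adam',
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=['accuracy'],
    )

    model.fit(x_train, y_train, epochs=5)
    model.evaluate(x_test, y_test)

It's enough to see the whole train/evaluate loop end to end, which is really all a "Hello World" needs to do.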
TensorFlow has a whole bunch of tutorials, but those are the "Hello World"s of TensorFlow, not of neural networks.
To get started with neural networks, begin by drawing simple neural nets for basic operations like addition, multiplication, and XOR. Just represent boolean truth tables as neural networks (see the hand-wired sketch below).
Once you can do that, move on to implementing the algorithm yourself. A simple 3-layer network is enough to understand how the concept works; 4/2/2 nodes is plenty. Just understand how the calculations work.
Then move on to a framework, but only after you've understood the math. Andrew Ng's machine learning course on Coursera explains the algorithms.
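To make the "boolean tables as neural networks" step concrete, here's a minimal hand-wired sketch in plain numpy with step activations (the weights are just one choice that happens to reproduce the truth tables, not anything learned by training):

    import numpy as np

    def step(x):
        # Threshold activation: fire (1) if the weighted sum is positive, else 0.
        return (x > 0).astype(int)

    def neuron(x, w, b):
        # A single artificial neuron: weighted sum of the inputs plus a bias, then the step.
        return step(np.dot(x, w) + b)

    # Hand-picked weights that reproduce the boolean truth tables.
    AND  = lambda x: neuron(x, np.array([1, 1]), -1.5)
    OR   = lambda x: neuron(x, np.array([1, 1]), -0.5)
    NAND = lambda x: neuron(x, np.array([-1, -1]), 1.5)

    # XOR is not linearly separable, so it needs two layers: XOR(a, b) = AND(OR(a, b), NAND(a, b)).
    def XOR(x):
        hidden = np.array([OR(x), NAND(x)])
        return AND(hidden)

    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        x = np.array([a, b])
        print(a, b, '->', AND(x), OR(x), XOR(x))

Once the truth tables come out right, the natural next question is how to find such weights automatically, which is where backprop comes in.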
XOR for fully connected networks, and MNIST for convolutional networks.
Make sure to implement backprop yourself: don't use TensorFlow, just use numpy and write out the matrix multiplications yourself.
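A minimal sketch of that, assuming a 2-4-1 fully connected network with sigmoid activations, squared error, and plain full-batch gradient descent (it usually lands very close to [0, 1, 1, 0] within a few thousand iterations, though an unlucky initialization can stall):

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR truth table: inputs and targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer with 4 units, one output unit.
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
    lr = 1.0

    for _ in range(10000):
        # Forward pass: two matrix multiplications plus activations.
        h = sigmoid(X @ W1 + b1)        # hidden activations, shape (4, 4)
        out = sigmoid(h @ W2 + b2)      # predictions, shape (4, 1)

        # Backward pass: the chain rule written out by hand.
        err = out - y                             # derivative of 0.5 * (out - y)^2 w.r.t. out
        d_out = err * out * (1 - out)             # gradient at the output pre-activation
        d_hidden = (d_out @ W2.T) * h * (1 - h)   # gradient at the hidden pre-activation

        # Gradient descent updates.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_hidden
        b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

    print(out.round(3))  # should be close to [[0], [1], [1], [0]]

Writing the backward pass by hand once makes it much clearer what a framework's autodiff is doing for you.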
Not bad advice, but to get a feel before you dive into the deep end, use the libraries first and then, by all means, follow this advice. You should always know what underlies the functionality a library provides, but don't hesitate to start simple. I remember reading "C++ Neural Networks and Fuzzy Logic" at 16 and being intimidated because I had to work through and understand the code without any background in statistics, calculus, or linear algebra. Don't let it intimidate you.
If backprop is "the deep end," then I really don't know what you're doing...
Probably the XOR dataset.
Not trying to be a smartass, but if "Hello, World" is the most basic program, why wouldn't a single-layer perceptron model be what OP is looking for?
Because that would be the equivalent of telling someone to write "Hello World" on a piece of paper in order to show them how programming works.
A single layer isn't a "neural network". The difference between logistic regression and a really simple neural network is the ability to behave in a non-linear way.
A single layer amounts to a matrix multiplication. Matrices are linear operators.
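That's easy to check numerically: stacking linear layers without a nonlinearity in between collapses to a single matrix multiplication, so you gain nothing in expressive power (a quick numpy sanity check, not tied to any framework):

    import numpy as np

    rng = np.random.default_rng(1)
    x  = rng.normal(size=(5, 3))   # a batch of 5 inputs with 3 features
    W1 = rng.normal(size=(3, 4))   # "layer 1" weights
    W2 = rng.normal(size=(4, 2))   # "layer 2" weights

    two_layers = (x @ W1) @ W2     # two linear layers, no activation in between
    one_layer  = x @ (W1 @ W2)     # one layer with the composed weight matrix

    print(np.allclose(two_layers, one_layer))  # True: the stack is still a single linear map

Which is exactly why the nonlinearity between layers (sigmoid, ReLU, etc.) is what makes a multi-layer network more than logistic regression.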
1. Using a simple NN to do the XOR bit operation
2. MNIST handwritten digit recognition
XOR
I once implemented a simple backpropagation algorithm in Haskell (without any libraries) that could identify the pattern (one among 'A', 'B', 'C', or 'D') represented on an 8x8 matrix...
Here is the code:
https://bitbucket.org/sras/haskell-stuff/src/b58f3fc017ce303...