Multibit neural XOR network
I'm trying to train a neural network to output the XOR (i.e. the parity) of its 8 input bits. I'm using the ffnet library (http://ffnet.sourceforge.net/). For a low number of input bits (up to 4), backpropagation produces the expected results. For 8 bits, the NN seems to 'converge', in the sense that it outputs the same value for every input. I'm using a multilayer NN: an input layer, one hidden layer, and an output, plus a bias node.
Am I doing something wrong? Does the NN need to have a particular shape to be able to learn XOR?
Edit:
This is the code I'm using:
from ffnet import ffnet, mlgraph

def experiment(bits, input, solution, iters):
    # Fully connected layered network: `bits` inputs, `bits` hidden nodes, 1 output
    conec = mlgraph((bits, bits, 1))
    net = ffnet(conec)
    net.randomweights()
    net.train_momentum(input, solution, eta=0.5, momentum=0.0, maxiter=iters)
    net.test(input, solution, iprint=2)
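For reference, the construction of input and solution isn't shown above; a minimal sketch of how they can be built for the N-bit XOR (parity) task follows (the helper make_parity_data is just illustrative, not part of ffnet):

import itertools

def make_parity_data(bits):
    # Enumerate all 2**bits binary input vectors
    input = [list(v) for v in itertools.product([0, 1], repeat=bits)]
    # Target is the XOR (parity) of the bits in each vector
    solution = [[sum(v) % 2] for v in input]
    return input, solution

# Example usage (the iteration count here is arbitrary):
# input, solution = make_parity_data(8)
# experiment(8, input, solution, 5000)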
I'm using momentum=0.0 to get pure back-propagation.
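For reference, the momentum update rule for each weight is

    Δw(t) = -η ∂E/∂w + α Δw(t-1)

so with momentum α = 0 the second term vanishes and every step is a plain gradient-descent update.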
This is part of the results I get:
Testing results for 256 testing cases:
OUTPUT 1 (node nr 17):
Targets vs. outputs:
1 1.000000 0.041238
2 1.000000 0.041125
3 1.000000 0.041124
4 1.000000 0.041129
5 1.000000 0.041076
6 1.000000 0.041198
7 0.000000 0.041121
8 1.000000 0.041198
It goes on like this for all 256 test vectors: the output stays near 0.041 no matter what the target is.