Channel: Why doesn't my Feed-Forward NN work with varying inputs? - Stack Overflow

Why doesn't my Feed-Forward NN work with varying inputs?


I decided to build a feedforward neural network without using any libraries. I am fairly new to the subject and entirely self-taught.

My network uses backpropagation to update the weights, and the activation function between all layers (input → hidden1 → output) is the sigmoid. Say I try to solve a basic problem like the XOR logic gate with my NN. Whenever I train on the complete training set (all four combinations of 0s and 1s), my NN cannot set the weights in a way that produces the desired outputs; it seems to always get stuck in the middle (the output is ~0.5 for every input). On the other hand, when I iterate over only one type of input (say, 0 and 1), it learns quickly.
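For comparison, here is a minimal sketch (not the asker's code, which we can't see) of a sigmoid network trained with backpropagation on the full XOR set, using NumPy. The hidden-layer size (4), learning rate, and iteration count are all assumptions chosen for reliable convergence; a common cause of the "stuck at 0.5" symptom is initializing all weights to the same value, which keeps the hidden units identical, so this sketch breaks symmetry with random initial weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Full XOR training set: all four input combinations and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random (not constant!) initial weights break the symmetry between
# hidden units; with identical initial weights every unit computes the
# same thing and the output tends to hover near 0.5.
W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden
b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output
b2 = np.zeros((1, 1))

lr = 1.0
for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass for squared error; sigmoid'(z) = s * (1 - s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # batch gradient-descent update over the whole training set
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out.ravel(), 2))
```

After training, the four outputs should land close to the targets 0, 1, 1, 0. Training on all four patterns per update matters: updating on only one pattern type (as described above) lets the network collapse to a constant function that fits just that pattern.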

Is the problem in my cost function, the number of nodes, the hidden layers, or something else? I would appreciate some guidance!
