Why doesn't my Feed-Forward NN work with varying inputs? - Stack Overflow

Answer by Vigneswaran C for Why doesn't my Feed-Forward NN work with varying inputs?

The XOR problem is not linearly separable, which is what makes a single-layer perceptron unfit for it. In your network, however, the added hidden layer lets the network capture non-linear features, so the architecture itself is fine...
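For illustration, here is a minimal from-scratch sketch of that point (assuming plain NumPy, sigmoid activations and a squared-error loss, none of which the answer specifies): a single hidden layer trained by backpropagation is enough to learn XOR.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR inputs and targets
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # 2 inputs -> 4 hidden units -> 1 output, random initial weights
    W1 = rng.normal(size=(2, 4))
    b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1))
    b2 = np.zeros((1, 1))
    lr = 1.0

    for _ in range(5000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # backward pass (squared-error loss; sigmoid'(a) = a * (1 - a))
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # gradient-descent weight updates
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(out.round(3).ravel())  # should end up close to [0, 1, 1, 0]

The hidden-layer width and learning rate here are arbitrary picks for the sketch; the point is only that with at least one hidden layer the XOR decision boundary becomes representable and backpropagation can find it.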

Answer by fateme12 for Why doesn't my Feed-Forward NN work with varying inputs?

XOR can't be solved by a single layer, because you can't separate the two labels (0 and 1) with just one line. You can separate them with two lines and then use an AND gate (a further layer) to combine the two regions into the final answer...
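The two-lines-plus-AND picture can be wired by hand. The sketch below uses step activations and hand-picked weights (illustrative choices, not values from the answer): the two hidden units implement the two separating lines (an OR-like and a NAND-like cut), and the output unit ANDs them, which yields XOR.

    import numpy as np

    def step(z):
        # hard threshold: fires (1) when the weighted sum is positive
        return (z > 0).astype(float)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

    # Hidden layer: two separating lines in the input plane
    #   line 1:  x1 + x2 - 0.5 > 0  -> "at least one input is on" (OR-like)
    #   line 2: -x1 - x2 + 1.5 > 0  -> "not both inputs are on"   (NAND-like)
    W1 = np.array([[1.0, -1.0],
                   [1.0, -1.0]])
    b1 = np.array([-0.5, 1.5])

    # Output unit: AND of the two hidden units (both lines must be satisfied)
    w2 = np.array([1.0, 1.0])
    b2 = -1.5

    h = step(X @ W1 + b1)    # which side of each line every point falls on
    out = step(h @ w2 + b2)  # 1 only for points between the two lines
    print(out)               # [0. 1. 1. 0.] -> XOR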


Why doesn't my Feed-Forward NN work with varying inputs?

I decided to create a feed-forward neural network without using any libraries. I am fairly new to the subject and completely self-taught. My neural network uses backpropagation to set the weights and...
