Training Example | x1 | x2 | Class |
a. | 0 | 1 | -1 |
b. | 2 | 0 | -1 |
c. | 1 | 1 | +1 |
w0 = -1.5
w1 = 0
w2 = 2
In your answer, you should clearly indicate the new weight values at the end of each training step. Here, the first three steps have been done for you. You need to fill in the rest of the table.
Iteration | w0 | w1 | w2 | Training Example | x1 | x2 | Class | s = w0 + w1·x1 + w2·x2 | Action |
1 | -1.5 | 0 | 2 | a. | 0 | 1 | - | +0.5 | Subtract |
2 | -2.5 | 0 | 1 | b. | 2 | 0 | - | -2.5 | None |
3 | -2.5 | 0 | 1 | c. | 1 | 1 | + | -1.5 | Add |
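The three worked steps above can be checked with a short sketch of the perceptron update rule (assuming a learning rate of 1 and a prediction of + whenever s > 0; these assumptions match the table but are not stated explicitly in the text):

```python
# Perceptron training sketch: one pass over the three training examples.
# "Subtract"/"Add" means subtracting/adding the augmented input (1, x1, x2).
examples = [("a", 0, 1, -1), ("b", 2, 0, -1), ("c", 1, 1, +1)]
w0, w1, w2 = -1.5, 0.0, 2.0  # initial weights from above

for name, x1, x2, cls in examples:
    s = w0 + w1 * x1 + w2 * x2
    predicted = +1 if s > 0 else -1
    if predicted == cls:
        action = "None"
    elif cls == -1:                      # predicted +, actual -: subtract input
        w0, w1, w2 = w0 - 1, w1 - x1, w2 - x2
        action = "Subtract"
    else:                                # predicted -, actual +: add input
        w0, w1, w2 = w0 + 1, w1 + x1, w2 + x2
        action = "Add"
    print(name, s, action, (w0, w1, w2))
```

Running this reproduces the table: step 1 fires on example a (s = +0.5, Subtract), step 2 leaves b alone (s = -2.5, None), and step 3 adds c (s = -1.5, Add), leaving weights (-1.5, 1, 2) for the next iteration.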
Construct by hand a Neural Network (or Multi-Layer Perceptron) that computes the XOR function of two inputs. Make sure the connections, weights and biases of your network are clearly visible.
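One standard construction (a sketch of one valid answer, not the only one) uses two hidden threshold units computing OR and AND of the inputs, and an output unit computing "OR but not AND":

```python
# Hand-built XOR network sketch. Every unit is a perceptron with a
# hard threshold at 0; the constant term in each sum is the bias.
def step(s):
    return 1 if s > 0 else 0

def xor_net(x1, x2):
    h_or  = step(x1 + x2 - 0.5)      # hidden unit 1: fires if either input is 1
    h_and = step(x1 + x2 - 1.5)      # hidden unit 2: fires only if both are 1
    return step(h_or - h_and - 0.5)  # output: OR and not AND
```

The weights here (1, 1 into each hidden unit; +1, -1 into the output) and the biases (-0.5, -1.5, -0.5) are the quantities your drawn network should show explicitly.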
Challenge:
Can you construct a Neural Network to compute XOR which has only one hidden unit, but also includes shortcut connections from the two inputs directly to the (single) output unit?
Hint: start with a network that computes the inclusive OR, and then try to think of how it could be modified.
Assuming False=0 and True=1, explain how each of the following could be constructed:
Hint: in each case, first decide on the input-to-output or input-to-hidden weights, then determine the bias.