COMP9444 Neural Networks and Deep Learning
Term 2, 2020

Exercises 1: Perceptrons


  1. Perceptron Learning

    1. Construct by hand a Perceptron which correctly classifies the following data; use your knowledge of plane geometry to choose appropriate values for the weights w0, w1 and w2.

      Training Example   x1   x2   Class
      a.                  0    1    -1
      b.                  2    0    -1
      c.                  1    1    +1
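
      As a sanity check, a candidate weight vector can be tested mechanically. The sketch below (Python) uses one possible choice of weights satisfying the geometry, corresponding to the line x1 + 2*x2 = 2.5; it is an illustrative answer, not the only valid one.

      # Check a hand-chosen perceptron against the three training examples.
      # The weights below are one illustrative choice (decision boundary
      # x1 + 2*x2 = 2.5), not the only valid answer.
      w0, w1, w2 = -2.5, 1.0, 2.0

      data = [((0, 1), -1),   # a.
              ((2, 0), -1),   # b.
              ((1, 1), +1)]   # c.

      for (x1, x2), target in data:
          s = w0 + w1 * x1 + w2 * x2
          predicted = +1 if s >= 0 else -1
          print(f"x=({x1},{x2})  s={s:+.1f}  predicted={predicted:+d}  target={target:+d}")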

    2. Demonstrate the Perceptron Learning Algorithm on the above data, using a learning rate of 1.0 and initial weight values of

      w0 = -1.5
      w1 =   0
      w2 =   2

      In your answer, you should clearly indicate the new weight values at the end of each training step. Here, the first three steps have been done for you. You need to fill in the rest of the table.

      Iteration   w0    w1   w2   Training Example   x1   x2   Class   s = w0 + w1*x1 + w2*x2   Action
          1      -1.5    0    2          a.           0    1     -             +0.5             Subtract
          2      -2.5    0    1          b.           2    0     -             -2.5             None
          3      -2.5    0    1          c.           1    1     +             -1.5             Add
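
      The remaining rows can be generated (and the final weights checked) with a minimal sketch of the update loop, assuming the convention used in step 1 that s >= 0 is classified as positive. Each misclassified negative example has its inputs (and 1, for the bias) subtracted from the weights; each misclassified positive example has them added.

      # Perceptron Learning on the data above: eta = 1.0, w = (-1.5, 0, 2),
      # assuming s >= 0 is classified as positive (consistent with step 1).
      eta = 1.0
      w0, w1, w2 = -1.5, 0.0, 2.0
      data = [("a.", 0, 1, -1), ("b.", 2, 0, -1), ("c.", 1, 1, +1)]

      iteration, converged = 0, False
      while not converged:
          converged = True                       # stays True on a clean pass
          for name, x1, x2, target in data:
              iteration += 1
              s = w0 + w1 * x1 + w2 * x2
              if s >= 0 and target == -1:        # misclassified as positive
                  action, converged = "Subtract", False
              elif s < 0 and target == +1:       # misclassified as negative
                  action, converged = "Add", False
              else:
                  action = "None"
              print(f"{iteration:3d}  w0={w0:+.1f} w1={w1:+.1f} w2={w2:+.1f}"
                    f"  {name} s={s:+.1f}  {action}")
              if action == "Subtract":
                  w0, w1, w2 = w0 - eta, w1 - eta * x1, w2 - eta * x2
              elif action == "Add":
                  w0, w1, w2 = w0 + eta, w1 + eta * x1, w2 + eta * x2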
  2. XOR Network

    Construct by hand a Neural Network (or Multi-Layer Perceptron) that computes the XOR function of two inputs. Make sure the connections, weights and biases of your network are clearly visible.

    Challenge: Can you construct a Neural Network to compute XOR which has only one hidden unit, but also includes shortcut connections from the two inputs directly to the (one) output?
    Hint: start with a network that computes inclusive OR, and then consider how it could be modified.
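
    For reference, one standard two-hidden-unit construction can be checked mechanically. The sketch below uses step-threshold units (fire when s >= 0) with an OR unit, an AND unit, and an output that computes "OR but not AND", which is exactly XOR. It is one possible answer among many, and it deliberately leaves the one-hidden-unit challenge open.

    # XOR via two hidden units: h1 = OR(x1, x2), h2 = AND(x1, x2),
    # output = h1 AND NOT h2. One standard construction among many.
    def step(s):
        return 1 if s >= 0 else 0

    def xor_net(x1, x2):
        h1 = step(x1 + x2 - 0.5)      # OR:  weights +1, +1, bias -0.5
        h2 = step(x1 + x2 - 1.5)      # AND: weights +1, +1, bias -1.5
        return step(h1 - h2 - 0.5)    # output: fires iff h1=1 and h2=0

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", xor_net(x1, x2))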

  3. Computing any Logical Function with a 2-layer Network

    Assuming False=0 and True=1, explain how each of the following could be constructed:

    1. Perceptron to compute the OR function of m inputs
    2. Perceptron to compute the AND function of n inputs
    3. 2-layer Neural Network to compute the function (A ∨ B) ∧ (¬ B ∨ C ∨ ¬ D) ∧ (D ∨ ¬ E)
    4. 2-layer Neural Network to compute any (given) logical expression, assuming it is written in Conjunctive Normal Form.

    Hint: in each case, first decide on the input-to-output or input-to-hidden weights, then determine the bias.
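
    Following the hint, parts 1 and 2 admit a standard choice: an OR unit over m inputs uses weights +1 and bias -0.5, while an AND unit over n inputs uses weights +1 and bias -(n - 0.5). The sketch below applies the same idea to parts 3 and 4: one hidden unit per clause (weight +1 for a positive literal, -1 for a negated one, bias k - 0.5 where k counts the negated literals), with an AND unit over the clauses at the output. The encoding and helper names are illustrative, assuming step-threshold units that fire when s >= 0.

    from itertools import product

    def step(s):
        return 1 if s >= 0 else 0

    # (A v B) ^ (~B v C v ~D) ^ (D v ~E), with variables indexed A=0 .. E=4.
    # Each clause is a list of (variable index, sign) pairs.
    clauses = [[(0, +1), (1, +1)],
               [(1, -1), (2, +1), (3, -1)],
               [(3, +1), (4, -1)]]

    def cnf_net(x):
        hidden = []
        for clause in clauses:
            k = sum(1 for _, sign in clause if sign < 0)   # negated literals
            s = sum(sign * x[i] for i, sign in clause) + (k - 0.5)
            hidden.append(step(s))                         # clause = OR unit
        c = len(clauses)
        return step(sum(hidden) - (c - 0.5))               # output = AND unit

    # Verify the network against a direct evaluation of the formula.
    for x in product((0, 1), repeat=5):
        A, B, C, D, E = x
        direct = (A or B) and ((not B) or C or (not D)) and (D or (not E))
        assert cnf_net(x) == int(direct)
    print("network matches the formula on all 32 assignments")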