Examples and Intuitions II — Building XNOR with a Hidden Layer
In the previous section, we saw how to implement basic logical gates (AND, OR, NOR) using single neurons. However, some functions like XOR and XNOR cannot be represented by a single neuron. In this post, we will see how adding a hidden layer allows us to model the XNOR function.
Logical Gates as Neural Networks
From the previous section, we constructed single-neuron networks for:
- AND: Θ = [−30 20 20]
- NOR: Θ = [10 −20 −20]
- OR: Θ = [−10 20 20]
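These single-neuron gates are easy to verify directly. Below is a minimal sketch, assuming the standard weight choices from the previous section (e.g., bias −30 and input weights 20, 20 for AND); `round()` snaps the near-0/near-1 sigmoid outputs to exact bits:

```python
import math

def sigmoid(z):
    # Logistic activation: large positive z -> ~1, large negative z -> ~0
    return 1.0 / (1.0 + math.exp(-z))

def neuron(theta, x1, x2):
    # Single logistic unit: bias theta[0], input weights theta[1], theta[2]
    return round(sigmoid(theta[0] + theta[1] * x1 + theta[2] * x2))

# Standard gate weights (assumed from the previous section)
AND = [-30, 20, 20]
NOR = [10, -20, -20]
OR  = [-10, 20, 20]

print(neuron(AND, 1, 1))  # -> 1
print(neuron(NOR, 0, 0))  # -> 1
print(neuron(OR, 0, 1))   # -> 1
```

With ±20 weights, the pre-activation is at least 10 in magnitude for every binary input, so the sigmoid output is within 10⁻⁴ of 0 or 1.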
Constructing XNOR
The logical XNOR operator outputs 1 when:
- x₁ = 0 and x₂ = 0, or
- x₁ = 1 and x₂ = 1
In other words, when both inputs are the same.
A single neuron cannot represent XNOR, because XNOR is not linearly separable: no single line in the (x₁, x₂) plane puts {(0,0), (1,1)} on one side and {(0,1), (1,0)} on the other.
We need a hidden layer.
Network Architecture

[x₁, x₂] → [a₁⁽²⁾, a₂⁽²⁾] → a⁽³⁾ = h_Θ(x)

Where:
- Hidden unit 1 (a₁⁽²⁾) implements AND
- Hidden unit 2 (a₂⁽²⁾) implements NOR
- Output layer implements OR
Step 1 — First Layer (AND + NOR)
We combine the AND and NOR weights into one matrix:

Θ⁽¹⁾ = [ −30  20  20 ]
       [  10 −20 −20 ]

This gives:

a₁⁽²⁾ = g(−30 + 20x₁ + 20x₂)
a₂⁽²⁾ = g(10 − 20x₁ − 20x₂)

So:
- a₁⁽²⁾ behaves like AND
- a₂⁽²⁾ behaves like NOR
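The first layer can be sketched as a small Python function (a minimal sketch, assuming the AND/NOR weight rows above; `hidden_layer` is a hypothetical helper name):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# First-layer weight matrix: row 0 = AND weights, row 1 = NOR weights
Theta1 = [[-30, 20, 20],
          [ 10, -20, -20]]

def hidden_layer(x1, x2):
    # a1 acts as AND(x1, x2); a2 acts as NOR(x1, x2)
    a1 = sigmoid(Theta1[0][0] + Theta1[0][1] * x1 + Theta1[0][2] * x2)
    a2 = sigmoid(Theta1[1][0] + Theta1[1][1] * x1 + Theta1[1][2] * x2)
    return round(a1), round(a2)

print(hidden_layer(0, 0))  # -> (0, 1): not AND, but NOR
print(hidden_layer(1, 1))  # -> (1, 0): AND, but not NOR
```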
Step 2 — Second Layer (OR)
Now we combine the hidden outputs using OR:

Θ⁽²⁾ = [ −10  20  20 ]

This gives:

a⁽³⁾ = g(−10 + 20a₁⁽²⁾ + 20a₂⁽²⁾)

Final hypothesis:

h_Θ(x) = a⁽³⁾
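The second layer is just another single logistic unit, this time reading the hidden activations instead of the raw inputs (a sketch assuming the OR weights above; `output_layer` is a hypothetical helper name):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Second-layer weights implement OR over the hidden activations
Theta2 = [-10, 20, 20]

def output_layer(a1, a2):
    # Outputs 1 when either hidden unit fires
    return round(sigmoid(Theta2[0] + Theta2[1] * a1 + Theta2[2] * a2))

print(output_layer(0, 1))  # -> 1
print(output_layer(0, 0))  # -> 0
```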
Full Computation
The forward propagation, for all four binary inputs:

| x₁ | x₂ | a₁⁽²⁾ (AND) | a₂⁽²⁾ (NOR) | h_Θ(x) |
|----|----|-------------|-------------|--------|
| 0  | 0  | 0           | 1           | 1      |
| 0  | 1  | 0           | 0           | 0      |
| 1  | 0  | 0           | 0           | 0      |
| 1  | 1  | 1           | 0           | 1      |
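The full forward pass can be checked end to end in a few lines of Python (a minimal sketch, assuming the standard gate weights used throughout this post; `round()` snaps the near-0/near-1 sigmoid values to exact bits):

```python
import math

def g(z):
    # Logistic (sigmoid) activation
    return 1.0 / (1.0 + math.exp(-z))

def xnor(x1, x2):
    a1 = g(-30 + 20 * x1 + 20 * x2)           # hidden unit 1: AND
    a2 = g(10 - 20 * x1 - 20 * x2)            # hidden unit 2: NOR
    return round(g(-10 + 20 * a1 + 20 * a2))  # output: OR of hidden units

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xnor(x1, x2))
# -> 0 0 1
#    0 1 0
#    1 0 0
#    1 1 1
```

The printed truth table matches XNOR: the network outputs 1 exactly when both inputs agree.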
Why This Works
The hidden layer computes:
- AND(x₁, x₂)
- NOR(x₁, x₂)
The output layer computes:

OR(a₁⁽²⁾, a₂⁽²⁾)

Which is exactly:

XNOR(x₁, x₂) = OR(AND(x₁, x₂), NOR(x₁, x₂))
Key Insight
- Single logistic neuron → can model AND, OR, NOR
- Cannot model XOR or XNOR
- Adding one hidden layer enables nonlinear decision boundaries
This is the first concrete example of why hidden layers matter.
