Examples and Intuitions II — Building XNOR with a Hidden Layer
In the previous section, we saw how to implement basic logical gates (AND, OR, NOR) using single neurons. However, some functions like XOR and XNOR cannot be represented by a single neuron. In this post, we will see how adding a hidden layer allows us to model the XNOR function.
Complex Logical Gates with Neural Networks
Implementing XNOR
The logical XNOR operator outputs 1 when:
- $x_1 = 0$ and $x_2 = 0$, or
- $x_1 = 1$ and $x_2 = 1$

In other words, when both inputs are the same.
A single neuron cannot represent XNOR, because a single logistic unit produces a linear decision boundary and XNOR is not linearly separable. We need a hidden layer.
| $x_1$ | $x_2$ | XNOR |
|---|---|---|
| 0 | 0 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 1 | 1 |
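The truth table above can be checked in a couple of lines. This is just the logical definition ("output 1 when both inputs agree"), not the neural network yet:

```python
def xnor(x1, x2):
    # XNOR outputs 1 exactly when both inputs are the same.
    return int(x1 == x2)

# Enumerate the full truth table.
table = [(x1, x2, xnor(x1, x2)) for x1 in (0, 1) for x2 in (0, 1)]
print(table)
```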
Network Architecture
$$
\begin{bmatrix} x_0 \\ x_1 \\ x_2 \end{bmatrix} \rightarrow \begin{bmatrix} a_1^{(2)} \\ a_2^{(2)} \end{bmatrix} \rightarrow a_1^{(3)} \rightarrow h_\Theta(x)
$$

Where:
- Hidden unit $a_1^{(2)}$ implements AND
- Hidden unit $a_2^{(2)}$ implements NOR
- The output layer implements OR
First Layer (AND + NOR)
We combine the AND and NOR weight vectors into one matrix:

$$\Theta^{(1)} = \begin{bmatrix} -30 & 20 & 20 \\ 10 & -20 & -20 \end{bmatrix}$$

This gives:

$$a_1^{(2)} = g(-30 + 20x_1 + 20x_2)$$
$$a_2^{(2)} = g(10 - 20x_1 - 20x_2)$$

So the hidden layer computes:
- $a_1^{(2)}$ behaves like AND
- $a_2^{(2)}$ behaves like NOR
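As a quick sketch, the two hidden units can be evaluated directly, assuming the standard logistic sigmoid and the AND/NOR weights above. Because the weights are large, the sigmoid saturates close to 0 or 1, so rounding recovers the Boolean values:

```python
import math

def g(z):
    # Logistic sigmoid; for |z| >= 10 it saturates to roughly 0 or 1.
    return 1.0 / (1.0 + math.exp(-z))

def hidden(x1, x2):
    a1 = g(-30 + 20 * x1 + 20 * x2)  # first row of Theta(1): AND
    a2 = g(10 - 20 * x1 - 20 * x2)   # second row of Theta(1): NOR
    return a1, a2

# Rounding the saturated activations gives the gate truth tables.
for x in ((0, 0), (0, 1), (1, 0), (1, 1)):
    a1, a2 = hidden(*x)
    print(x, round(a1), round(a2))
```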
Second Layer (OR)
The output layer uses the OR weights:

$$\Theta^{(2)} = \begin{bmatrix} -10 & 20 & 20 \end{bmatrix}$$

Now we combine the hidden outputs using OR:

$$a_1^{(3)} = g(-10 + 20a_1^{(2)} + 20a_2^{(2)})$$

Final hypothesis:

$$h_\Theta(x) = a_1^{(3)} = \text{XNOR}(x_1, x_2)$$
Full Computation
The forward propagation is:

$$a^{(2)} = g\left(\Theta^{(1)} x\right), \qquad h_\Theta(x) = g\left(\Theta^{(2)} a^{(2)}\right)$$
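Putting both layers together, here is a minimal end-to-end sketch of the forward pass, written with scalar arithmetic rather than matrix form for readability (same weights as above):

```python
import math

def g(z):
    # Logistic sigmoid activation.
    return 1.0 / (1.0 + math.exp(-z))

def forward(x1, x2):
    # Layer 1: the two rows of Theta(1) are the AND and NOR units.
    a1 = g(-30 + 20 * x1 + 20 * x2)   # AND
    a2 = g(10 - 20 * x1 - 20 * x2)    # NOR
    # Layer 2: OR of the two hidden activations (Theta(2)).
    return g(-10 + 20 * a1 + 20 * a2)

# The rounded network output reproduces the XNOR truth table.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(forward(x1, x2)))
```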
Key Insight
- Single logistic neuron → can model AND, OR, NOR
- Cannot model XOR or XNOR
- Adding one hidden layer enables nonlinear decision boundaries
This is the first concrete example of why hidden layers matter.
