Shannon-Hartley Theorem
The Shannon-Hartley theorem gives the maximum rate at which you can push information through a noisy channel with arbitrarily low error. It’s a fundamental limit.
The formula:

C = B log₂(1 + S/N)

where:
- C = channel capacity (bits per second)
- B = bandwidth (Hz)
- S/N = signal-to-noise ratio (a linear power ratio, not dB)
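Spelled out, that’s C = B · log₂(1 + S/N). A minimal sketch in Python (the channel numbers below are illustrative, not from the text - roughly a voice-grade phone line):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical voice-grade line: ~3 kHz of bandwidth, 30 dB SNR.
snr_db = 30
snr_linear = 10 ** (snr_db / 10)          # 30 dB -> 1000x power ratio
capacity = shannon_capacity(3000, snr_linear)
print(f"{capacity:,.0f} bits/s")          # roughly 30 kbit/s
```

Note the dB-to-linear conversion: the formula wants a raw power ratio, so 30 dB becomes 1000 before it goes into the log.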
More bandwidth means more capacity. Better signal-to-noise ratio means more capacity. But notice the log - doubling your SNR doesn’t double your capacity. You get diminishing returns on power.
This is why you can’t just boost signal strength forever. And why bandwidth is so valuable - capacity scales linearly with bandwidth but only logarithmically with power.
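The linear-versus-logarithmic tradeoff is easy to see numerically. A quick comparison, starting from a made-up channel of 1 MHz and 15 dB SNR:

```python
import math

def capacity(b_hz: float, snr: float) -> float:
    """C = B * log2(1 + S/N), in bits per second."""
    return b_hz * math.log2(1 + snr)

b, snr = 1e6, 10 ** (15 / 10)   # 1 MHz bandwidth, 15 dB SNR (~31.6x)
base = capacity(b, snr)

print(f"baseline:         {base / 1e6:.2f} Mbit/s")
print(f"double bandwidth: {capacity(2 * b, snr) / 1e6:.2f} Mbit/s")  # exactly 2x
print(f"double SNR:       {capacity(b, 2 * snr) / 1e6:.2f} Mbit/s")  # well under 2x
```

Doubling the bandwidth doubles the capacity outright; doubling the signal power only nudges it up, because the extra power lands inside the log.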
The theorem sets a ceiling. No coding scheme can do better than C bits per second on a channel with bandwidth B and signal-to-noise ratio S/N. You can get arbitrarily close to C with clever coding (Shannon proved as much), but never exceed it.
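You can also read the ceiling backwards: rearranging C = B log₂(1 + S/N) gives the minimum SNR any scheme needs to hit a target rate, S/N = 2^(C/B) − 1. A sketch with illustrative numbers:

```python
import math

def min_snr_db(target_bps: float, bandwidth_hz: float) -> float:
    """Minimum SNR in dB needed to reach target_bps, from S/N = 2^(C/B) - 1."""
    snr_linear = 2 ** (target_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# e.g. pushing 100 Mbit/s through 20 MHz is 5 bits/s/Hz,
# which needs at least 2^5 - 1 = 31x, about 14.9 dB
print(f"{min_snr_db(100e6, 20e6):.1f} dB")
```

The exponential in 2^(C/B) is the ceiling seen from below: each extra bit per second per hertz roughly doubles the power you need.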
Real-world implications:
- Why 5G needs more spectrum, not just better encoding
- Why deep space communication is hard (received power falls with the square of distance, while the noise floor stays roughly constant)
- Why cable internet maxes out - the copper has finite bandwidth and you can’t eliminate noise
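The deep-space case is worth a quick sketch. With received power falling as the square of distance, so does SNR - and capacity collapses with it. All link numbers here are hypothetical:

```python
import math

def capacity(b_hz: float, snr: float) -> float:
    """C = B * log2(1 + S/N), in bits per second."""
    return b_hz * math.log2(1 + snr)

# Hypothetical deep-space link: 1 MHz bandwidth, 20 dB SNR at 1 AU.
# Received power (and thus SNR) falls with the square of distance.
b, snr_at_1au = 1e6, 10 ** (20 / 10)
for distance_au in (1, 5, 30):
    snr = snr_at_1au / distance_au ** 2
    print(f"{distance_au:>3} AU: {capacity(b, snr) / 1e3:,.0f} kbit/s")
```

At 30x the distance the SNR is 900x worse, and the link drops from megabits to a trickle - which is why deep-space probes lean on enormous antennas and very efficient codes.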
The theorem assumes additive white Gaussian noise and a stationary channel, which isn’t perfectly realistic. But it’s close enough to be genuinely useful as a design constraint. When you’re wondering why a channel can’t go faster, Shannon-Hartley is usually part of the answer.