Commit b33252ff authored by Erik Strand

Add answer for 4.3

parent a0bc6d37
@@ -179,10 +179,7 @@ $$
{:.question}
Consider a binary channel that has a small probability $$\epsilon$$ of making a bit error.
For reasons that will become clear, I will call the error probability $$\epsilon_0$$.
### (a)
@@ -190,16 +187,147 @@ $$
What is the probability of an error if a bit is sent independently three times and the value
determined by majority voting?
Majority voting still recovers the bit as long as at most one of the three copies is flipped. So
the probability of an error is the probability that two or three of the copies are flipped. This
can be expressed using the binomial distribution. Let's call this error probability $$\epsilon_1$$.
$$
\begin{align*}
\epsilon_1 &= B(2; \epsilon_0, 3) + B(3; \epsilon_0, 3) \\
&= {3 \choose 2} \epsilon_0^2 (1 - \epsilon_0) + {3 \choose 3} \epsilon_0^3 \\
&= 3 \epsilon_0^2 (1 - \epsilon_0) + \epsilon_0^3 \\
&= 3 \epsilon_0^2 - 2 \epsilon_0^3
\end{align*}
$$
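As a quick sanity check, a Monte Carlo simulation of the three-bit repetition code agrees with this
formula. Here is a minimal Python sketch (the error rate and number of trials are arbitrary choices
for illustration):

```python
import random

def majority_vote_error(eps, trials=1_000_000):
    """Empirical error rate when a bit is sent three times and decoded by majority vote."""
    errors = 0
    for _ in range(trials):
        # Each copy is flipped independently with probability eps.
        flips = sum(random.random() < eps for _ in range(3))
        # The vote is wrong when two or three copies are flipped.
        if flips >= 2:
            errors += 1
    return errors / trials

eps = 0.1
print(majority_vote_error(eps))   # ~0.028, up to Monte Carlo noise
print(3 * eps**2 - 2 * eps**3)    # analytic value, 0.028
```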
### (b)
{:.question}
How about if that is done three times, and majority voting is done on the majority voting?
The answer is the same as above, just using $$\epsilon_1$$ instead of $$\epsilon_0$$.
$$
\begin{align*}
\epsilon_2 &= B(2; \epsilon_1, 3) + B(3; \epsilon_1, 3) \\
&= 3 \epsilon_1^2 - 2 \epsilon_1^3
\end{align*}
$$
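For small $$\epsilon_0$$ the improvement compounds rapidly: substituting the leading-order
approximation $$\epsilon_1 \approx 3 \epsilon_0^2$$ gives
$$
\epsilon_2 \approx 3 \epsilon_1^2 \approx 3 (3 \epsilon_0^2)^2 = 27 \epsilon_0^4
$$
which already hints at the doubling of the exponent worked out in part (c).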
### (c)
{:.question}
If majority voting on majority voting on … on majority voting is done N times, how many bits are
needed, and what is the probability of an error? How does this probability depend on $$\epsilon$$?
Each round triples the total number of bits sent, so $$n$$ rounds of voting require $$3^n$$ bits.
The probability of an error can be expressed as a recurrence relation. Define
$$
\begin{align*}
f_1(x) &= 3 x^2 - 2 x^3 \\
f_{n+1}(x) &= f_1(f_n(x))
\end{align*}
$$
Then with a base error rate (i.e. per individual bit) of $$\epsilon_0$$, the probability of an error
after $$n$$ rounds of voting is $$\epsilon_n = f_n(\epsilon_0)$$. This agrees with parts (a) and (b):
$$\epsilon_1 = f_1(\epsilon_0)$$ and $$\epsilon_2 = f_1(\epsilon_1) = f_2(\epsilon_0)$$. If there is a
closed-form solution to this recurrence, I don't know how to find it. But it's still possible to say
a lot about its behavior.
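Numerically the recurrence is easy to iterate. Here is a minimal Python sketch (the helper names are
my own); it reproduces the values from parts (a) and (b) and the table further down.

```python
def f(x):
    """One round of 3-way majority voting applied to an error rate x."""
    return 3 * x**2 - 2 * x**3

def error_rate(eps0, rounds):
    """Error rate after `rounds` nested rounds of majority voting."""
    x = eps0
    for _ in range(rounds):
        x = f(x)
    return x

print(error_rate(0.25, 1))  # 0.15625
print(error_rate(0.25, 2))  # ~0.066
print(error_rate(0.01, 5))  # ~5.5e-50
```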
### Convergence to a step function
As $$n$$ approaches infinity, $$f_n$$ converges pointwise to
$$
f_\infty(x) =
\begin{cases}
0 & \text{for } 0 \leq x < 1/2 \\
1/2 & \text{for } x = 1/2 \\
1 & \text{for } 1/2 < x \leq 1
\end{cases}
$$
So as long as the error rate is below $$1/2$$, with enough rounds of majority voting the error rate
can be made arbitrarily small. (If it's above $$1/2$$, the voted result is almost always wrong, so
decoding the opposite of the vote works just as well; only at exactly $$1/2$$, where no information
gets through at all, does voting not help.)
Let's prove this. Since $$f_1$$ is a polynomial, it's continuous. By inspection there are three
solutions to $$x = 3 x^2 - 2 x^3$$ (it factors as $$x (2x - 1)(x - 1) = 0$$): zero, one half, and
one. Thus $$f_1$$ has exactly three
[fixed points](https://en.wikipedia.org/wiki/Fixed_point_(mathematics)). And because $$f_1$$ is
continuous, if $$f_n(x)$$ converges to some limit $$L$$, then $$L = \lim f_{n+1}(x) = f_1(L)$$, so
the limit must be zero, one half, or one.
Now fix any $$x$$ such that $$0 < x < 1/2$$. Because $$f_1$$ is a cubic polynomial, its graph can't
cross the line $$y = x$$ more than three times, and we've just shown that it crosses exactly at
zero, one half, and one. So $$f_1(x) - x$$ has a single sign on all of $$(0, 1/2)$$, and noting that
$$f_1(0.25) = 3 \cdot 0.25^2 - 2 \cdot 0.25^3 = 5/32 \approx 0.16 < 0.25$$ is sufficient to prove
that $$f_1(x) < x$$ on this interval. Furthermore, $$3 x^2$$ is greater than $$2 x^3$$ for
$$0 < x < 1$$, so $$0 < f_1(x)$$. Thus $$0 < f_1(x) < x < 1/2$$. By induction this shows that
$$
0 < \cdots < f_3(x) < f_2(x) < f_1(x) < x
$$
Thus $$f_n(x)$$ is a bounded, monotonically decreasing sequence, so it must converge. Since every
term is less than $$x < 1/2$$, the only fixed point it can converge to is zero.
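A quick numerical illustration of this decrease (a minimal Python sketch; the starting value
$$0.4$$ is an arbitrary choice):

```python
def f(x):
    """One round of majority voting applied to error rate x."""
    return 3 * x**2 - 2 * x**3

# Starting from an error rate below 1/2, the iterates decrease toward zero.
x = 0.4
for _ in range(5):
    x = f(x)
    print(round(x, 3))  # prints 0.352, 0.284, 0.197, 0.101, 0.028
```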
All that remains is to show that all points in $$(1/2, 1)$$ converge to one. Note that
$$
\begin{align*}
1 - f_1(1 - x)
&= 1 - 3(1 - x)^2 + 2(1 - x)^3 \\
&= 1 - (1 - x)^2 (3 - 2(1 - x)) \\
&= 1 - (1 - 2x + x^2)(1 + 2x) \\
&= 1 - (1 + 2x - 2x - 4x^2 + x^2 + 2x^3) \\
&= 3x^2 - 2x^3 \\
&= f_1(x)
\end{align*}
$$
This symmetry establishes the claim.
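Explicitly, the symmetry propagates through the iteration by induction:
$$
f_{n+1}(1 - x) = f_1(f_n(1 - x)) = f_1(1 - f_n(x)) = 1 - f_1(f_n(x)) = 1 - f_{n+1}(x)
$$
So for $$x \in (1/2, 1)$$ we have $$1 - x \in (0, 1/2)$$, and therefore
$$f_n(x) = 1 - f_n(1 - x) \to 1 - 0 = 1$$.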
### Behavior of leading term
By induction it's clear that $$f_n$$ is a polynomial for all $$n$$. So another way to look at
$$f_n$$ is to consider its lowest-degree term. In particular consider $$g_1(x) = 3x^2$$, with
$$g_{n+1}(x) = g_1(g_n(x))$$ defined recursively just like $$f_n$$.
For any $$x \in (0, 1]$$, it's clear that $$f_1(x) < g_1(x)$$, since subtracting $$2x^3$$ only makes
the value smaller. Also $$g_1$$ is increasing on $$[0, \infty)$$, and $$f_1$$ maps $$(0, 1]$$ into
$$(0, 1]$$ (so every $$f_n$$ does as well). So if $$0 < f_n(x) < g_n(x)$$, then
$$
f_{n+1}(x) = f_1(f_n(x)) < g_1(f_n(x)) \leq g_1(g_n(x)) = g_{n+1}(x)
$$
Thus by induction $$f_n(x) < g_n(x)$$ for any $$x$$ in $$(0, 1]$$ and for all $$n$$. So
$$g_n(\epsilon)$$ is an upper bound for the probability of an error after $$n$$ rounds of majority
voting with a base error rate of $$\epsilon$$.
The interesting thing about $$g_n$$ is that it has a closed form that's easy to find, since each
step just squares the previous value and multiplies by three.
$$
g_n(x) = 3^{2^n - 1} x^{2^n}
$$
So while each round of voting increases the number of bits by a factor of three, the *exponent* on
top of the base error rate grows by a factor of two. This is an astonishing win for majority voting.
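As a concrete check on how tight this bound is, here is a short Python sketch (the base error rate
and number of rounds are arbitrary illustrative choices):

```python
def f(x):
    """One round of majority voting applied to error rate x."""
    return 3 * x**2 - 2 * x**3

eps0, n = 0.01, 5

# Exact error rate: iterate the voting recurrence n times.
exact = eps0
for _ in range(n):
    exact = f(exact)

# Closed-form upper bound g_n(x) = 3^(2^n - 1) * x^(2^n).
bound = 3 ** (2**n - 1) * eps0 ** (2**n)

print(exact)  # ~5.5e-50
print(bound)  # ~6.2e-50
```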
### Examples
Here is a table showing error rates after various numbers of voting rounds for a variety of base
rates.
|voting rounds| 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
|bits|1 |3 |9 |27 |81 |243|
|---|---|---|---|---|---|---|
|$$p_\text{error}$$|0.25|$$\num{1.6e-1}$$|$$\num{6.6e-2}$$|$$\num{1.2e-2}$$|$$\num{4.5e-4}$$|$$\num{6.2e-7}$$|
|---|---|---|---|---|---|---|
|$$p_\text{error}$$|0.1|$$\num{2.8e-2}$$|$$\num{2.3e-3}$$|$$\num{1.6e-5}$$|$$\num{7.6e-10}$$|$$\num{1.8e-18}$$|
|---|---|---|---|---|---|---|
|$$p_\text{error}$$|0.01|$$\num{3.0e-4}$$|$$\num{2.7e-7}$$|$$\num{2.1e-13}$$|$$\num{1.4e-25}$$|$$\num{5.5e-50}$$|
|---|---|---|---|---|---|---|
|$$p_\text{error}$$|0.001|$$\num{3.0e-6}$$|$$\num{2.7e-11}$$|$$\num{2.2e-21}$$|$$\num{1.4e-41}$$|$$\num{6.1e-82}$$|
|---|---|---|---|---|---|---|
## (4.4)
@@ -213,6 +341,11 @@ Calculate the differential entropy of a Gaussian process.
{:.question}
A standard telephone line is specified to have a bandwidth of 3300 Hz and an SNR of 20 dB.
### (a)
{:.question}
@@ -32,3 +32,16 @@ a:visited {
.question {
font-style: italic;
}
table {
margin: 1.6em auto;
}
table, th, td {
border: 1px solid black;
border-collapse: collapse;
}
th, td {
padding: 5px 10px;
}