### Independence
If $$p$$ and $$q$$ are independent, their joint probability distribution is the product of the
individual distributions. Thus
$$
\begin{align*}
H(p, q) &= -\sum_{i = 1}^n \sum_{j = 1}^m p_i q_j \log(p_i q_j) \\
&= -\sum_{i = 1}^n \sum_{j = 1}^m p_i q_j \left( \log p_i + \log q_j \right) \\
&= -\sum_{j = 1}^m q_j \sum_{i = 1}^n p_i \log p_i - \sum_{i = 1}^n p_i \sum_{j = 1}^m q_j \log q_j \\
&= H(p) + H(q)
\end{align*}
$$
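As a quick numerical sanity check (a Python sketch using NumPy; the distributions below are arbitrary examples chosen only for illustration, not taken from the text), the joint entropy of two independent distributions should equal the sum of the individual entropies:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]  # by convention 0 log 0 = 0
    return -np.sum(p * np.log2(p))

# arbitrary example distributions (assumed for illustration)
p = np.array([0.5, 0.25, 0.25])
q = np.array([0.7, 0.2, 0.1])

joint = np.outer(p, q)  # independence: p(i, j) = p_i * q_j

print(entropy(joint))           # H(p, q)
print(entropy(p) + entropy(q))  # H(p) + H(q); the two should agree
```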
{:.question}
Prove the relationships in Equation (4.10).
I take $$I(x, y) = H(x) + H(y) - H(x, y)$$ as the definition of mutual information. By the
definition of conditional entropy,
$$
\begin{align*}
H(y | x) &= H(x, y) - H(x) \\
H(x | y) &= H(x, y) - H(y)
\end{align*}
$$
Thus
$$
\begin{align*}
I(x, y) &= H(y) - H(y | x) \\
&= H(x) - H(x | y)
\end{align*}
$$
Finally, using the definition of marginal distributions we can show that
$$
\begin{align*}
I(x, y)
&= -\sum_x p(x) \log p(x) - \sum_y p(y) \log p(y) + \sum_{x, y} p(x, y) \log p(x, y) \\
&= -\sum_{x, y} p(x, y) \log p(x) - \sum_{x, y} p(x, y) \log p(y) +
\sum_{x, y} p(x, y) \log p(x, y) \\
&= \sum_{x, y} p(x, y) \left( \log p(x, y) - \log p(x) - \log p(y) \right) \\
&= \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x) p(y)}
\end{align*}
$$
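The same kind of numerical spot check (again a Python sketch; the joint distribution below is an arbitrary example, not from the text) confirms that these expressions for $$I(x, y)$$ all agree:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits); zero-probability entries contribute nothing."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# arbitrary joint distribution p(x, y): rows index x, columns index y
pxy = np.array([[0.10, 0.30],
                [0.25, 0.35]])
px = pxy.sum(axis=1)  # marginal p(x)
py = pxy.sum(axis=0)  # marginal p(y)

H_x, H_y, H_xy = entropy(px), entropy(py), entropy(pxy)

I_def    = H_x + H_y - H_xy                              # definition used above
I_cond_x = H_y - (H_xy - H_x)                            # H(y) - H(y | x)
I_cond_y = H_x - (H_xy - H_y)                            # H(x) - H(x | y)
I_kl     = np.sum(pxy * np.log2(pxy / np.outer(px, py)))  # sum over x, y of p(x,y) log p(x,y)/(p(x)p(y))

print(I_def, I_cond_x, I_cond_y, I_kl)  # all four should be equal
```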
{:.question}
Consider a binary channel that has a small probability $$\epsilon$$ of making a bit error.
### (a)
{:.question}