Commit ff77fc32 authored by Erik Strand

Simplify my proof of the continuity of entropy

parent c8e67614
@@ -38,50 +38,12 @@
function on its domain. Then $$x \log(x)$$ is also continuous, since finite products of continuous
functions are continuous. This suffices for $$x > 0$$. At zero, $$x \log x$$ is continuous because
we have defined it to be equal to the limit we found above.
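As a quick numerical sanity check (not part of the proof), here is a short Python snippet showing $$x \log x$$ shrinking toward $$0$$ as $$x$$ approaches zero from the right; the particular sample values are arbitrary and just for illustration.

```python
import math

# Illustration only: x * log(x) tends to 0 as x -> 0+, which is why defining
# the term to be 0 at x = 0 makes it continuous there.
for x in [1e-1, 1e-3, 1e-6, 1e-9, 1e-12]:
    print(f"x = {x:.0e}, x*log(x) = {x * math.log(x):.3e}")
```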
Thus each term of the entropy function is a continuous function from $$\mathbb{R}_{\geq 0}$$ to
$$\mathbb{R}$$. This suffices to show that negative entropy is continuous, based on the lemma below.
Thus entropy is continuous, since negation is a continuous function, and finite compositions of
continuous functions are continuous.
The necessary lemma is easy to prove, if symbol-heavy. I will use the
[$$L^1$$ norm](https://en.wikipedia.org/wiki/Norm_(mathematics)#p-norm), but this is without loss of
generality because all norms on a finite-dimensional vector space induce the same topology. Let
$$f : \mathbb{R}^n \to \mathbb{R}$$ and $$g : \mathbb{R} \to \mathbb{R}$$ be continuous functions,
and define $$h : \mathbb{R}^{n + 1} \to \mathbb{R}$$ as
$$h(x_1, \ldots, x_{n + 1}) = f(x_1, \ldots, x_n) + g(x_{n + 1})$$
Fix any $$x = (x_1, \ldots, x_{n + 1}) \in \mathbb{R}^{n + 1}$$, and any $$\epsilon > 0$$. Since
$$f$$ is continuous, there exists some positive $$\delta_f$$ such that for any $$y \in
\mathbb{R}^n$$, $$\lVert (x_1, \ldots, x_n) - (y_1, \ldots, y_n) \rVert < \delta_f$$ implies
$$\lVert f(x_1, \ldots, x_n) - f(y_1, \ldots, y_n) \rVert < \epsilon / 2$$. For the same reason
there is a similar $$\delta_g$$ for $$g$$. Let $$\delta$$ be the smaller of $$\delta_f$$ and
$$\delta_g$$. Now fix any $$y \in \mathbb{R}^{n + 1}$$ such that $$\lVert x - y \rVert < \delta$$.
Note that
$$
\begin{align*}
\lVert (x_1, \ldots, x_n) - (y_1, \ldots, y_n) \rVert &= \sum_{i = 1}^n \lVert x_i - y_i \rVert \\
&\leq \sum_{i = 1}^{n + 1} \lVert x_i - y_i \rVert \\
&= \lVert x - y \rVert \\
&< \delta \leq \delta_f
\end{align*}
$$
and similarly for the projections of $$x$$ and $$y$$ along the $$(n + 1)$$st dimension. Thus
$$
\begin{align*}
\lVert h(x) - h(y) \rVert
&= \lVert f(x_1, \ldots, x_n) + g(x_{n + 1}) - f(y_1, \ldots, y_n) - g(y_{n + 1}) \rVert \\
&\leq \lVert f(x_1, \ldots, x_n) - f(y_1, \ldots, y_n) \rVert +
\lVert g(x_{n + 1}) - g(y_{n + 1}) \rVert \\
&< \frac{\epsilon}{2} + \frac{\epsilon}{2} \\
&= \epsilon
\end{align*}
$$
It follows that $$h$$ is continuous.
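As an aside, here is a minimal numerical sketch of the lemma's conclusion for a concrete choice of $$f$$ and $$g$$. The specific functions, the point $$x$$, and the value of $$\delta$$ below are my own assumptions, picked by hand rather than derived from $$\epsilon$$.

```python
import math
import random

# Illustration only: for a concrete continuous f and g, perturbing x by less
# than delta (in the L1 norm) keeps h(x) = f(x_1, x_2) + g(x_3) within epsilon.
def f(v):
    return v[0] ** 2 + math.sin(v[1])   # continuous on R^2

def g(t):
    return math.exp(-t)                  # continuous on R

def h(v):
    return f(v[:2]) + g(v[2])

x = [0.5, 1.0, -0.3]
epsilon = 1e-3
delta = 1e-4  # small enough for this x and epsilon (chosen by trial, not derived)

for _ in range(1000):
    y = [xi + random.uniform(-delta, delta) / 3 for xi in x]  # ||x - y||_1 < delta
    assert abs(h(x) - h(y)) < epsilon
print("no violations found")
```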
Thus each term of the entropy function is a continuous function from $$\mathbb{R}_{\geq 0}$$ to
$$\mathbb{R}$$. But we can also view each term as a function from $$\mathbb{R}_{\geq 0}^n$$ to
$$\mathbb{R}$$. Each one ignores all but one of its inputs, but this doesn't change its continuity.
(The epsilon-delta proof is simple, since the only part of the distance between inputs that matters
is the distance along the active coordinate.) So entropy is a finite sum of continuous functions,
and is thus continuous.
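As a final illustration (again, not part of the argument), a short Python check that small perturbations of a distribution produce small changes in entropy, using the convention $$0 \log 0 = 0$$. The example distribution and perturbation sizes are arbitrary.

```python
import math

# Illustration only: entropy with the convention 0 * log(0) = 0.
def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

p = [0.5, 0.25, 0.25, 0.0]
for eps in [1e-2, 1e-4, 1e-6]:
    # Move a little probability mass from the first outcome to the last.
    q = [p[0] - eps, p[1], p[2], p[3] + eps]
    print(f"eps = {eps:.0e}, |H(p) - H(q)| = {abs(entropy(p) - entropy(q)):.3e}")
```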
### Non-negativity