Commit ff77fc32, authored Feb 25, 2019 by Erik Strand

Simplify my proof of the continuity of entropy

parent c8e67614

1 changed file: _psets/3.md (+6, −44)
@@ -38,50 +38,12 @@

function on its domain. Then $$x \log(x)$$ is also continuous, since finite products of continuous
functions are continuous. This suffices for $$x > 0$$. At zero, $$x \log x$$ is continuous because
we have defined it to be equal to the limit we found above.
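As a quick numeric sanity check (my own illustration, not part of the pset), the extension of $$x \log x$$ by its limiting value $$0$$ at $$x = 0$$ really does behave continuously near zero:

```python
import math

def xlogx(x):
    """x * log(x), extended continuously to x = 0 by its limit, which is 0."""
    return 0.0 if x == 0 else x * math.log(x)

# As x -> 0+, the values shrink toward the defined value xlogx(0) = 0.
for x in [1e-2, 1e-4, 1e-8, 1e-16]:
    print(x, xlogx(x))
```

The magnitude of $$x \log x$$ decays like $$x \log(1/x)$$, so it vanishes faster than any fixed multiple of $$\log x$$ blows up.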
Thus each term of the entropy function is a continuous function from $$\mathbb{R}$$ to
$$\mathbb{R}$$. This suffices to show that negative entropy is continuous, based on the lemma below.
Thus entropy is continuous, since negation is a continuous function, and finite compositions of
continuous functions are continuous.
The necessary lemma is easy to prove, if symbol heavy. I will use the
[$$L^1$$ norm](https://en.wikipedia.org/wiki/Norm_(mathematics)#p-norm), but this is without loss of
generality because all norms on a finite dimensional vector space induce the same topology. Let
$$f : \mathbb{R}^n \to \mathbb{R}$$ and $$g : \mathbb{R} \to \mathbb{R}$$ be continuous functions,
and define $$h : \mathbb{R}^{n + 1} \to \mathbb{R}$$ as

$$h(x_1, \ldots, x_{n + 1}) = f(x_1, \ldots, x_n) + g(x_{n + 1})$$
Fix any $$x = (x_1, \ldots, x_{n + 1}) \in \mathbb{R}^{n + 1}$$, and any $$\epsilon > 0$$. Since
$$f$$ is continuous, there exists some positive $$\delta_f$$ such that for any $$y \in
\mathbb{R}^n$$, $$\lVert (x_1, \ldots, x_n) - (y_1, \ldots, y_n) \rVert < \delta_f$$ implies
$$\lVert f(x_1, \ldots, x_n) - f(y_1, \ldots, y_n) \rVert < \epsilon / 2$$. For the same reason
there is a similar $$\delta_g$$ for $$g$$. Let $$\delta$$ be the smaller of $$\delta_f$$ and
$$\delta_g$$. Now fix any $$y \in \mathbb{R}^{n + 1}$$ such that $$\lVert x - y \rVert < \delta$$.
Note that

$$
\begin{align*}
\lVert (x_1, \ldots, x_n) - (y_1, \ldots, y_n) \rVert &= \sum_{i = 1}^n \lVert x_i - y_i \rVert \\
&\leq \sum_{i = 1}^{n + 1} \lVert x_i - y_i \rVert \\
&< \delta_f
\end{align*}
$$

and similarly for the projections of $$x$$ and $$y$$ along the $$n + 1$$st dimension. Thus
$$
\begin{align*}
\lVert h(x) - h(y) \rVert
&= \lVert f(x_1, \ldots, x_n) + g(x_{n + 1}) - f(y_1, \ldots, y_n) - g(y_{n + 1}) \rVert \\
&\leq \lVert f(x_1, \ldots, x_n) - f(y_1, \ldots, y_n) \rVert + \lVert g(x_{n + 1}) - g(y_{n + 1}) \rVert \\
&< \frac{\epsilon}{2} + \frac{\epsilon}{2} \\
&= \epsilon
\end{align*}
$$

It follows that $$h$$ is continuous.
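The $$\delta = \min(\delta_f, \delta_g)$$ construction can be illustrated numerically with concrete choices of $$f$$ and $$g$$ (these particular functions and tolerances are my own examples, not from the pset):

```python
import math

# Hypothetical concrete instances of the lemma's f and g.
f = lambda *xs: sum(x**2 for x in xs)   # continuous on R^n
g = math.sin                            # continuous on R

def h(point):
    """h(x_1, ..., x_{n+1}) = f(x_1, ..., x_n) + g(x_{n+1})."""
    *head, last = point
    return f(*head) + g(last)

def l1(u, v):
    """L^1 distance, as in the proof."""
    return sum(abs(a - b) for a, b in zip(u, v))

x = (0.3, -0.7, 1.1)
eps = 1e-3

# Deltas small enough that f and g each move by less than eps/2
# (valid here because both functions have bounded slope near x).
delta_f, delta_g = 2e-4, 5e-4
delta = min(delta_f, delta_g)

# Any y with ||x - y||_1 < delta keeps h(y) within eps of h(x).
y = tuple(c + delta / 10 for c in x)
assert l1(x, y) < delta
assert abs(h(x) - h(y)) < eps
```

Taking the minimum of the two deltas is what lets both continuity bounds fire at once, so the triangle inequality in the proof yields $$\epsilon/2 + \epsilon/2$$.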
Thus each term of the entropy function is a continuous function from $$\mathbb{R}_{\geq 0}$$ to
$$\mathbb{R}$$. But we can also view each term as a function from $$\mathbb{R}_{\geq 0}^n$$ to
$$\mathbb{R}$$. Each one ignores most of its inputs, but this doesn't change its continuity. (The
epsilon-delta proof is simple, since the only part of the distance between inputs that matters is
that along the active coordinate.) So entropy is a sum of continuous functions, and is thus
continuous.
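The conclusion can be sanity-checked numerically: small perturbations of a distribution produce small changes in its entropy. (This snippet is my own illustration; the distribution and tolerance are arbitrary.)

```python
import math

def entropy(p):
    """Shannon entropy -sum p_i log p_i, using the convention 0 log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

p = (0.5, 0.25, 0.25)
q = (0.5 + 1e-9, 0.25 - 1e-9, 0.25)  # a tiny perturbation of p

# Continuity: nearby distributions have nearby entropies.
diff = abs(entropy(p) - entropy(q))
```

Note that $$H(0.5, 0.25, 0.25) = \frac{3}{2} \log 2$$, which the function reproduces.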
### Non-negativity