1.
Let mu and nu be two Bernoulli measures on {0,1}, with biases p and q, respectively. Thus,
mu({1}) = p = 1 - mu({0})
nu({1}) = q = 1 - nu({0})
Let P and Q be the n-fold products of mu and nu, respectively. Thus, P and Q are two probability measures on {0,1}^n.
Show that
||P-Q|| <= n|p-q|
where ||.|| is the total variation norm (and <= is a poor man's way of writing \leq in plain text).
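As a sanity check (not part of the problem), the bound can be verified by brute force for small n. The sketch below assumes the convention ||P-Q|| = (1/2) sum_x |P(x)-Q(x)|; the function name is mine:

```python
from itertools import product

def tv_product_bernoulli(p, q, n):
    """Total variation distance between the n-fold products of
    Bernoulli(p) and Bernoulli(q), by direct enumeration of {0,1}^n."""
    total = 0.0
    for x in product([0, 1], repeat=n):
        k = sum(x)  # number of 1s in the string x
        px = p**k * (1 - p)**(n - k)
        qx = q**k * (1 - q)**(n - k)
        total += abs(px - qx)
    return 0.5 * total

# Spot-check the claimed bound ||P - Q|| <= n|p - q| on a few cases.
for p, q, n in [(0.3, 0.5, 4), (0.6, 0.55, 6), (0.1, 0.2, 5)]:
    assert tv_product_bernoulli(p, q, n) <= n * abs(p - q) + 1e-12
```

(The bound itself follows from the triangle inequality applied to the hybrid measures that switch from mu to nu one coordinate at a time.)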
2.
Now assume that q = 1/2; thus Q is the uniform measure on {0,1}^n.
Show that
||P-Q|| <= C n^(1/2) |p-1/2|
where C>0 is some fixed universal constant. (Hint: Pinsker's inequality might come in handy.)
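One possible route (an assumption on my part, not the official solution): Pinsker's inequality in the form ||P-Q|| <= sqrt(KL(P||Q)/2) (KL in nats), together with the tensorization KL(P||Q) = n KL(Bern(p)||Bern(1/2)) and the chi-square bound KL(Bern(p)||Bern(1/2)) <= chi^2(Bern(p)||Bern(1/2)) = 4(p-1/2)^2, gives the bound with C = sqrt(2). A quick numeric spot-check of that constant (the helper name is mine):

```python
from math import comb, sqrt

def tv_to_uniform(p, n):
    """TV distance between Bernoulli(p)^n and the uniform measure on {0,1}^n,
    grouping strings by their number of 1s (equal-weight strings have equal mass)."""
    return 0.5 * sum(comb(n, k) * abs(p**k * (1 - p)**(n - k) - 0.5**n)
                     for k in range(n + 1))

# Check ||P - Q|| <= sqrt(2n) |p - 1/2| on a grid of (n, p).
for n in (1, 5, 20, 50):
    for p in (0.5, 0.55, 0.7, 0.9, 0.99):
        assert tv_to_uniform(p, n) <= sqrt(2 * n) * abs(p - 0.5) + 1e-12
```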
Are these novel or already known?
3 comments:
Known, I would say. In fact, so long as p and q are close to each other and not too close to 0 or 1, the total variation distance is O(sqrt(n)|p-q|).
See, e.g., Exercise 21 (with solution) here:
http://www.stat.yale.edu/~pollard/Books/Asymptopia/Metrics.pdf
(or some of the calculations here:
http://michaelnielsen.org/polymath1/index.php?title=Passing_between_measures
)
For any two distributions P, Q on an arbitrary domain, the statistical distance between P^n and Q^n grows as O(sqrt(n) H(P,Q)), where H(P,Q) is the Hellinger distance between P and Q.
This can be found in Lemmas 3.1 and 3.2 of the paper:
Rounding Parallel Repetitions of Unique Games
http://www.cs.princeton.edu/~dsteurer/roundpar.pdf
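For what it's worth, that growth rate can also be spot-checked numerically. The sketch below assumes the convention H(P,Q)^2 = 1 - sum_x sqrt(P(x)Q(x)) and uses the explicit constant sqrt(2), which follows from ||P-Q|| <= sqrt(2) H(P,Q) together with H(P^n,Q^n) <= sqrt(n) H(P,Q); the helper names are mine:

```python
from itertools import product
from math import prod, sqrt

def hellinger(P, Q):
    """Hellinger distance H(P, Q) = sqrt(1 - sum_x sqrt(P(x) Q(x)))
    for distributions given as dicts on the same finite domain."""
    return sqrt(max(0.0, 1.0 - sum(sqrt(P[x] * Q[x]) for x in P)))

def tv(P, Q):
    """Total variation distance, (1/2) sum_x |P(x) - Q(x)|."""
    return 0.5 * sum(abs(P[x] - Q[x]) for x in P)

def nfold(P, n):
    """The n-fold product measure P^n, as a dict on n-tuples."""
    return {xs: prod(P[x] for x in xs) for xs in product(P, repeat=n)}

P = {0: 0.7, 1: 0.3}
Q = {0: 0.5, 1: 0.5}
# Check ||P^n - Q^n|| <= sqrt(2n) H(P, Q) for a few values of n.
for n in (1, 2, 4, 6):
    assert tv(nfold(P, n), nfold(Q, n)) <= sqrt(2 * n) * hellinger(P, Q)
```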
Many thanks to Ryan and Prasad for the helpful pointers!