Longest path search

A non-counting lower bound for the expected distance of a simple random walk

It's been a while since I've updated the blog (likely because I've been struggling to get it to work with Github pages...). Anyways, it'll be migrated over at some point, but for now this will have to do.

This post will focus on a particular, nearly silly, proof of a lower bound for the expected distance of an unbiased random walk, defined as

$$ X = \sum_{i=1}^n X_i, $$

where $X_i \sim \{\pm 1\}$, uniformly. The quantity we want to find a lower bound for is

$$ \mathbf{E}[|X|], $$

as $n$ is large. We know from a basic, if somewhat annoying, counting argument that

$$ \mathbf{E}[|X|] \sim \sqrt{2/\pi}\,\sqrt{n}, $$

when $n \gg 1$. In general, we're interested in bounds of the form

$$ \mathbf{E}[|X|] \ge \Omega(\sqrt{n}). $$

Bounds like these are applicable in a number of important lower bounds for online convex optimization (see, e.g., Hazan's lovely overview, section 3.2), though we won't be talking too much about the applications in this post.
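As a quick sanity check (not part of the proof), here's a short Monte Carlo sketch that estimates $\mathbf{E}[|X|]$ and compares it against the $\sqrt{2/\pi}\,\sqrt{n}$ asymptotic, using the fact that the walk's endpoint is $2\,\mathrm{Binomial}(n, 1/2) - n$; the function name, trial count, and seed are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_abs_walk(n, trials=100_000):
    # X = sum of n uniform ±1 steps = 2 * (#heads) - n, where #heads is
    # binomial(n, 1/2); average |X| over all trials.
    x = 2 * rng.binomial(n, 0.5, size=trials) - n
    return np.abs(x).mean()

for n in [100, 1_000, 10_000]:
    print(n, mean_abs_walk(n), np.sqrt(2 / np.pi) * np.sqrt(n))
```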

Additionally, since $\mathbf{E}[X^2] = n$ (which follows by expanding and using the fact that the $X_i$ are independent with mean zero), then

$$ \mathbf{E}[|X|] \le \sqrt{\mathbf{E}[X^2]} = \sqrt{n}, $$

so we know that this bound is tight up to a constant. The first inequality here follows from an application of Jensen's inequality to the square root function (which is concave).
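For small $n$, the fact that $\mathbf{E}[X^2] = n$ is also easy to verify exactly by enumerating all $2^n$ equally likely walks; a tiny brute-force sketch (the range of $n$ is an arbitrary choice):

```python
from itertools import product

# Enumerate all 2^n equally likely sign patterns and check E[X^2] = n.
for n in range(1, 11):
    e_x2 = sum(sum(w)**2 for w in product([-1, 1], repeat=n)) / 2**n
    assert e_x2 == n
print("E[X^2] = n verified for n = 1..10")
```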

Why a non-counting proof?

Mostly because I'm bad at counting and always end up with a hilarious number of errors. Plus, this proof is easily generalizable to a number of other similar results!

Proof idea

One simple method for lower-bounding the expectation of a variable like $|X|$ is to note that $|X|$ is nonnegative, so we have the following 'silly' bound

$$ \mathbf{E}[|X|] \ge \mathbf{E}[a\,1_{|X| \ge a}] = a\Pr(|X| \ge a), $$

for any $a \ge 0$, where $1_{|X| \ge a}$ is the indicator function for the event $|X| \ge a$; that is, it is 1 if $|X| \ge a$ and zero otherwise. (The bound follows from the fact that $|X| \ge a\,1_{|X| \ge a}$ pointwise.) Maximizing over $a$, assuming we have a somewhat tight lower bound on the probability that $|X| \ge a$, this approach might give us a reasonable lower bound.

In a very general sense, we want to show that $|X|$ is 'anticoncentrated'; i.e., it is reasonably 'spread out', which would indicate that its expectation cannot be too small, since it is nonnegative.
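To get a concrete feel for the 'silly' bound, take $n = 2$, so $X \in \{-2, 0, 2\}$ with probabilities $1/4$, $1/2$, $1/4$, and $\mathbf{E}[|X|] = 1$. A small exact check over all four walks:

```python
from itertools import product

# All four equally likely two-step walks; X is the endpoint of each.
walks = [sum(w) for w in product([-1, 1], repeat=2)]
e_abs = sum(abs(x) for x in walks) / len(walks)        # E|X| = 1
for a in [1, 2]:
    p = sum(abs(x) >= a for x in walks) / len(walks)   # Pr(|X| >= a)
    print(f"a={a}: a*Pr(|X|>=a) = {a * p} <= E|X| = {e_abs}")
```

Note that, at $a = 2$, the bound happens to be tight in this case.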

Attempt #1

The first idea (or, at least, my first idea) would be to note that, since $\mathbf{E}[X^2]$ is on the order of $n$, then maybe we can use this fact to construct a bound for $\mathbf{E}[|X|]$ which 'should be' on the order of $\sqrt{n}$, assuming some niceness conditions; for example, that $|X| \le n$ is bounded.

Unfortunately, just these two simple facts are not enough to prove the claim! We can construct a nonnegative random variable $Y \ge 0$ such that its second moment is $\mathbf{E}[Y^2] = n$ and it is bounded by $Y \le n$, yet $\mathbf{E}[Y] = 1$. In other words, we wish to construct a variable that is very concentrated around 0, with 'sharp' peaks at larger values.

Of course, the simplest example would be to take $Y = n$ with probability $1/n$ and $Y = 0$ with probability $1 - 1/n$. Clearly, this variable is bounded, and has $n$ as its second moment. On the other hand,

$$ \mathbf{E}[Y] = (1/n)n + (1 - 1/n)0 = 1, $$

which means that the best bound we can hope for, using just these conditions (nonnegativity, boundedness, and a second moment bound) on a variable, is a constant. (Indeed, applying a basic argument, we find that this is the smallest expectation possible.)
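Stated numerically (the value of $n$ here is an arbitrary choice):

```python
# Y = n with probability 1/n, else 0: bounded by n, second moment n, mean 1.
n = 10_000
e_y = (1 / n) * n + (1 - 1 / n) * 0
e_y2 = (1 / n) * n**2 + (1 - 1 / n) * 0
assert (e_y, e_y2) == (1, n)
```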

This suggests that we need a little more control over the tails of $|X|$, which gets us to...

Attempt #2 (and solution)

Another easy quantity to compute in this case is $\mathbf{E}[X^4]$. (And, really, any even power of $X$ is easy. On the other hand, since $X$ has a distribution that is symmetric around 0, all odd moments are 0.) Splitting the sum out into each of the possible quartic terms, we find that any term containing an odd power of $X_i$ will be zero in expectation, as the $X_i$ are independent. So, we find

$$ \mathbf{E}[X^4] = \sum_i \mathbf{E}[X_i^4] + 3\sum_{i \ne j} \mathbf{E}[X_i^2 X_j^2] = n + 3n(n-1) \le 3n^2. $$

(The factor of 3 comes from the three ways of grouping the four indices into two pairs.)

This quantity will come in handy soon.
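As with the second moment, this identity is easy to check exactly by brute force for small $n$; a sketch:

```python
from itertools import product

# Check E[X^4] = n + 3n(n-1) by enumerating all 2^n sign patterns.
for n in range(1, 11):
    e_x4 = sum(sum(w)**4 for w in product([-1, 1], repeat=n)) / 2**n
    assert e_x4 == n + 3 * n * (n - 1)
print("E[X^4] = n + 3n(n-1) verified for n = 1..10")
```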

We can, on the other hand, split up the expectation of $X^2$ in a variety of ways. One is particularly handy to get a tail lower bound like the one we wanted in our proof idea (above):

$$ \mathbf{E}[X^2] = \mathbf{E}[X^2 1_{|X| < a}] + \mathbf{E}[X^2 1_{|X| \ge a}] \le a^2 + \mathbf{E}[X^2 1_{|X| \ge a}]. $$

The latter term can be upper bounded using Cauchy--Schwarz,1

$$ \mathbf{E}[X^2 1_{|X| \ge a}] \le \sqrt{\mathbf{E}[X^4]\,\mathbf{E}[1_{|X| \ge a}]}. $$

(Since $1_{|X| \ge a}^2 = 1_{|X| \ge a}$.) And, since $\mathbf{E}[1_{|X| \ge a}] = \Pr(|X| \ge a)$, we finally have:

$$ \mathbf{E}[X^2] \le a^2 + \sqrt{\mathbf{E}[X^4]\Pr(|X| \ge a)}. $$

Rearranging gives us the desired lower bound,

$$ \Pr(|X| \ge a) \ge \frac{(\mathbf{E}[X^2] - a^2)^2}{\mathbf{E}[X^4]}, $$

valid whenever $a^2 \le \mathbf{E}[X^2]$. (This is a Paley--Zygmund-style bound, except over $X^2$ rather than nonnegative $X$.)
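Here's a small exact check of this tail bound over all integer $a$ with $a^2 \le n$, at $n = 10$ (an arbitrary choice, small enough to enumerate):

```python
from itertools import product

n = 10
walks = [sum(w) for w in product([-1, 1], repeat=n)]
e_x2 = sum(x**2 for x in walks) / 2**n            # = n
e_x4 = sum(x**4 for x in walks) / 2**n            # = n + 3n(n-1)
for a in range(4):                                # a^2 <= 10 for a = 0..3
    p = sum(abs(x) >= a for x in walks) / 2**n
    assert p >= (e_x2 - a**2) ** 2 / e_x4
print("tail bound verified for n = 10, a = 0..3")
```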

Now, since we know that

$$ \mathbf{E}[|X|] \ge a\Pr(|X| \ge a), $$

then we have

$$ \mathbf{E}[|X|] \ge a\,\frac{(\mathbf{E}[X^2] - a^2)^2}{\mathbf{E}[X^4]}. $$

Parametrizing $a$ by $a = \alpha\sqrt{\mathbf{E}[X^2]}$ for some $0 \le \alpha \le 1$, we then have

$$ \mathbf{E}[|X|] \ge \alpha(1 - \alpha^2)^2\,\frac{\mathbf{E}[X^2]^{5/2}}{\mathbf{E}[X^4]}. $$

The right-hand side is maximized at $\alpha = 1/\sqrt{5}$, which gives the following lower bound

$$ \mathbf{E}[|X|] \ge \frac{16}{25\sqrt{5}}\,\frac{\mathbf{E}[X^2]^{5/2}}{\mathbf{E}[X^4]}. $$
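If you don't feel like doing the calculus, here's a quick symbolic check of the maximization (a sketch using sympy):

```python
import sympy as sp

alpha = sp.symbols("alpha", positive=True)
f = alpha * (1 - alpha**2) ** 2
# The derivative vanishes at alpha = 1 and alpha = 1/sqrt(5); only the
# latter lies in the interior of [0, 1].
crit = [r for r in sp.solve(sp.diff(f, alpha), alpha) if r < 1]
print(crit, sp.simplify(f.subs(alpha, crit[0])))  # [sqrt(5)/5] 16*sqrt(5)/125
```

(Note that $16\sqrt{5}/125 = 16/(25\sqrt{5})$.)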

And, finally, using the fact that $\mathbf{E}[X^2] = n$ and $\mathbf{E}[X^4] \le 3n^2$, we get the final result:

$$ \mathbf{E}[|X|] \ge \frac{16}{75\sqrt{5}}\sqrt{n} = \Omega(\sqrt{n}), $$

as required, with no need for combinatorics! Of course, the factor of $16/(75\sqrt{5}) \approx .095$ is rather weak compared to the factor of $\sqrt{2/\pi} \approx .80$, but this is ok for our purposes.

General extensions

Of course, similar constructions also hold rather nicely for things like uniform $[-1, 1]$ variables, or normally distributed, mean-zero variables. Any variable for which the second and fourth moments can be easily computed allows us to compute a lower bound on this expectation. (Expectations of the absolute value of sums of independently drawn versions of these variables can be similarly bounded.) These have no obvious combinatorial analogue, so those techniques cannot be easily generalized, whereas this bound applies immediately.
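For instance, here's the general bound packaged as a function of the two moments (the function name is my own), applied to a sum of $n$ i.i.d. standard normals, where $\mathbf{E}[X^2] = n$ and $\mathbf{E}[X^4] = 3n^2$ exactly, and the true value is $\mathbf{E}[|X|] = \sqrt{2n/\pi}$:

```python
import numpy as np

def moment_lower_bound(e_x2, e_x4):
    # E|X| >= 16/(25*sqrt(5)) * E[X^2]^(5/2) / E[X^4], from the derivation above.
    return 16 / (25 * np.sqrt(5)) * e_x2**2.5 / e_x4

n = 1_000
print(moment_lower_bound(n, 3 * n**2))  # ~ 0.095 * sqrt(n)
print(np.sqrt(2 * n / np.pi))           # true value, ~ 0.798 * sqrt(n)
```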

Thanks

A big thank you to George Lowther for pointing out that I was missing a factor of 3 in the fourth moment calculation in an earlier version of this post!

1. Possibly the most elegant proof of Cauchy--Schwarz I know is based on minimizing a quadratic, and goes a little like this. Note that $\mathbf{E}[(X - tY)^2] \ge 0$ for any $t \in \mathbf{R}$. (That this expectation exists can be shown for any $t$, assuming both $X$ and $Y$ have finite second moments. If not, the inequality is trivial.) Expanding gives $\mathbf{E}[X^2] - 2t\mathbf{E}[XY] + t^2\mathbf{E}[Y^2] \ge 0$. The left-hand side is minimized at $t = \mathbf{E}[XY]/\mathbf{E}[Y^2]$, and plugging this in gives $\mathbf{E}[X^2] - \mathbf{E}[XY]^2/\mathbf{E}[Y^2] \ge 0$. Multiplying both sides by $\mathbf{E}[Y^2]$ gives the final result.
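The footnote's argument, checked symbolically (a sketch using sympy, with ex2, exy, ey2 standing in for $\mathbf{E}[X^2]$, $\mathbf{E}[XY]$, $\mathbf{E}[Y^2]$):

```python
import sympy as sp

t, exy = sp.symbols("t exy", real=True)
ex2, ey2 = sp.symbols("ex2 ey2", positive=True)
q = ex2 - 2 * t * exy + t**2 * ey2      # E[(X - tY)^2], expanded
t_star = sp.solve(sp.diff(q, t), t)[0]  # minimizer: t = exy/ey2
print(sp.simplify(q.subs(t, t_star)))   # ex2 - exy**2/ey2
```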