When did Majority become the stablest?

The fireworks happening over at Ryan O’Donnell’s blog reminded me of something that always puzzles me: Is Majority the Stablest? Or the Unstablest?

Consider the class of all monotone, balanced Boolean functions f:\{-1,1\}^n \rightarrow \{-1,1\}. We define the noise sensitivity NS(f) of a function f as NS(f) = P_{x,e}[f(x) \neq f(x \cdot e)], where x \in \{-1,1\}^n is chosen uniformly at random, the noise e \in \{-1,1\}^n is sampled from a noise distribution N of our choice, and x \cdot e denotes the coordinate-wise product. Think of e as telling us which bits of x to flip. Two natural choices are:

  1. The \epsilon-noise distribution N(\epsilon) where each bit is flipped independently with probability \epsilon.
  2. The “bit-flip” distribution N' where we flip a single uniformly random bit. This behaves a lot like N(\epsilon) with \epsilon = 1/n; the short simulation sketch after this list makes both models concrete.
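To make the two noise models concrete, here is a minimal Python sketch, my own and not from the original discussion, that estimates NS(f) by straightforward Monte Carlo sampling; the function names and the use of Majority as the test function are illustrative choices.

    import random

    def majority(x):
        # Majority of +/-1 bits; assumes len(x) is odd so there are no ties.
        return 1 if sum(x) > 0 else -1

    def ns_eps(f, n, eps, trials=20_000):
        # Estimate NS(f) under N(eps): each bit is flipped independently with probability eps.
        changed = 0
        for _ in range(trials):
            x = [random.choice((-1, 1)) for _ in range(n)]
            y = [-xi if random.random() < eps else xi for xi in x]
            changed += (f(x) != f(y))
        return changed / trials

    def ns_bitflip(f, n, trials=20_000):
        # Estimate NS(f) under N': flip a single uniformly random bit.
        changed = 0
        for _ in range(trials):
            x = [random.choice((-1, 1)) for _ in range(n)]
            i = random.randrange(n)
            y = x[:]
            y[i] = -y[i]
            changed += (f(x) != f(y))
        return changed / trials

    n = 101
    print("NS(Maj) under N(0.1):", ns_eps(majority, n, 0.1))
    print("NS(Maj) under N'    :", ns_bitflip(majority, n))  # of order 1/sqrt(n)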

Say we’d like to find the monotone, balanced Boolean function f which is least sensitive to noise. A bit of thought shows that the answer is any co-ordinate/dictator function f(x) = x_i. So to rule out trivial answers, let us only allow functions where all variables are equally important. More precisely, functions should be invariant under a transitive group of permutations on the variables. This is a bit of an overkill; the right way to do this is to insist that no variable has too much influence.

The Majority function Maj(x_1,\ldots,x_n) = sgn(\sum_{i=1}^n x_i) is pretty insensitive/stable under \epsilon-noise: one can show that NS(Maj) \leq \sqrt{\epsilon}. In contrast, for constant \epsilon, many other functions will have sensitivity approaching 0.5. The Majority is Stablest Theorem, due to Mossel, O’Donnell and Oleszkiewicz, shows that Majority is indeed about as good as it gets: functions which are more stable need to have a distinguished variable (which we disallow).
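As a quick numerical sanity check of the \sqrt{\epsilon} bound, not something from the post itself, the sketch below computes NS(Maj_n) under N(\epsilon) exactly by conditioning on the number of +1 coordinates of x and on how many coordinates of each sign the noise flips, and compares the result with \sqrt{\epsilon}.

    from math import comb, sqrt

    def ns_majority_eps(n, eps):
        # Exact NS of Majority on n (odd) bits under N(eps).
        total = 0.0
        for p in range(n + 1):                    # p = number of +1 coordinates of x
            px = comb(n, p) / 2 ** n              # P[x has exactly p coordinates equal to +1]
            before = 1 if 2 * p > n else -1       # Maj(x)
            for a in range(p + 1):                # a of the +1s get flipped to -1
                for b in range(n - p + 1):        # b of the -1s get flipped to +1
                    after = 1 if 2 * (p - a + b) > n else -1
                    if after != before:
                        pe = (comb(p, a) * comb(n - p, b)
                              * eps ** (a + b) * (1 - eps) ** (n - a - b))
                        total += px * pe
        return total

    for eps in (0.01, 0.05, 0.1):
        print(eps, ns_majority_eps(101, eps), sqrt(eps))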

Now consider the “bit-flip” distribution N'. Using the \epsilon = 1/n rule of thumb, one would expect the noise sensitivity of Majority to be around 1/\sqrt{n}. This is indeed the case, no surprises there. What is surprising is that this is as unstable as it gets for monotone functions! The proof of this is simple: we start with the observation that since f is monotone, E_x[f(x)x_i] equals the probability that flipping x_i changes f for a random x (a brute-force check of this identity appears after the list below). To see why this is so, fix all the bits except x_i. Now flipping x_i from -1 to +1 can result in two possible outcomes:

  1. It leaves f unchanged, in which case the contribution to E_x[f(x)x_i] is 0.
  2. It causes f to change from -1 to +1, in which case the contribution to E_x[f(x)x_i] is 2. By monotonicity, the reverse flip from +1 to -1 is not possible.
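Before putting the pieces together, here is a small brute-force check, mine rather than the post's, of the identity this case analysis establishes: for a monotone f, E_x[f(x)x_i] equals the probability that flipping x_i changes f(x). Majority on 5 bits is used purely as a convenient monotone example.

    from itertools import product

    def f(x):
        # A small monotone, balanced example: Majority on 5 bits.
        return 1 if sum(x) > 0 else -1

    def flip(x, i):
        y = list(x)
        y[i] = -y[i]
        return tuple(y)

    n = 5
    points = list(product((-1, 1), repeat=n))

    for i in range(n):
        corr = sum(f(x) * x[i] for x in points) / len(points)            # E_x[f(x) x_i]
        prob = sum(f(x) != f(flip(x, i)) for x in points) / len(points)  # P[flipping x_i changes f]
        print(i, corr, prob)  # the two columns agree because f is monotone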

Averaging over the 2^{n-1} such pairs shows that E_x[f(x)x_i] is exactly the probability that flipping x_i changes f. Since under N' the flipped coordinate is chosen uniformly, we now have

NS(f) = \frac{1}{n}\sum_{i=1}^nE_x[f(x)x_i] = \frac{1}{n}E_x\left[f(x)\left(\sum_{i=1}^nx_i\right)\right]

Since f(x) \in \{-1,1\}, we have

f(x)\left(\sum_{i=1}^nx_i\right) \leq \left|\sum_{i=1}^nx_i\right|

with equality iff

f(x) = sgn\left(\sum_{i=1}^nx_i\right) = Maj(x_1,\ldots,x_n)

So among balanced, transitive, monotone functions, Majority is maximally unstable under N' and maximally stable under N(\epsilon). And in an intriguing twist, one way to prove the stability under N(\epsilon) is to reduce it to bounding the stability under N', as is done in the proof of Peres’ Theorem, which tells us that the \sqrt{\epsilon} bound holds for all LTFs.
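To see both ends of the argument in one place, here is a last sketch (again my own illustration) that computes the bit-flip noise sensitivity exactly on n = 9 variables for Majority and for recursive 3-wise majority, another monotone, balanced function invariant under a transitive group of permutations: Majority meets the (1/n)E|\sum_i x_i| bound, while the recursive majority falls strictly short of it.

    from itertools import product

    def maj(bits):
        return 1 if sum(bits) > 0 else -1

    def rec_maj9(x):
        # Recursive 3-wise majority on 9 bits: monotone, balanced, transitive, but not Majority.
        return maj((maj(x[0:3]), maj(x[3:6]), maj(x[6:9])))

    def ns_bitflip_exact(f, n):
        # Exact NS under N': average over a uniform x and a uniform single bit flip.
        points = list(product((-1, 1), repeat=n))
        changed = 0
        for x in points:
            fx = f(x)
            for i in range(n):
                y = list(x)
                y[i] = -y[i]
                changed += (fx != f(tuple(y)))
        return changed / (len(points) * n)

    n = 9
    points = list(product((-1, 1), repeat=n))
    bound = sum(abs(sum(x)) for x in points) / (len(points) * n)   # (1/n) E|sum_i x_i|

    print("upper bound (1/n)E|sum|:", bound)
    print("NS(Majority_9) under N':", ns_bitflip_exact(maj, 9))       # meets the bound
    print("NS(RecMaj_9) under N'  :", ns_bitflip_exact(rec_maj9, 9))  # strictly smaller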

Coming soon:

  1. If not Majority, then who?
  2. What lies behind Majority’s rapid descent into instability?
