Different areas of mathematics, and theoretical computer science, freely borrow tools and ideas from each other and these interactions, like barters, can make both parties richer. In fact it’s even better than barter: the objects being ideas, you don’t have to actually give them up in an exchange.

And so it is no surprise that differential privacy has used tools from several other fields, including complexity, cryptography, learning theory and high-dimensional geometry. Today I want to talk about a little giving back. A small interest payment, if you will. Below I will describe how differential privacy tools helped us resolve a question of Alon and Kalai.

**Discrepancy**

This particular instance deals with discrepancy theory, more specifically with the Erdős Discrepancy Problem (EDP), which is being discussed at Polymath5 here.

In 1932, Erdős conjectured:

**Conjecture** [Problem 9 here] For any constant $C$, there is an $n$ such that for any $\pm 1$ coloring $\chi$ of $[n]$, there exist $d$ and $k$ with $kd \leq n$ such that

$\left| \sum_{i=1}^{k} \chi(id) \right| > C.$

Here the sum runs over a set of the form $\{d, 2d, \ldots, kd\}$, called a *Homogeneous Arithmetic Progression* (HAP). The conjecture above says that for any red-blue coloring of $[n]$, there is some HAP which has a lot more of red than blue (or vice versa).
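The smallest cases can be checked by brute force: it is a well-known fact around the EDP that a $\pm 1$ sequence of length up to 11 can keep every HAP sum in $\{-1, 0, 1\}$, while length 12 already forces some HAP sum of size 2. A minimal sketch (the function name is mine):

```python
from itertools import product

def hap_discrepancy(n):
    """Brute-force discrepancy of the homogeneous arithmetic progressions
    {d, 2d, ..., kd} inside [n]: minimize, over all +-1 colorings of [n],
    the largest |sum of colors over a HAP|."""
    best = n  # trivial upper bound
    for signs in product((-1, 1), repeat=n):
        worst = 0
        for d in range(1, n + 1):
            partial = 0  # running sum over the HAP d, 2d, 3d, ...
            for multiple in range(d, n + 1, d):
                partial += signs[multiple - 1]
                worst = max(worst, abs(partial))
        best = min(best, worst)
    return best

print(hap_discrepancy(11))  # 1: some coloring keeps every HAP sum in {-1, 0, 1}
print(hap_discrepancy(12))  # 2: at length 12, every coloring has a HAP sum of size 2
```

The search is exponential in $n$, so this is only for toy sizes; settling the conjecture itself is another matter entirely.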

In modern language, this is a question about the *discrepancy* of HAPs. So let me define discrepancy first. Let $\mathcal{S}$ be a family of subsets of a universe $U$. The discrepancy of $\mathcal{S}$ is defined as

$\mathrm{disc}(\mathcal{S}) = \min_{\chi : U \rightarrow \{-1,+1\}} \max_{S \in \mathcal{S}} \left| \sum_{i \in S} \chi(i) \right|.$

Thus the conjecture says that the discrepancy of the family of HAPs contained in $[n]$ grows without bound as $n$ grows.

This post deals with the related concept of *hereditary discrepancy*. The discrepancy can often be small by accident: a set system can have discrepancy zero even though it is complex enough to contain a high-discrepancy set system within it. The hereditary discrepancy measures the maximum discrepancy of any set system “contained” in $\mathcal{S}$.

Formally, given a set system $\mathcal{S}$ on a universe $U$ and a subset $V \subseteq U$, the restriction of $\mathcal{S}$ to $V$ is the set system $\mathcal{S}|_V = \{S \cap V : S \in \mathcal{S}\}$. The hereditary discrepancy is then $\mathrm{herdisc}(\mathcal{S}) = \max_{V \subseteq U} \mathrm{disc}(\mathcal{S}|_V)$.

Some examples. Let $\mathcal{S}$ be the family of all intervals $\{a, a+1, \ldots, b\}$ in $[n]$: coloring $[n]$ alternately $+1, -1, +1, \ldots$ shows that its discrepancy is at most $1$, and since an interval restricted to a subset $V$ is a set of consecutive elements of $V$, its hereditary discrepancy is also $1$. On the other hand, take any set system, duplicate the universe, and add to each set a disjoint copy of itself: coloring each element $+1$ and its copy $-1$ gives the new system discrepancy zero, yet restricting to the original universe recovers the original system, so the hereditary discrepancy has not gone down.
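To make the two definitions concrete, here is a brute-force sketch (exponential time, toy sizes only; all names are mine). It checks the padding trick: duplicating each element can drive the discrepancy to zero while the hereditary discrepancy stays put.

```python
from itertools import chain, combinations, product

def disc(sets, universe):
    """Discrepancy: min over +-1 colorings of the max |sum of colors in a set|."""
    universe = list(universe)
    best = float("inf")
    for signs in product((-1, 1), repeat=len(universe)):
        color = dict(zip(universe, signs))
        worst = max((abs(sum(color[i] for i in s)) for s in sets), default=0)
        best = min(best, worst)
        if best == 0:  # cannot do better than zero
            break
    return best

def herdisc(sets, universe):
    """Hereditary discrepancy: max discrepancy over all restrictions to V."""
    universe = list(universe)
    vs = chain.from_iterable(combinations(universe, r) for r in range(len(universe) + 1))
    return max(disc([s & set(v) for s in sets], v) for v in vs)

# All pairs from {1,2,3,4}: discrepancy 2, since two elements always share a color.
pairs = [set(p) for p in combinations(range(1, 5), 2)]
# Pad each set with a disjoint copy of itself (element i gets copy i+4).
padded = [s | {i + 4 for i in s} for s in pairs]

print(disc(pairs, range(1, 5)))      # 2
print(disc(padded, range(1, 9)))     # 0: color i as +1 and its copy i+4 as -1
print(herdisc(padded, range(1, 9)))  # 2: restricting to {1,2,3,4} recovers `pairs`
```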

**Privacy**

Given a vector $x \in \mathbb{R}^N$ (think of $x$ as the histogram of a database, with one coordinate per possible record type) and a query matrix $A \in \{0,1\}^{m \times N}$, we wish to output a good approximation to $Ax$. The mechanism $M$ computing this approximation should be *differentially private*, i.e. for any $x, x'$ with $\|x - x'\|_1 \leq 1$ and any set $O$ of possible outputs,

$\Pr[M(x) \in O] \leq \exp(\varepsilon) \cdot \Pr[M(x') \in O].$

Thus if two databases differ only in one individual's data, the output distributions of the mechanism on the two are nearly indistinguishable. The goal is to achieve this while adding as little error to $Ax$ as possible.
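For a single counting query (one all-ones row of $A$), the classical Laplace mechanism achieves exactly this guarantee. A minimal sketch, with names of my own choosing:

```python
import random

def private_count(x, epsilon, rng=random):
    """Answer the counting query sum(x) with Laplace(1/epsilon) noise.

    Changing one person's data moves sum(x) by at most 1, and Laplace noise
    of scale 1/epsilon then keeps the two output densities within a factor
    of exp(epsilon) of each other, as the privacy definition requires.
    """
    scale = 1.0 / epsilon
    # A Laplace variable is the difference of two i.i.d. exponentials.
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return sum(x) + noise

rng = random.Random(0)
x = [1] * 100  # a histogram with 100 records
answers = [private_count(x, epsilon=1.0, rng=rng) for _ in range(10000)]
avg = sum(answers) / len(answers)
print(avg)  # close to the true count, 100
```

The expected error here is a constant depending only on $\varepsilon$; the interesting question, taken up next, is how the error must scale for many correlated queries.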

Researchers have studied the question of designing good mechanisms for specific query matrices $A$: how little noise must a differentially private mechanism for $A$ add?

A particular set of queries of interest is the following. The coordinates of the vector $x$ are indexed by the boolean hypercube $\{0,1\}^d$, so that $N = 2^d$ and each dimension corresponds to a binary attribute.

Subcubes defined by fixing $k$ of the $d$ coordinates give queries known as $k$-way *marginals* to some and *contingency table* queries to others. These queries being important from a statistical point of view, Kasiviswanathan, Rudelson, Smith and Ullman showed lower bounds on the amount of error any differentially private mechanism must add when answering them, for any constant $k$.

**The connection**

So what has discrepancy got to do with privacy? Muthukrishnan and Nikolov showed that if $A$ has hereditary discrepancy $D$, then any differentially private mechanism answering $Ax$ must add noise of magnitude close to $D$. Moreover, this connection is nearly tight: there are private mechanisms whose error exceeds the hereditary discrepancy of $A$ by only polylogarithmic factors.

For the EDP, it is natural to ask how large the hereditary discrepancy of the HAP set system is. Alon and Kalai show that it is $n^{o(1)}$, and ask whether it is in fact polylogarithmic in $n$.

Alex Nikolov and I managed to show that this is not the case. Since hereditary discrepancy is, well, hereditary, a lower bound on the hereditary discrepancy of a submatrix is also a lower bound on the hereditary discrepancy of the whole matrix. In the EDP matrix on $[n]$, we find a submatrix corresponding to the marginal queries discussed above, and then use the privacy connection to lower bound the hereditary discrepancy of this submatrix.

The first step is done as follows: associate each dimension $i \in [d]$ with a distinct prime $p_i$, and map the point $x \in \{0,1\}^d$ to the integer $\prod_i p_i^{x_i}$. Under this map, the subcube fixing the coordinates in a set $T$ to one lands exactly on the multiples of $\prod_{i \in T} p_i$, so each such subcube query is the restriction of a HAP to the image of the map. Thus the marginal queries sit inside the HAP matrix as a submatrix.
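Here is a small sanity check of this encoding idea (the choice of primes and the indexing are mine, and the actual construction in the paper may set its parameters differently): points of $\{0,1\}^4$ map to distinct squarefree integers, and a subcube fixing two coordinates to one coincides with the multiples of the corresponding product of primes.

```python
from functools import reduce
from itertools import product

primes = [2, 3, 5, 7]  # one distinct prime per dimension, d = 4
d = len(primes)

def encode(x):
    """Map a point x in {0,1}^d to the integer prod_i p_i^(x_i)."""
    return reduce(lambda acc, pe: acc * pe[0] ** pe[1], zip(primes, x), 1)

# The map is injective: distinct squarefree divisors of 2*3*5*7 = 210.
image = {encode(x): x for x in product((0, 1), repeat=d)}

# The subcube fixing coordinates T = {0, 2} to one ...
T = (0, 2)
q = primes[0] * primes[2]  # ... versus the multiples of 2 * 5 = 10.
subcube = {n for n, x in image.items() if all(x[i] == 1 for i in T)}
multiples = {n for n in image if n % q == 0}
print(subcube == multiples)  # True: the subcube is a HAP meeting the image
```

Since the exponents are all 0 or 1 and the primes are distinct, divisibility by $\prod_{i \in T} p_i$ holds exactly when every coordinate in $T$ equals one, which is what the check confirms.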

Lower bounds for private mechanisms for marginals, such as the ones of Kasiviswanathan et al. mentioned above, then translate into lower bounds on the hereditary discrepancy of this submatrix.

However, given what we know about private mechanisms whose error nearly matches the hereditary discrepancy, this translation loses only small factors, and we obtain a lower bound of $n^{\Omega(1/\log\log n)}$ on the hereditary discrepancy of HAPs, ruling out a polylogarithmic bound.

Of course, the EDP itself is wide open. Head on over here to help settle the conjecture.

Many thanks to Alex Nikolov for his contribution to the content and the writing of this post.