Discrepancy and Rounding Linear Programs

In the previous posts we talked about various discrepancy questions and saw a proof of the six standard deviations suffice result. Besides being of interest in combinatorics, discrepancy theory has several remarkable applications in algorithms. Check this excellent book for a taste of these results. Here I will briefly discuss two (one old and one …

Discrepancy Bounds from Convex Geometry

In the last post we discussed some questions about discrepancy and the 'Six Standard Deviations Suffice' theorem stated below (without the $6$, which is not too important, but makes for a great title): Theorem 1 For vectors $a^1,\ldots,a^n \in \{1,-1\}^n$, there exists $\epsilon \in \{1,-1\}^n$ such that for every $j \in …
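To make the setting of Theorem 1 concrete, here is a small illustrative sketch (not from the post itself): a brute-force search over all sign vectors $\epsilon$, computing the best achievable discrepancy $\max_j |\langle \epsilon, a^j \rangle|$ for a family of $\pm 1$ vectors. The function name `discrepancy` and the exhaustive search are my own illustration; the theorem guarantees a bound of $K\sqrt{n}$ for an absolute constant, which for small $n$ is easy to verify directly (and is in fact trivial when $n \le 36$, since any inner product is at most $n$).

```python
import random
from itertools import product

def discrepancy(vectors):
    """Brute-force the minimum over sign vectors eps in {1,-1}^n of
    max_j |<eps, a^j>|, for a list of n-dimensional +/-1 vectors.

    Exponential in n -- purely an illustration of the quantity
    bounded by the 'Six Standard Deviations Suffice' theorem.
    """
    n = len(vectors[0])
    best = float("inf")
    for eps in product((1, -1), repeat=n):
        # Worst coordinate-wise imbalance for this choice of signs.
        worst = max(abs(sum(e * a for e, a in zip(eps, vec)))
                    for vec in vectors)
        best = min(best, worst)
    return best

# Sanity check against the theorem's K * sqrt(n) bound (here K = 6):
n = 10
vecs = [tuple(random.choice((1, -1)) for _ in range(n)) for _ in range(n)]
assert discrepancy(vecs) <= 6 * n ** 0.5
```

For $n = 10$ the bound $6\sqrt{10} \approx 19$ exceeds the trivial maximum of $10$, so the check always passes; the content of the theorem is that the $\sqrt{n}$ scaling persists for all $n$, where brute force is hopeless.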

Differential Privacy for Measure Concentration

Today, we have a guest post from Frank McSherry talking about a clever approach to using Differential Privacy for handling pesky dependencies that get in the way of proving measure concentration results. --------------------- In this post I'll explain a cute use of differential privacy as a tool in probabilistic analysis. This is a great example …