Restricted Invertibility by Interlacing Polynomials

In this post, I am going to give you a simple, self-contained, and fruitful demonstration of a recently introduced proof technique called the method of interlacing families of polynomials, which was also mentioned in an earlier post. This method, which may be seen as an incarnation of the probabilistic method, is relevant in situations when …
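For readers who want the punchline before clicking through: the core fact about an interlacing family (stated here from memory of the Marcus–Spielman–Srivastava framework, not quoted from the post itself) is that if polynomials $f_1,\ldots,f_m$ form an interlacing family, then some member of the family has its largest root bounded by the largest root of the sum,

\[
  \exists\, i \in \{1,\ldots,m\}: \qquad
  \mathrm{maxroot}(f_i) \;\le\; \mathrm{maxroot}\!\Big(\sum_{j=1}^{m} f_j\Big),
\]

which plays the role that a first-moment "some outcome is at most the average" argument plays in the usual probabilistic method.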

FOCS 2014 submission server is now open

And it is linked from the Call for Papers. (The URL for the server is http://focs2014.cs.princeton.edu , though please do read the call for papers, and the advice linked below, before submitting.) We are continuing the FOCS 2013 experiment of less regulation and more responsibility on paper formatting, so please do read my advice for authors before submitting. …

STOC 2014 Call for Workshop and Tutorial Proposals

STOC 2014: Call for Workshop and Tutorial Proposals (also available on the conference website). Workshop and Tutorial Day: Saturday, May 31, 2014. Workshop and Tutorial Co-Chairs: Kunal Talwar and Chris Umans. On Saturday, May 31, immediately preceding the main conference, STOC 2014 will hold a workshop-and-tutorials day. We invite groups of interested researchers to submit …

Discrepancy and Rounding Linear Programs

In the previous posts we talked about various discrepancy questions and saw a proof of the six standard deviations suffice result. Besides being of interest in combinatorics, discrepancy theory has several remarkable applications in algorithms; check out this excellent book for a taste of these results. Here I will briefly discuss two (one old and one …

Discrepancy Bounds from Convex Geometry

In the last post we discussed some questions about discrepancy and the 'Six Standard Deviations Suffice' theorem stated below (without the $6$, which is not too important, but makes for a great title): Theorem 1. For vectors $a^1,\ldots,a^n \in \{1,-1\}^n$, there exists $\epsilon \in \{1,-1\}^n$ such that for every $j \in$ …
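Since the excerpt cuts off mid-statement, here is the full statement as I recall it from the earlier post (the standard form of Spencer's theorem, not quoted from this excerpt): for every $j \in \{1,\ldots,n\}$,

\[
  \Big|\sum_{i=1}^{n} \epsilon_i\, a^j_i\Big| \;\le\; 6\sqrt{n},
\]

that is, one can always choose signs $\epsilon$ so that each of the $n$ inner products $\langle a^j, \epsilon\rangle$ has magnitude at most $6\sqrt{n}$, rather than the $\Theta(\sqrt{n\log n})$ bound that a uniformly random choice of signs gives.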

Differential Privacy for Measure Concentration

Today, we have a guest post from Frank McSherry talking about a clever approach to using Differential Privacy for handling pesky dependencies that get in the way of proving measure concentration results. --------------------- In this post I'll explain a cute use of differential privacy as a tool in probabilistic analysis. This is a great example …