The graduating bits event at ITCS was held last Friday. You can see the list of all presenters and their slides on the webpage.
I enjoyed it and learned quite a bit, even if I was rather strict towards the poor presenters with the timer… Here is a summary of some of the talks, based on my faulty memory and the slides. I apologize in advance for the (probably numerous) errors and omissions, and encourage you to take a look at the presenters' web pages for much better information.
Ron Rothblum talked about new results on proving that computation is correct, including some wonderful connections between cryptography and quantum information, as well as an exciting new result with his brother Guy and Omer Reingold showing an (at the moment quantitatively suboptimal) variant of the classical IP=PSPACE result with an efficient prover.
Jayadev Acharya talked about computational versions of classical questions in statistics, such as hypothesis testing and testing distributions for monotonicity and independence, in particular obtaining the first linear-time algorithms for several classical tasks.
Mohsen Ghaffari talked about his work on distributed algorithms. He has had a great number of impressive results but focused on his recent SODA '16 paper on distributed maximal independent set, which is the first to match the 2006 lower bound of Kuhn et al. in some range of parameters.
Nima Haghpanah talked about his work on incentives and computation, and in particular about a new result with Hartline showing that, despite some daunting hardness results, in several natural settings it is possible to efficiently maximize revenue even when users have multidimensional preferences (e.g., think of users of an Internet Service Provider, some of whom care more about the price while others care more about the bandwidth).
Z. Steven Wu gave an excellent short talk on differential privacy, a subject on which he has had a number of recent results, both giving new differentially private mechanisms and using differential privacy in applications to mechanism design and learning.
Clément Canonne had probably the most interesting set of slides I have seen. He gave a high-level talk about his work in learning, property testing, and distribution testing.
Aviad Rubinstein is a non-graduating student who has done several interesting works on the complexity of problems arising in computational economics, but he talked about his recent thought-provoking work on the fundamental “Sparse PCA” problem, discussing the relation between approximation ratios and success on actual data, and (apparently Berkeley-inspired) “best case analysis”.
Laura Florescu couldn’t make it to the event, but her slides describe some interesting results on the stochastic block model, which has attracted much attention from people in statistics, learning, statistical physics, and more, as well as on rotor walks, which are a deterministic variant of some natural physical processes.
Bo Waggoner opened his talk with a joke that probably resonated with many of the students: when a graduating student is asked what he’s working on, he says “last year I worked on analysis of Boolean functions and lower bounds, but this year I’m graduating, so my area is now data science”. He has generally worked in algorithmic game theory and has recently studied markets for information, as opposed to physical goods.
Ilya Volkovich spoke about the notorious polynomial identity testing (PIT) problem. Most of us run away from it in fear, but he has actually managed to solve it for some restricted circuit classes, as well as to show connections between it and other open problems.
Elad Haramaty talked about algebraic property testing. Here the canonical example is the low-degree test (i.e., property testing of the Reed-Muller code) that was the technical heart of the original PCP Theorem. He has had some results in this area and also gave generalizations of the Reed-Muller code that have various applications.
Alexander Golovnev's talk was inspiring: apparently he missed the memo that general circuit lower bounds are too difficult and should not be attempted by anyone, let alone a student. He and his co-authors have obtained the strongest known lower bound for general Boolean circuits, and he talked about a program to obtain stronger bounds via improved constructions of randomness dispersers.
Finally, Mark Bun missed Bo’s joke and, despite graduating, was not afraid to declare his love for lower bounds. Like Steven Wu, he too works on differential privacy (which for some reason is highly correlated with Star Wars imagery on one’s slides). He has worked on the lower-bound side, with results answering the fundamental question of whether we need to pay a price in accuracy or efficiency if we want privacy.
Thanks for posting this; hopefully it can help make the job market more efficient. Since I really wanted to make it but couldn’t, let me write a one-paragraph summary here.
I wanted to talk about new algorithms for massively parallel computation (aka MapReduce), with a focus on both computation and communication (http://grigory.us/#mapreduce); lp-testing, a new model for noisy data analysis (http://grigory.us/#lp-testing); and developing targeted alternatives to bulk data collection (http://grigory.us/#pnas-privacy; I believe Steven might have mentioned this too).
As a big fan of both SODA and ITCS, I think an interesting open problem is whether it might be possible to co-locate ITCS and SODA in the future, to remove the unnecessary logistical overhead of traveling between the two.
Thank you, Grigory! Everyone else is also invited to post comments here summarizing their research.