Gerard Ben Arous, Surya Ganguli, Florent Krzakala and Lenka Zdeborova are organizing a summer school on statistical physics of machine learning on August 2-28, 2020 in Les Houches, France. If you don’t know Les Houches, it apparently looks like this:
They are looking for applications from students, postdocs, and young researchers in physics & math, as well as computer scientists. While I am biased (I will be lecturing there too), I think the combination of lecturers, speakers, and audience members will yield a unique opportunity for interaction across communities, and I strongly encourage theoretical computer scientists to apply (which you can do from the website). Let me also use this opportunity to remind people again of Tselil Schramm’s blog post, where she collected some of the lecture notes from the seminar we ran on physics & computation.
More information about the summer school:
The “Les Houches school of physics”, situated close to Chamonix and the Mont Blanc in the French Alps, has a long history of forming generations of young researchers on the frontiers of their fields. Our school is aimed primarily at the growing audience of theoretical physicists and applied mathematicians interested in machine learning and high-dimensional data analysis, as well as to colleagues from other fields interested in this interface. [my emphasis –Boaz] We will cover basics and frontiers of high-dimensional statistics, machine learning, the theory of computing and learning, and probability theory. We will focus in particular on methods of statistical physics and their results in the context of current questions and theories related to machine learning and neural networks. The school will also cover examples of applications of machine learning methods in physics research, as well as other emerging applications of wide interest. Open questions and directions will be presented as well.
Students, postdocs, and young researchers interested in participating in the event are invited to apply on the website http://leshouches2020.krzakala.org/ before March 15, 2020. The capacity of the school is limited; due to this constraint, participants will be selected from the applicants and will be required to attend the whole event.

Lecturers:
- Boaz Barak (Harvard): Computational hardness perspectives
- Giulio Biroli (ENS, Paris): High-dimensional dynamics
- Michael Jordan (UC Berkeley): Optimization, diffusion & economics
- Marc Mézard (ENS, Paris): Message-Passing algorithms
- Yann LeCun (Facebook AI, NYU): Challenges and directions in machine learning
- Remi Monasson (ENS, Paris): Statistical physics of learning in neural networks
- Andrea Montanari (Stanford): High-dimensional statistics & neural networks
- Maria Schuld (Univ. KwaZulu Natal & Xanadu): Quantum machine learning
- Haim Sompolinsky (Harvard & Hebrew Univ.): Statistical mechanics of deep neural networks
- Nathan Srebro (TTI-Chicago): Optimization and implicit regularisation
- Miles Stoudenmire (Flatiron, NYC): Tensor network methods
- Pierre Vandergheynst (EPFL, Lausanne): Graph signal processing & neural networks
Invited Speakers (to be completed):
- Christian Borgs (UC Berkeley)
- Jennifer Chayes (UC Berkeley)
- Shirley Ho (Flatiron NYC)
- Levent Sagun (Facebook AI)