Theory announcements: Prizes, CFP, and more
Related to my last post on the FOCS 2023 conjectures track, the chairs have now put together an FAQ about it. ACM SIGACT is soliciting nominations for several prizes: the Knuth Prize by February 15, the Distinguished Service Award by March 1, and the Gödel Prize by March 31. NSF is looking for a Program Director for the Algorithmic …
Author: Boaz Barak
New in FOCS 2023: A conjectures track
Update 1/27: Amit, Shubhangi, and Thomas put together an FAQ about this. This year, FOCS 2023 will include something new: a Conjectures Track, separate from the Main Track. Submissions to the Main Track will be evaluated along similar lines as STOC/FOCS papers typically are, aiming to accept papers that obtain the very best results across …
Memento and Large Language Models
[Mild spoilers for the 2000 film Memento. See this doc for the full ChatGPT transcripts. --Boaz] Leonard Shelby, the protagonist of Christopher Nolan's film "Memento", suffers from anterograde amnesia. He remembers everything up to the time at which he was the victim of a violent attack, but cannot form new memories after that. He uses …
AI will change the world, but won’t take it over by playing “3-dimensional chess”.
By Boaz Barak and Ben Edelman [Cross-posted on LessWrong; see also Boaz’s posts on longtermism and AGI via scaling, as well as other "philosophizing" posts. This post also puts us in Aaronson's "Reform AI Alignment" religion] [Disclaimer: Predictions are very hard, especially about the future. In fact, this is one of the points of this essay. Hence, …
Postdocs at Harvard!
New: Kempner Fellows. A prestigious 3-year position with attractive terms for early-stage scientists interested in "fundamentally advancing our understanding of natural and artificial intelligence." The ML Foundations and theory groups at Harvard are looking for postdocs for the coming academic year. There are also several other positions at Harvard, including at the Harvard Data Science …
Swiss TCS winter school, CCC 2023 call for papers
[Guest post by David Steurer; both the speakers and the location seem amazing! --Boaz] The Swiss Winter School on Theoretical Computer Science (Jan 29 - Feb 3, 2023, https://theory.epfl.ch/WinterSchool2023/) will be the second installment in a series of annual winter schools jointly organized by EPFL and ETH Zurich (the first installment happened in 2020). The goal of …
Quick reminders: masters, postdocs, faculty, etc.
As we're getting closer to the season when undergraduate students are considering graduate school, and graduate students are considering next steps such as postdoc or faculty positions, I wanted to remind people of two resources for such positions: the TCS jobs and crowd-sourced masters pages. The process and market for both graduate studies and …
My friend, Scott Aaronson
This is a photo of my bookshelf at the office. Ever since joining Harvard, I have been ordering copies of Quantum Computing Since Democritus on a regular basis. I often hand them out to bright students, curious about science, whom I want to expose to the beautiful connections between computer science, math, physics, and …
Injecting some numbers into the AGI debate
[Yet another "philosophizing" post, but one with some actual numbers. See also this follow-up. --Boaz] Recently there have been many debates on “artificial general intelligence” (AGI) and whether or not we are close to achieving it by scaling up our current AI systems. In this post, I’d like to make this debate a bit …
Teaching circuits as the first computational model
This fall, I am once again teaching Harvard's "Introduction to Theoretical Computer Science" course (CS 121). Like many "intro to TCS / intro to theory of computation" courses, Harvard's course used to be taught with Sipser's classic textbook. Sipser's book is indeed, for better or worse, a classic. It is extremely well-written and students like …