(Also available as a pdf file. Apologies for the many footnotes, feel free to skip them.)

Computational problems come in all different types and from all kinds of applications, arising from engineering as well as the mathematical, natural, and social sciences, and involving abstractions such as graphs, strings, numbers, and more. The universe of potential algorithms is just as rich, and so a priori one would expect the best algorithms for different problems to have all kinds of flavors and running times. However, natural computational problems "observed in the wild" often display a curious dichotomy: either the running time of the fastest algorithm for the problem is some small polynomial in the input length (e.g., ${O(n)}$ or ${O(n^2)}$) or it is exponential (i.e., ${2^{\epsilon n}}$ for some constant ${\epsilon>0}$). Moreover, while there is indeed a great variety of efficient algorithms for those problems that admit them, some general principles, such as convexity (i.e., the ability to make local improvements to suboptimal solutions or local extensions to partial ones), seem to underlie a large number of these algorithms.1

To be sure, none of these observations are universal laws. In fact, there are theorems showing exceptions to such dichotomies: the Time Hierarchy Theorem says that for essentially any time-complexity function ${T(\cdot)}$ there is a problem whose fastest algorithm runs in time (essentially) ${T(n)}$. Also, Ladner's Theorem says that, assuming P${\neq}$NP, there are problems that are neither in P nor NP-complete. Moreover, there are some natural problems with apparent "intermediate complexity"; perhaps the best-known example is the Integer Factoring problem mentioned below. Nevertheless, the phenomenon of dichotomy, and the related phenomenon of recurring algorithmic principles across many problems, seem far too prevalent to be just an accident, and these phenomena are the topic of this essay.


FOCS 2013 registration is open, with an early registration deadline of October 4th, 2013. In a few days, I hope, the program will be posted as well, but as the list of accepted papers indicates, it's going to be very interesting!

One of my favorite quotes is "the work of the righteous is done by others." It comes in handy whenever someone does something I could or should have done, and especially when this someone executes it better than I would have. Based on its first few posts, I have the feeling that Moritz Hardt's new blog, Moody Rd, will give me plenty of opportunities to feel righteous.

On the last day of the conference I paid a visit to the friendly workshop next door – CHES (Cryptographic Hardware and Embedded Systems). It was all foreign to me: a different mix of people, posters outside the lecture hall. After this year's compressed CRYPTO talks, 25 minutes felt like an eternity. Actual food during coffee breaks. Some things never change, though: Adi Shamir was sitting in the front row and asking pointed questions.

My main interest was in the invited talk by John Kelsey on SHA-3. A quick recap of the last few years in hash functions may be necessary here. On January 1, 2004, two hash functions were used in virtually all cryptographic applications – MD5 and SHA-1, both dating back to the 90s. The more recent SHA-2, substantially slower and more complex, was regarded as somewhat of an overkill. In less than two years MD5 was badly broken and SHA-1 seriously compromised in a series of papers by Chinese cryptanalysts led by Xiaoyun Wang. Although no attacks on SHA-2 had been announced, its structure was deemed uncomfortably close to that of the broken functions. To address these concerns, NIST launched a competition for a new hash function design to be eventually standardized as SHA-3. The competition ran from 2007 to 2012 and became the focal point of the cryptanalytic community, with a lot of interest and support from theoreticians.

The competition progressed through several rounds and concluded in September 2012 with the selection of Keccak. The winning design was proposed by a team of European cryptographers and based on the innovative "sponge" construction. The hash function competition was explicitly modeled on the AES selection process that gave us the successor to DES. The differences between the cipher standardized by NIST and the winner of that competition were basically editorial – the standard omitted some of the modes of the winning submission, and its name was changed from the distinctly Dutch Rijndael to an abbreviation that only bureaucrats can love – AES.

The speaker today was John Kelsey of NIST, who introduces himself using the nine most terrifying words in English – "I'm from the government, and I'm here to help." After going through the history of the SHA-3 competition and explaining some of the rationale behind the NIST decision (all last year's news), he dropped a bombshell, announcing that the design chosen for standardization differs from the winning entry by a significant structural change. Although the change appears to stay within parameters endorsed by the Keccak designers, it was not part of the "official" entry. NIST is considering narrowing the internal state of the hash function to improve its performance without seriously compromising its security (whether that is actually the case is open for debate).

More concretely, the call for submissions required the candidates to offer at least $2^{n/2}$ security against collision-finding attacks and $2^n$ second-preimage resistance (with some allowance for long messages), where $n$ is the output length. Since the minimum output length is 224 bits, the $2^{224}$ second-preimage resistance appears to be an excessive security margin. Narrowing the internal state has the effect of collapsing the security of the hash function against many attacks to "just" $2^{n/2}$. Whether this is a legitimate compromise (the construction becomes 30% faster) will be the subject of lively discussions over the next few months.
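To see where the $2^{n/2}$ collision bound comes from, here is a minimal sketch of the generic birthday attack (this is not NIST's analysis, and the function names are my own): against a hash truncated to 24 bits, a collision appears after roughly $2^{12}$ evaluations, regardless of how strong the underlying function is.

```python
import hashlib

def trunc_hash(data: bytes, nbytes: int = 3) -> bytes:
    """SHA-256 truncated to nbytes; here a 24-bit toy hash."""
    return hashlib.sha256(data).digest()[:nbytes]

def birthday_collision(nbytes: int = 3):
    """Find two distinct inputs with the same truncated digest.
    By the birthday bound, expected work is about 2^(8*nbytes/2) hashes."""
    seen = {}
    i = 0
    while True:
        msg = i.to_bytes(8, "big")
        d = trunc_hash(msg, nbytes)
        if d in seen:
            return seen[d], msg, i + 1
        seen[d] = msg
        i += 1

m1, m2, tries = birthday_collision()
print(f"collision after {tries} hashes")  # typically a few thousand, near 2^12
```

The same counting argument, applied to the hash function's internal state rather than its output, is what makes the width of that state a security parameter.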

With this, I conclude my series of posts from the CRYPTO 2013 conference. Let me know if you think it was an experiment worth repeating.

The highlights of the third day of CRYPTO were known ahead of time and yet did not disappoint: an invited talk by Adam Langley on TLS, the presentation of the best paper award, and the business meeting followed by the beach barbecue.

Adam Langley, known for his work at Google on TLS and HTTPS, addressed the joint session of CRYPTO and CHES (the Cryptographic Hardware and Embedded Systems workshop) with a talk titled "Why the web still runs on RC4". A little background would be helpful here. The fast, simple, lightweight RC4 stream cipher was designed by Ron Rivest in the late 80s. It has had a turbulent, eventful life, and at the peak of its popularity it was the most common encryption mechanism in SSL/TLS. The young upstart – AES-CBC – was slowly gaining ground on it, and for several years RC4 was seen as nearing retirement from active service. However, several serious attacks against AES-CBC over the last two years made RC4 the cipher of choice again. It wouldn't be so bad if RC4 offered rock-solid security, but old and new attacks make it unsuitable for applications that repeatedly encrypt the same data.
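As an illustration of the kind of weakness involved: the classic Mantin–Shamir bias makes RC4's second keystream byte equal to zero roughly twice as often as a random stream would, which is exactly the sort of bias that leaks plaintext when the same data is encrypted under many keys. The sketch below (helper names mine) implements textbook RC4 and measures the bias empirically over random keys.

```python
import os

def rc4_keystream(key: bytes, n: int) -> list[int]:
    """Textbook RC4: key-scheduling algorithm (KSA), then n bytes of PRGA output."""
    S = list(range(256))
    j = 0
    for i in range(256):                      # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = []
    for _ in range(n):                        # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

# Mantin-Shamir bias: over random keys, Pr[second byte == 0] is about 2/256
# instead of the ideal 1/256.
TRIALS = 30_000
zeros = sum(1 for _ in range(TRIALS)
            if rc4_keystream(os.urandom(16), 2)[1] == 0)
print(zeros, "zeros observed;", TRIALS / 256, "expected if unbiased")
```

Running this should show roughly twice the unbiased expectation, a bias large enough to exploit with only modestly many ciphertexts.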

In his talk, which used refreshingly few slides (mostly for statistics and quotes), Adam Langley gave an overview of how we ended up in this sorry state. In short (and it was a recurring theme in his talk), the web is a mess and full of bugs. There are several dynamics at play here: new attacks come out, and the patches are often easy, but they break existing servers that relied on the old behavior (most often due to bugs of their own). Since no browser commands a dominant market share, a frustrated client becomes the browser's problem, not the server's. In a severe case of collective inaction, no browser is willing to apply the patches unilaterally, which gives servers no incentive to update their software. There was, of course, more to the talk than just this observation. The talk will be available online and should be fun to watch.

The best paper award went to the paper by Faruk Gologlu, Robert Granger, Gary McGuire, and Jens Zumbragel from University College Dublin (Ireland). The paper broke records in solving the discrete logarithm problem in fields of small characteristic. It built upon earlier work by Antoine Joux, who, together with his co-authors, in turn improved upon Gologlu et al.'s results just two months ago. The most important question, of course, is whether these methods, which apply only to finite fields with very specific properties, can be extended or adapted to prime fields. Real men ask whether factoring would become easy. Although it is true that progress on discrete logarithms and factoring has proceeded in lockstep, crossing the chasm from fields of small characteristic to the RSA problem seems rather difficult (but, given the number of surprises at this CRYPTO, I would never say impossible).

The standard feature of Wednesday at CRYPTO is the business meeting. After taking care of the official business of the IACR (true to the name of the event), the current president of the association, Bart Preneel, explained the rationale behind launching a discussion on rethinking how and where we publish. He stated several goals and principles: the end result should encourage submissions to more journals with higher impact ratings, and provide open access to all papers published by the IACR. A spirited discussion ensued that (by my estimate) changed no one's opinion. I came away with the impression that the board of the association will formulate questions for the entire membership to vote on.

First, I’d like to thank my co-authors (Omkant Pandey and Ananth Raghunathan) for preparing and delivering two excellent talks this morning. It is due to their decision to step up to the plate that I get to do the fun stuff like blogging from the conference.

The big topic of the afternoon session was multilinear maps. Even writing this gives me goosebumps – just a year ago I would have doubted they existed, let alone that they would be the subject of three different papers at CRYPTO 2013 with non-overlapping sets of authors. I hope that we'll cover the subject in more detail some day, but suffice it to say that expectations are running high. If pairing-based cryptography vastly enriched the set of tools available to cryptographers, the potential of multilinear maps is yet to be fully grasped. The first candidate proposal for multilinear maps, due to Sanjam Garg, Craig Gentry, and Shai Halevi, was published on eprint in October 2012, presented at Eurocrypt 2013 in May (winning the best paper award), and has already generated a flurry of activity and exciting work.

The main event of the day, and some would say of the entire conference, is the rump session. Tradition has it that on Tuesday evening cryptographers gather around the campfire for a session of short talks, some humorous, some less so, well lubricated with wine and beer. Tonight's event, run by Dan Bernstein with his iron fist and a loud squeaker, consisted of 28 talks, mostly no longer than five minutes. That included two musical numbers (plus an encore) featuring two guitars, a keyboard, and a backing vocal section. The more serious talks were brief summaries of works that have recently been published or will appear in non-IACR conferences. Cryptanalytic results are always guaranteed attention (the original collision-finding attacks on MD4 and MD5 were announced at the rump session of CRYPTO'04, to a standing ovation). This year was no exception: several factoring results on weak RSA keys from Nadia Heninger, symmetric-key cryptanalysis from Adi Shamir, and "RC4 is finally dead" out of Royal Holloway (University of London) plus Dan Bernstein.

In a more unusual rump session talk, Moti Yung came out with a passionate defense of the current conference system, jumpstarting a conversation that is likely to consume most of the business meeting tomorrow. The fairly radical proposal on the table is to transition to, roughly speaking, the "VLDB" format, where the primary publication venue would be the Proceedings of the IACR, with several conferences layered on top. Some online discussion is already ongoing, and it is an easy guess that more arguments will follow, culminating in an association-wide ballot measure. After that my crystal ball gets cloudy – even after talking to a fair number of people at the conference, I hesitate to guess which way the vote would go if it were held today. Usually, when facing a dramatic change, people opt for the status quo. Cryptographers, always a nonconformist bunch, just might go the other way.

First day of CRYPTO. Nothing happened… Just kidding. Actually, the first day was quite busy and eventful. It was headlined by the session on lattices and fully-homomorphic encryption. The one talk that I'd like to highlight was delivered by Craig Gentry (on a paper co-authored with Amit Sahai and Brent Waters). It describes a particularly elegant and succinct scheme for fully-homomorphic encryption (FHE) based on the approximate eigenvector method, whose underlying assumption is the hardness of the learning-with-errors problem. The previous record for conceptually simple FHE was held by Zvika Brakerski, covered on this blog in two posts, and the new scheme builds in large part upon Zvika's work. In fact, in an opening quite unusual for an academic paper, it starts with a quote from one of those posts!

In the new scheme, to add and multiply encrypted plaintexts, one simply adds and multiplies the corresponding ciphertexts, followed by a simple post-processing step that requires no additional key – almost a dream FHE construction. Although the conceptual clarity comes at a cost – the scheme is not very fast or space-efficient – it has excellent explanatory value, in addition to yielding the first identity-based FHE scheme.
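At the risk of oversimplifying, here is a toy, noiseless (and of course totally insecure) illustration of the eigenvector idea: a ciphertext is a matrix that has the secret vector as an eigenvector, with the plaintext as the corresponding eigenvalue, so matrix addition and multiplication add and multiply plaintexts. In the real scheme the eigenvector is only approximate, with the noise controlled via the learning-with-errors assumption; all names below are mine.

```python
from random import randint

def encrypt(mu: int, t: int) -> list[list[int]]:
    """Toy 'eigenvector' ciphertext: a 2x2 integer matrix C with
    C @ s == mu * s for the secret vector s = (1, t).
    Built as C = mu*I + R, where every row of R is orthogonal to s."""
    a, b = randint(-5, 5), randint(-5, 5)
    return [[mu - a * t, a],
            [   -b * t,  mu + b]]

def mat_vec(C, s):
    return [sum(C[i][k] * s[k] for k in range(2)) for i in range(2)]

def mat_add(C1, C2):
    return [[C1[i][j] + C2[i][j] for j in range(2)] for i in range(2)]

def mat_mul(C1, C2):
    return [[sum(C1[i][k] * C2[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def decrypt(C, t: int) -> int:
    """Since s = (1, t) is an eigenvector and s[0] = 1,
    the first entry of C @ s is exactly the eigenvalue (plaintext)."""
    return mat_vec(C, [1, t])[0]

t = 7                                      # the secret
c1, c2 = encrypt(3, t), encrypt(5, t)
assert decrypt(mat_add(c1, c2), t) == 3 + 5   # eigenvalues add
assert decrypt(mat_mul(c1, c2), t) == 3 * 5   # eigenvalues multiply
```

Multiplication works because C1 @ C2 @ s = C1 @ (mu2 * s) = mu1 * mu2 * s; the hard part of the real construction is keeping the noise from blowing up under exactly this operation.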

The first invited talk of the conference was given by Cindy Cohn, the legal director and general counsel of the EFF. The offer was apparently extended before Snowden's leaks hit the Internets, but the decision to invite an EFF speaker could not have been more appropriate even with the benefit of hindsight. Cindy Cohn reviewed the legal landscape of electronic surveillance and wiretapping in the US, and then proceeded to point out multiple ways in which the actual practice of Internet-wide data collection, according to the leaked documents, contradicts either the spirit or the letter of the law. She affirmed the active stance the EFF is taking on these issues, and concluded with a call for a political solution, similar to the Church Committee created in response to the abuses of the Nixon administration. She even named her preferred candidate to lead the commission – Senator Ron Wyden, D-OR. Although the talk mostly avoided fear-mongering and sensationalism, much of it was based on educated interpretation of the leaked PowerPoint slides – not the most reliable of sources. (If there's anything that truly shocked me about Snowden's revelations, it was the dismal design of the NSA slides. In comparison, the fatal slide deck that could have prevented – but didn't – the Columbia disaster looked quite tame and readable.)