The Art of Reductions

In memory of Mihai Pătraşcu

Written by Rasmus Pagh, Rina Panigrahy, Kunal Talwar and Udi Wieder

Reductions are arguably at the heart of complexity theory. They show that one problem is at least as hard as another. Finding reductions often requires creativity and considerable technical skill; some reductions seem as if they were pulled from thin air. The most difficult part, however, may be finding the ‘right’ problem to reduce from. At its best this approach distills a large body of ad hoc research, cuts through the technical details, and shows that a wide collection of seemingly different problems share the same essence: their hardness lies in one simple, well-described problem. Mihai has had repeated success doing exactly this type of research. We briefly discuss two such examples.

The following results are taken from “Towards Polynomial Lower Bounds for Dynamic Problems”.

3SUM: the hard problem

In the 3SUM problem we are given an array A of n numbers and we have to decide whether there are indices i,j,k \in [n] such that A[i]+A[j]=A[k]. It is easy to give an algorithm that finds such a triplet (if one exists) in time O(n^2). A widely believed conjecture is that this is essentially the best possible. The conjectured hardness of 3SUM was used to show hardness of problems in computational geometry.
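
As a concrete illustration (ours, not taken from the paper), here is a minimal quadratic sketch: store the values in a dictionary and try every pair, so the O(n^2) running time holds in expectation.

```python
def three_sum(A):
    """Return indices (i, j, k) with A[i] + A[j] == A[k], or None.
    Trying all pairs against a hash table of values gives expected O(n^2) time."""
    index_of = {value: k for k, value in enumerate(A)}  # value -> some index holding it
    n = len(A)
    for i in range(n):
        for j in range(n):
            k = index_of.get(A[i] + A[j])
            if k is not None:
                return (i, j, k)
    return None
```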

Dynamic Data Structures

Consider the problem of maintaining a graph G on n nodes under the operations of edge insertion, edge deletion, and queries of the type “is there a path between u and v?”. It is generally accepted that we cannot have a solution with polylog time per operation, but proving this seems beyond our reach at this point. In this paper, Mihai showed that a fast data structure for this problem (and many more) would imply a fast algorithm for the 3SUM problem.
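
To make the interface concrete, here is a naive baseline (our own sketch, not from the paper): updates take constant time, but every query pays for a full graph search. The conjecture is that no data structure makes all three operations polylogarithmic.

```python
from collections import defaultdict, deque

class NaiveDynamicConnectivity:
    """Naive baseline: O(1) updates, O(n + m) breadth-first search per query."""

    def __init__(self):
        self.adj = defaultdict(set)

    def insert(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)

    def delete(self, u, v):
        self.adj[u].discard(v)
        self.adj[v].discard(u)

    def connected(self, u, v):
        # BFS from u, looking for v
        seen, queue = {u}, deque([u])
        while queue:
            x = queue.popleft()
            if x == v:
                return True
            for y in self.adj[x]:
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        return False
```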

This result is striking: first, because the 3SUM problem is a standard input/output problem and not a data structure problem; second, because the 3SUM problem is algebraic in nature while the reachability problem is combinatorial.

Mihai proves this through a series of reductions, each of them beautiful in its own way. Let us sketch just the first of these, which demonstrates the originality of the approach. The goal is to reduce 3SUM to a problem called Convoluted 3SUM, where the goal is to find just a pair of indices i,j such that A[i]+A[j]=A[i+j]. Note that Convoluted 3SUM has a lot of structure: there are only two ‘degrees of freedom’, so it is obviously solvable in time O(n^2). Since n^2 is believed to be essentially a lower bound as well, the problem is a more amenable base for reductions. Indeed, later in the paper Convoluted 3SUM is ‘hardwired’ into a graph problem.
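
The two degrees of freedom make the quadratic algorithm immediate; a brute-force sketch (ours, purely for illustration):

```python
def convoluted_three_sum(A):
    """Return a pair (i, j) with A[i] + A[j] == A[i + j], or None.
    Only pairs with i + j < n need to be checked, so the time is O(n^2)."""
    n = len(A)
    for i in range(n):
        for j in range(n - i):          # keep i + j a valid index
            if A[i] + A[j] == A[i + j]:
                return (i, j)
    return None
```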

The main idea of the reduction is to use linear hashing. Say the numbers in A come from a set S. Imagine we had an injective map h:S\rightarrow [n] such that for every x,y we have h(x) + h(y) = h(x+y). If that were the case we could simply build an array B placing every element x\in S in B[h(x)], and now if x+y=z then B[h(x)] + B[h(y)] = B[h(z)] = B[h(x)+h(y)], and we would be done. The problem is that a linear injective mapping does not necessarily exist. Mihai’s observation is that a hash function suggested by Martin Dietzfelbinger is almost linear and almost injective. It is almost linear in the sense that h(x)+h(y) \in \{ h(x+y), h(x+y)-1\}. It is almost injective in the sense that the hashing does not have many collisions. Both problems can be handled by running multiple instances of Convoluted 3SUM, set up cleverly so that together they exhaustively cover all possible triplets.
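
A small experiment makes the ‘almost linear’ property tangible. The sketch below uses a multiply-shift hash of the kind Dietzfelbinger proposed, h(x) = (a·x mod 2^w) >> (w-d) for a random odd a, and checks that, working modulo the range size 2^d, h(x)+h(y) is always h(x+y) or h(x+y)-1. The parameter choices are ours, and this only illustrates the property, not the reduction itself.

```python
import random

def make_hash(w, d):
    """Multiply-shift hash from w-bit keys to d-bit buckets:
    h(x) = (a * x mod 2^w) >> (w - d) for a random odd multiplier a."""
    a = random.randrange(1, 1 << w, 2)            # random odd a
    return lambda x: ((a * x) & ((1 << w) - 1)) >> (w - d)

w, d = 32, 10
h = make_hash(w, d)
for _ in range(100000):
    # keep x + y below 2^w so the sum itself is a valid key
    x = random.randrange(1 << (w - 1))
    y = random.randrange(1 << (w - 1))
    diff = (h(x) + h(y) - h(x + y)) % (1 << d)
    assert diff in (0, (1 << d) - 1)              # i.e. h(x)+h(y) is h(x+y) or h(x+y)-1 (mod 2^d)
```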

In his paper “Unifying the Landscape of Cell-Probe Lower Bounds”, Mihai shows a clever series of reductions from the Lopsided Set Disjointness problem (carefully chosen to have the acronym LSD) to a large set of data structure problems, including static and dynamic versions of problems such as partial match, approximate near neighbor, and range search. The LSD problem is the following communication complexity problem: Alice has a set S, Bob has a set T, and they want to determine whether or not the sets are disjoint. The lopsided nature refers to the setting where one of the sets is much larger than the other.

A central problem in this reduction landscape is a reachability problem on a special class of graphs: those that are subgraphs of butterfly graphs. Since in a butterfly graph there is a unique path between any source and any sink, testing s-t reachability amounts to checking whether any of the edges on the unique s-t path is among the missing edges. Thus reachability on a subgraph of a butterfly is a special case of LSD. What Mihai shows is that it in fact captures the full complexity of the problem: the general LSD problem can be reduced to a small number of reachability queries in subgraphs of a butterfly graph. Thus we can derive a lower bound for reachability in these graphs.
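
To see why this is a disjointness question, consider the standard butterfly with 2^d columns and d+1 levels, where the edge from level l to level l+1 may only change bit l of the column index: the source-to-sink path is forced, and reachability simply asks whether that path avoids the deleted edges. A small sketch of this view (our own, with a hypothetical edge encoding):

```python
def butterfly_path_edges(u, w, d):
    """Unique path from source (0, u) to sink (d, w) in a depth-d butterfly.
    The edge from level l to level l+1 may only change bit l, so that bit
    must be set to w's bit l -- the whole path is forced."""
    edges, cur = [], u
    for l in range(d):
        nxt = (cur & ~(1 << l)) | (w & (1 << l))   # copy bit l from w
        edges.append(((l, cur), (l + 1, nxt)))
        cur = nxt
    return edges

def reachable(u, w, d, deleted_edges):
    """In a subgraph of the butterfly, u reaches w iff the unique path is
    disjoint from the set of deleted edges -- a set disjointness test."""
    return all(e not in deleted_edges for e in butterfly_path_edges(u, w, d))
```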

This work unifies a large number of data structure lower bounds, from static lower bounds showing that the query time must be large if the space is small, to dynamic data structure lower bounds showing that the update and query times cannot both be small. All known lower bounds break down once the query time is allowed to be more than logarithmic; going beyond polylogarithmic update plus query time remains a grand challenge in this area. We remark that recent work by Kasper Green Larsen builds upon Mihai’s work to get the first \tilde{\Omega}(\log^2 n) lower bound on the sum of the query and update times for dynamic range counting.

2 thoughts on “The Art of Reductions”

  1. What a beautiful collection of results. Thanks for sharing.
    I’m at a Dagstuhl workshop, where many of us toasted his life and work tonight and shared memories. (A bit uncomfortably, since we were also celebrating Mike Fellows’ 60th birthday.)

    Rest in peace, Mihai. As a younger student at MIT, I found it incredibly inspiring to watch your research visions develop. It’s so sad to see them cut short.
