The celebrated theory of NP-hardness classifies problems into easy (in P) and NP-hard (probably not in P). However, while the easy problems do have polynomial-time algorithms, in many cases a polynomial runtime like O(n²) or O(n³) is too slow. Many fundamental "easy" problems have resisted decades of attempts at truly fast algorithms, and in practice are typically solved by fast but potentially inaccurate heuristics. Such problems include Sequence Alignment, Parsing, Distance Computation in Graphs, and many other problems we encounter in basic algorithms courses. Can we formally explain the lack of faster algorithms and justify the use of heuristics, even for "easy" problems?
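To make the quadratic barrier concrete, here is the textbook O(n²) dynamic program for edit distance, a basic form of Sequence Alignment mentioned above. This sketch is illustrative only (it is not part of the talk abstract): despite its simplicity, no substantially faster exact algorithm is known.

```python
def edit_distance(a: str, b: str) -> int:
    """Classical O(len(a) * len(b)) dynamic program for edit distance."""
    m, n = len(a), len(b)
    # dp[i][j] = edit distance between a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all i characters of a
    for j in range(n + 1):
        dp[0][j] = j  # insert all j characters of b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = min(
                dp[i - 1][j] + 1,                          # delete a[i-1]
                dp[i][j - 1] + 1,                          # insert b[j-1]
                dp[i - 1][j - 1] + (a[i - 1] != b[j - 1])  # match / substitute
            )
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # → 3
```

On inputs of length n, the nested loops perform Θ(n²) work, which is exactly the kind of "polynomial but too slow" runtime the abstract refers to for long sequences.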
In this talk, I will give an overview of a rapidly growing body of work providing a theoretical framework for proving hardness for problems in P. Numerous surprising connections and combinatorial reductions have uncovered fascinating structure. We can now point at a small set of core problems and say: for dozens of important problems in P, there is no hope of faster algorithms until we know how to solve the core problems faster.
Amir Abboud is a Computer Science Ph.D. student at Stanford University, under the supervision of Virginia Vassilevska Williams. His dissertation is on understanding the computational complexity of fundamental problems. He also works on topics such as Graph Sparsification and Distributed Computing. He has won Best Student Paper awards at STOC 2016, DISC 2016, and ITCS 2017. Previously, he obtained an M.Sc. degree at the Technion and a B.Sc. degree at the University of Haifa.
Faculty Host: Bernhard Haeupler