Smoothed Analysis of Algorithms
Theorists have long been challenged by the existence of remarkable algorithms that are known by scientists and engineers to work well in practice, but whose theoretical analyses have been negative or unconvincing. The root of the problem is that algorithms are usually analyzed in one of two ways: by worst-case or average-case analysis.
The former can improperly suggest that an algorithm will perform poorly, while the latter can be unconvincing because the random inputs it considers may fail to resemble those encountered in practice.
We present a new approach called smoothed analysis that can help explain the success of many algorithms that neither worst-case nor average-case analysis can. In smoothed analysis, we measure the performance of an algorithm under slight random perturbations of arbitrary inputs, and bound the performance as a function of the size of the input and the magnitude of the perturbation. If an algorithm has low smoothed complexity, then it should perform well when its inputs are subject to random noise.
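In symbols, the measure described above is often written as follows (a sketch in our own notation, not taken from the abstract): for an algorithm with running time \(T\), inputs of dimension \(n\) normalized to norm at most 1, and Gaussian perturbations of magnitude \(\sigma\),

```latex
C_{\mathrm{smoothed}}(n,\sigma)
  \;=\;
  \max_{\hat{x}\in\mathbb{R}^{n},\,\|\hat{x}\|\le 1}\;
  \mathbb{E}_{\,g\sim\mathcal{N}(0,\sigma^{2}I_{n})}
  \bigl[\,T(\hat{x}+g)\,\bigr]
```

That is, an adversary still picks the worst input, but the bound is on the *expected* cost after random noise of magnitude \(\sigma\) is added; as \(\sigma \to 0\) this recovers worst-case analysis, and large \(\sigma\) approaches average-case analysis.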
We will explain how smoothed analysis has been used to analyze the behavior of many algorithms and heuristics, including the simplex algorithm, Gaussian elimination with partial pivoting, and k-means.
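As a rough illustration of the idea (our own toy sketch, not code from the talk), one can estimate a smoothed quantity empirically by averaging over random perturbations of a fixed worst-case input. Here the quantity is the matrix condition number, which controls the accuracy of Gaussian elimination; the helper `smoothed_value`, the rank-one input, and the parameter choices are all our own assumptions:

```python
import numpy as np

def smoothed_value(f, A, sigma, trials=200, seed=0):
    """Estimate E[f(A + sigma * G)]: the average of f over random
    Gaussian perturbations of the fixed input A (G has i.i.d. N(0, 1) entries)."""
    rng = np.random.default_rng(seed)
    return float(np.mean([f(A + sigma * rng.standard_normal(A.shape))
                          for _ in range(trials)]))

# Worst case: a rank-one matrix is singular, so its condition number
# is effectively infinite -- worst-case analysis looks hopeless here.
A = np.ones((20, 20))
worst = np.linalg.cond(A)

# Smoothed case: after a slight random perturbation, the expected
# condition number is modest, in the spirit of the smoothed bounds
# for Gaussian elimination.
smoothed = smoothed_value(np.linalg.cond, A, sigma=0.1)
```

Running this shows `smoothed` many orders of magnitude below `worst`, mirroring how smoothed analysis can certify good typical behavior for an algorithm whose worst-case inputs are degenerate.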
Daniel Alan Spielman was born in Philadelphia, PA, in 1970. He received B.A. degrees in Mathematics and Computer Science from Yale in 1992 and the Ph.D. degree in Applied Mathematics from the Massachusetts Institute of Technology in 1995.
He spent 1995-1996 in the Computer Science department at the University of California at Berkeley, where he was supported by an NSF Mathematical Sciences Postdoctoral Fellowship. From 1996 through 2005, he taught Applied Mathematics at the Massachusetts Institute of Technology, where he received tenure in 2004. Since 2005, he has been a professor of Applied Mathematics and Computer Science at Yale.
Prof. Spielman was awarded a CAREER award from the NSF in 1997 and a Research Fellowship from the Alfred P. Sloan Foundation in 1998. His Ph.D. dissertation won the Association for Computing Machinery's Doctoral Dissertation Award. He shared the 2002 Information Theory Society paper award for his work on error-correcting codes.
His current research interests include the design and analysis of algorithms, spectral graph theory, and combinatorial scientific computing.