Seeing New Connections

September 18, 2011


Ingrid Daubechies, honored for "her multifaceted and enduring contributions to mathematics, science, and engineering, especially her fundamental work in the development of the foundations and applications of wavelets," gave SIAM's 2011 John von Neumann lecture in Vancouver, at ICIAM. Described in the citation as "a brilliant mathematical scientist who has opened new directions of research and new approaches to signal and image processing and data analysis," Daubechies was also recognized (as those who attended the lecture would agree) as "a gifted communicator who has greatly facilitated the spread and appreciation of mathematical ideas, and a tireless scientific leader who both serves and inspires the mathematical community." Photos by VisionPhoto.ca.
From the SIAM President
Nick Trefethen


A happy moment for me at ICIAM in July was the opportunity to introduce Ingrid Daubechies as SIAM's John von Neumann lecturer. People seemed to enjoy the story of how I first heard of Daubechies. It was back around 1987, at MIT, and I was waiting to use the Xerox machine in the copy room in Building 2. Gil Strang was ahead of me in line, and he told me, "You know, Nick, something important is happening and it's called wavelets. You should read the new paper on compactly supported orthogonal wavelets by a young physicist from Belgium."

That paper, "Orthonormal Bases of Compactly Supported Wavelets," now lists more than 6000 citations at Google Scholar. Since then, Daubechies has been a worldwide leader in the development of this field, and her Ten Lectures on Wavelets is perhaps SIAM's most influential book ever.

Her von Neumann lecture, called "Sparsity in Data Analysis and Computation," began with a discussion of the application of wavelets that many of us know best, namely image compression. Traditionally (and in the original JPEG standard), one compressed images by decomposing them into different wave numbers by a Fourier transform and then discarding the coefficients that matter least, typically those at high wave numbers. The trouble is, sines and cosines are global, unlike the features in many images, and often one can't throw away as much as one would like. Wavelets combine a separation of wavelengths with a separation of spatial positions and often do a better job. Daubechies outlined the fascinating phenomenon that although Fourier approaches are optimal among linear approximations, where you have to choose the basis in advance, they are not optimal among nonlinear approximations, where the basis is allowed to depend on the data.
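Curious readers can see the nonlinear version in a few lines of Python. The sketch below is a minimal illustration, assuming the open-source PyWavelets package: it decomposes an image in one of Daubechies' compactly supported wavelets and keeps only the largest 5% of coefficients, wherever they happen to fall. The random stand-in image and the 5% retention rate are placeholder choices of mine, not anything from the lecture.

```python
# Nonlinear wavelet approximation: keep the biggest coefficients,
# wherever they fall, so the retained "basis" depends on the data.
# A minimal sketch assuming PyWavelets (pip install PyWavelets).
import numpy as np
import pywt

img = np.random.rand(256, 256)          # stand-in for a real grayscale image

coeffs = pywt.wavedec2(img, wavelet="db4", level=4)   # Daubechies db4 wavelet
arr, slices = pywt.coeffs_to_array(coeffs)

k = int(0.05 * arr.size)                # keep the largest 5% of coefficients
thresh = np.sort(np.abs(arr).ravel())[-k]
arr[np.abs(arr) < thresh] = 0.0

recon = pywt.waverec2(
    pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet="db4"
)[: img.shape[0], : img.shape[1]]       # trim a possible one-sample overshoot

print("relative error:", np.linalg.norm(img - recon) / np.linalg.norm(img))
```

On a real photograph, 5% of the wavelet coefficients typically capture the image remarkably well; on the pure noise used here, of course, no basis can compress much, which is itself a useful sanity check.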

The audience of close to 1000 loved her story of watching a soccer game on television with her family one day some months ago, finding herself bored till she noticed that the grass in the TV image had a scale-invariant structure she recognized. She exclaimed to her family, "They're using wavelet compression!"

Daubechies' lecture then turned from image compression to compressed sensing, which introduces the crucial new element of randomness. In compressed sensing, one samples a signal by a collection of random measurements, and the number of measurements is small relative to the length of the signal. It might seem that nothing could come from so few measurements, and so it would be, except for the crucial assumption that the signal sought is sparse. Once sparsity is assumed, the random measurements turn out to be enough to reconstruct the signal with very high probability. The importance of these ideas was underlined by the fact that earlier the same day in Vancouver, the ICIAM Collatz Prize had been awarded to compressed sensing pioneer Emmanuel Candès.
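A toy experiment makes the surprise tangible. In the NumPy sketch below, an 8-sparse signal of length 400 is recovered from just 100 random Gaussian measurements; I have used greedy orthogonal matching pursuit as a simple stand-in for the l1-minimization methods analyzed by Candès and others, and all the sizes are illustrative choices of mine.

```python
# Toy compressed sensing: recover a sparse signal from far fewer
# random measurements than its length, via orthogonal matching pursuit.
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 400, 100, 8                   # signal length, measurements, sparsity

x = np.zeros(n)                         # the sparse signal to be recovered
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x                               # the m measurements we observe

idx, r = [], y.copy()
for _ in range(s):
    idx.append(int(np.argmax(np.abs(A.T @ r))))           # most correlated column
    coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)  # re-fit on the support
    r = y - A[:, idx] @ coef            # update the residual

x_hat = np.zeros(n)
x_hat[idx] = coef
print("recovery error:", np.linalg.norm(x - x_hat))   # tiny if the support was found
```

With these proportions the recovery is essentially exact with high probability; shrink the number of measurements toward the sparsity, or drop the sparsity assumption, and it fails utterly.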



Overcoming her reservations about discussing compressed sensing with Emmanuel Candès in the audience, Ingrid Daubechies presented the important ideas underlying the area in her John von Neumann lecture. Candès, she pointed out, had received the ICIAM Collatz Prize that morning for "his outstanding contributions to numerical solution of wave propagation problems and compressive sensing, as well as anisotropic extensions of wavelets." Shown here at the ICIAM opening ceremonies are Taketomo Mitsui (left), who presented the prize, and Candès.



What made Daubechies' lecture most remarkable for me was where she took the discussion next, turning to the last two words of her lecture title. Random sampling combined with sparsity, she pointed out, is closely related to a whole new set of algorithms that have transformed computer science in recent decades. Ideas like fast primality testing, zero-knowledge proofs, and public-key encryption have at their heart a targeted use of randomness. Suppose that we do a certain random test on a 1000-digit number n and each time it passes the test, the chance that n is not prime is cut in half. After 20 successful experiments, we know that n is prime with a risk of error of just one in a million. This kind of cumulative random process is now used in probabilistic algorithms all across computing, and Daubechies, referring in particular to the Johnson–Lindenstrauss lemma, discussed how all of these methods rely on the nonlinear exploitation of sparsity.
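Here is that cumulative process in executable form. The sketch below is the standard Miller–Rabin test; the hypothetical test above cuts the error in half per round, while each Miller–Rabin round actually cuts it by at least a factor of four, so 20 rounds do even better than one in a million.

```python
# Miller-Rabin probabilistic primality test. Each random round that
# n survives cuts the probability that n is composite by a factor of
# at least 4, so 20 rounds leave an error probability below 4**-20.
import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):      # dispose of small cases first
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:                   # write n - 1 = 2**r * d, with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)  # one random test
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                # a witnesses that n is composite
    return True                         # "prime," with a tiny risk of error

print(is_probable_prime(2**127 - 1))    # a 39-digit Mersenne prime: True
print(0.5**20)                          # the one-in-a-million of the text
```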

I found myself thinking of a curious symmetry. To achieve randomness in science or technology, our best bet is exponentials. You can toss a coin, but the outcome isn't so random because it is sensitive only algebraically to the details of the throw. For truer randomness you need a chaotic system with exponential sensitivities, like a pinball machine or the Lorenz equations. Run such a system for a moment and your randomness might be 99%. If that's not enough, run it a little longer to get 99.99%. The point is that with each new step, your knowledge about the system shrinks by a constant factor, soon reaching zero for practical purposes.
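Two Lorenz trajectories make the point numerically. In the SciPy sketch below (classic parameters, my own illustrative setup), orbits started a distance 1e-10 apart separate by a roughly constant factor per unit time until the gap saturates at the diameter of the attractor, at which point all memory of the initial condition is gone.

```python
# Exponential sensitivity in the Lorenz equations: the gap between two
# nearly identical initial conditions grows by a constant factor per
# unit time until it saturates at the size of the attractor.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0, 25, 6)
u0 = np.array([1.0, 1.0, 1.0])
opts = dict(t_eval=t_eval, rtol=1e-10, atol=1e-12)
a = solve_ivp(lorenz, (0, 25), u0, **opts)
b = solve_ivp(lorenz, (0, 25), u0 + np.array([1e-10, 0, 0]), **opts)

for t, ua, ub in zip(t_eval, a.y.T, b.y.T):
    print(f"t = {t:5.1f}   separation = {np.linalg.norm(ua - ub):.2e}")
```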

And to achieve certainty, our best bet is exponentials again. At the level of fundamental physics, anything can happen because of quantum tunnelling. But some things "never" happen in practice, such as the radioactive decay of an iron-56 atom. Why? Because the frequency of quantum events shrinks exponentially with the width of a potential barrier. Thickening up that barrier in a physics experiment is like adding another level of error correction in an electronic circuit or taking another step of a random algorithm or making another compressed sensing measurement. With each new step, your uncertainty about the system shrinks by a constant factor, soon reaching zero for practical purposes.
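The barrier arithmetic fits on one screen. The sketch below uses the textbook WKB estimate T ~ exp(-2L sqrt(2m(V-E))/hbar) for a rectangular barrier; the electron-scale numbers are illustrative choices of mine, not anything from the lecture.

```python
# How tunnelling probability falls with barrier width L (WKB estimate
# for a rectangular barrier): doubling L squares the suppression.
import numpy as np

hbar = 1.054571817e-34        # Planck's constant over 2 pi, J s
m = 9.1093837015e-31          # electron mass, kg
V_minus_E = 1.602176634e-19   # barrier 1 eV above the particle's energy, J

kappa = np.sqrt(2 * m * V_minus_E) / hbar
for L in (0.5e-9, 1.0e-9, 2.0e-9, 4.0e-9):    # barrier widths in meters
    print(f"L = {L * 1e9:3.1f} nm   T ~ {np.exp(-2 * kappa * L):.1e}")
```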

Daubechies' stimulating talk left us all seeing new connections. The next issue of SIAM News will have a collection of articles touching on other aspects of ICIAM 2011.

