Greedy approximation

Bibliographic Details
Author / Creator: Temlyakov, Vladimir, 1953-
Imprint: Cambridge ; New York : Cambridge University Press, 2011.
Description: xiv, 418 p. : ill. ; 24 cm.
Language: English
Series: Cambridge monographs on applied and computational mathematics ; 20
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/8518415
ISBN: 9781107003378 (hardback)
1107003377 (hardback)
Notes: Includes bibliographical references and index.
Summary:"This first book on greedy approximation gives a systematic presentation of the fundamental results. It also contains an introduction to two hot topics in numerical mathematics: learning theory and compressed sensing. Nonlinear approximation is becoming increasingly important, especially since two types are frequently employed in applications: adaptive methods are used in PDE solvers, while m-term approximation is used in image/signal/data processing, as well as in the design of neural networks. The fundamental question of nonlinear approximation is how to devise good constructive methods (algorithms) and recent results have established that greedy type algorithms may be the solution. The author has drawn on his own teaching experience to write a book ideally suited to graduate courses. The reader does not require a broad background to understand the material. Important open problems are included to give students and professionals alike ideas for further research"--
"An introduction to two hot topics in numerical mathematics: learning theory and compressed sensing. This book possesses features of both a survey paper and a textbook. The majority of results are given with proofs. However,some important results with technically involved proofs are presented without proof. We included proofs of the most important and typical results; and we tried to include those proofs which demonstrate different ideas and are based on different techniques. In this sense the book has a feature of a survey - it tries to cover broad material. On the other hand, we limit ourselves to a systematic treatment of a specific topic rather than trying to give an overview of all related topics. In this sense the book is close to a textbook. There are many papers on theoretical and computational aspects of greedy approximation, learning theory and compressed sensing. We have chosen to cover the mathematical foundations of greedy approximation, learning theory and compressed sensing. The book is addressed to researchers working in numerical mathematics, analysis, functional analysis and statistics. It quickly takes the reader from classical results to the frontier of the unknown, but is written at the level of a graduate course and does not require a broad background in order to understand the topics"--
Table of Contents:
  • Preface
  • 1. Greedy approximation with regard to bases
  • 1.1. Introduction
  • 1.2. Schauder bases in Banach spaces
  • 1.3. Greedy bases
  • 1.4. Quasi-greedy and almost greedy bases
  • 1.5. Weak Greedy Algorithms with respect to bases
  • 1.6. Thresholding and minimal systems
  • 1.7. Greedy approximation with respect to the trigonometric system
  • 1.8. Greedy-type bases; direct and inverse theorems
  • 1.9. Some further results
  • 1.10. Systems L_p-equivalent to the Haar basis
  • 1.11. Open problems
  • 2. Greedy approximation with respect to dictionaries: Hilbert spaces
  • 2.1. Introduction
  • 2.2. Convergence
  • 2.3. Rate of convergence
  • 2.4. Greedy algorithms for systems that are not dictionaries
  • 2.5. Greedy approximation with respect to λ-quasi-orthogonal dictionaries
  • 2.6. Lebesgue-type inequalities for greedy approximation
  • 2.7. Saturation property of greedy-type algorithms
  • 2.8. Some further remarks
  • 2.9. Open problems
  • 3. Entropy
  • 3.1. Introduction: definitions and some simple properties
  • 3.2. Finite dimensional spaces
  • 3.3. Trigonometric polynomials and volume estimates
  • 3.4. The function classes
  • 3.5. General inequalities
  • 3.6. Some further remarks
  • 3.7. Open problems
  • 4. Approximation in learning theory
  • 4.1. Introduction
  • 4.2. Some basic concepts of probability theory
  • 4.3. Improper function learning; upper estimates
  • 4.4. Proper function learning; upper estimates
  • 4.5. The lower estimates
  • 4.6. Application of greedy algorithms in learning theory
  • 5. Approximation in compressed sensing
  • 5.1. Introduction
  • 5.2. Equivalence of three approximation properties of the compressed sensing matrix
  • 5.3. Construction of a good matrix
  • 5.4. Dealing with noisy data
  • 5.5. First results on exact recovery of sparse signals; the Orthogonal Greedy Algorithm
  • 5.6. Exact recovery of sparse signals; the Subspace Pursuit Algorithm
  • 5.7. On the size of incoherent systems
  • 5.8. Restricted Isometry Property for random matrices
  • 5.9. Some further remarks
  • 5.10. Open problems
  • 6. Greedy approximation with respect to dictionaries: Banach spaces
  • 6.1. Introduction
  • 6.2. The Weak Chebyshev Greedy Algorithm
  • 6.3. Relaxation; co-convex approximation
  • 6.4. Free relaxation
  • 6.5. Fixed relaxation
  • 6.6. Thresholding algorithms
  • 6.7. Greedy expansions
  • 6.8. Relaxation; X-greedy algorithms
  • 6.9. Incoherent dictionaries and exact recovery
  • 6.10. Greedy algorithms with approximate evaluations and restricted search
  • 6.11. An application of greedy algorithms for the discrepancy estimates
  • 6.12. Open problems
  • References
  • Index