Overview of total least squares methods

Ivan Markovsky, Sabine Van Huffel

Research output: Contribution to journal › Article

595 Citations (Scopus)

Abstract

We review the development and extensions of the classical total least-squares method and describe algorithms for its generalization to weighted and structured approximation problems. In the generic case, the classical total least-squares problem has a unique solution, which is given in analytic form in terms of the singular value decomposition of the data matrix. The weighted and structured total least-squares problems have no such analytic solution and are currently solved numerically by local optimization methods. We explain how special structure of the weight matrix and the data matrix can be exploited for efficient cost function and first derivative computation, which yields computationally efficient solution methods. The total least-squares family of methods has a wide range of applications in system theory, signal processing, and computer algebra. We describe applications to deconvolution, linear prediction, and errors-in-variables system identification.
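For reference, the analytic SVD solution of the classical (unstructured, unweighted) total least-squares problem mentioned in the abstract can be sketched in a few lines of NumPy. The function name and the example data below are illustrative, not taken from the paper; the sketch assumes the generic case, where the last component of the relevant singular vector is nonzero.

```python
import numpy as np

def tls(A, b):
    """Classical total least-squares solution of A x ~ b (single right-hand
    side), computed from the SVD of the augmented data matrix [A b]."""
    C = np.column_stack([A, b])   # augmented data matrix
    _, _, Vt = np.linalg.svd(C)   # rows of Vt are right singular vectors
    v = Vt[-1]                    # singular vector of the smallest singular value
    # Generic case: last component of v is nonzero, so we may normalize.
    return -v[:-1] / v[-1]

# Illustrative errors-in-variables setup: noise in both A and b.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
A = rng.standard_normal((50, 2))
b = A @ x_true
A_noisy = A + 0.01 * rng.standard_normal(A.shape)
b_noisy = b + 0.01 * rng.standard_normal(b.shape)
x_tls = tls(A_noisy, b_noisy)
```

In the noise-free case the smallest singular value of `[A b]` is zero and the formula recovers `x_true` exactly; with small perturbations in both `A` and `b` it returns a nearby estimate.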
Original language: English
Pages (from-to): 2283-2302
Number of pages: 20
Journal: Signal Processing
Volume: 87
Publication status: Published - 1 Oct 2007

Keywords

  • Total least squares
  • Orthogonal regression
  • Errors-in-variables model
  • Deconvolution
  • System identification
