August 1988 | Chen, S., Billings, S.A. and Luo, W.
This paper presents a method for identifying nonlinear systems using orthogonal least squares algorithms. The approach combines structure determination with parameter estimation and is aimed at multivariable discrete-time nonlinear stochastic systems that are linear in the parameters. It is based on the NARMAX (Nonlinear AutoRegressive Moving Average with eXogenous inputs) model, which, when expanded as a polynomial, becomes linear in the unknown parameters. The algorithm efficiently selects the relevant terms from a large set of candidate terms, yielding a parsimonious model that still captures the system dynamics.
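To make the linear-in-parameters structure concrete, the sketch below builds a candidate regressor matrix for a degree-2 polynomial model with a single output lag and a single input lag. The lag orders, the quadratic expansion, and the function name are illustrative assumptions, not details taken from the paper, where the candidate set is typically much larger.

```python
import numpy as np

def narmax_candidate_matrix(y, u):
    """Candidate regressor matrix for a degree-2 polynomial model.

    Illustrative assumption: one output lag, one input lag, quadratic
    expansion; a real NARMAX candidate set is usually much larger.
    """
    y, u = np.asarray(y, float), np.asarray(u, float)
    N = len(y)
    base = np.column_stack([y[:N - 1], u[:N - 1]])      # y(k-1), u(k-1)
    quad = np.column_stack([base[:, i] * base[:, j]     # degree-2 products
                            for i in range(base.shape[1])
                            for j in range(i, base.shape[1])])
    P = np.column_stack([np.ones(N - 1), base, quad])   # constant + all terms
    return P, y[1:]                                     # regressors, target y(k)
```

Each row of P then enters an ordinary linear regression for y(k), even though the underlying model is nonlinear in the measured signals.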
The paper reviews numerical methods for solving least squares problems, including methods based on the normal equations, on orthogonal decomposition of the regression matrix, and on singular value decomposition. It weighs the advantages and disadvantages of each approach, stressing that an ill-conditioned regression matrix can lead to inaccurate estimates and that forming the normal equations worsens the conditioning.
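As a rough numerical illustration of this point (not an example from the paper), the snippet below solves the same least squares problem via the normal equations, a QR decomposition, and the SVD; the data and the nearly dependent column are fabricated purely for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((200, 5))
P[:, 4] = P[:, 3] + 1e-8 * rng.standard_normal(200)   # nearly dependent columns
theta_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = P @ theta_true + 1e-6 * rng.standard_normal(200)

# Normal equations: forming P^T P roughly squares the condition number
theta_ne = np.linalg.solve(P.T @ P, P.T @ y)

# Orthogonal decomposition: P = QR, then back-substitution on R
Q, R = np.linalg.qr(P)
theta_qr = np.linalg.solve(R, Q.T @ y)

# Singular value decomposition (used internally by lstsq)
theta_svd, *_ = np.linalg.lstsq(P, y, rcond=None)

print("cond(P)     =", np.linalg.cond(P))
print("cond(P^T P) =", np.linalg.cond(P.T @ P))
```

The orthogonal and SVD routes operate on P directly, which is why they tolerate ill-conditioning far better than the normal-equations route.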
The paper then focuses on orthogonal algorithms for subset model selection, specifically the classical Gram-Schmidt (CGS), modified Gram-Schmidt (MGS), and Householder transformation methods. These algorithms select the subset of candidate terms that contributes most to the model's accuracy while avoiding redundant terms. CGS and MGS are compared, with MGS noted as the more numerically stable and accurate of the two; the Householder transformation is discussed as a computationally efficient way to triangularize the regression matrix.
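The following is a minimal sketch, under assumed details, of an MGS-style forward-selection loop of the kind described: at each step the candidate column whose orthogonalised component explains the largest fraction of the output energy is picked, and the remaining columns are deflated against it. The fixed number of selected terms and the tolerance are illustrative choices, not the paper's.

```python
import numpy as np

def mgs_forward_select(P, y, n_select):
    """Greedy subset selection by modified Gram-Schmidt orthogonalisation.

    Sketch only: stopping after a fixed number of terms and the numerical
    tolerance are assumptions, not details taken from the paper.
    """
    P = np.asarray(P, float).copy()
    y = np.asarray(y, float)
    available = np.ones(P.shape[1], dtype=bool)
    selected, contributions = [], []
    for _ in range(n_select):
        # fraction of the output energy explained by each remaining column
        num = (P.T @ y) ** 2
        den = np.einsum('ij,ij->j', P, P) * (y @ y)
        ratio = np.full(P.shape[1], -np.inf)
        ok = available & (den > 1e-12)
        ratio[ok] = num[ok] / den[ok]
        k = int(np.argmax(ratio))                 # best remaining candidate
        selected.append(k)
        contributions.append(float(ratio[k]))
        available[k] = False
        w = P[:, k] / np.linalg.norm(P[:, k])
        # MGS step: deflate every remaining column against the chosen one
        for j in np.nonzero(available)[0]:
            P[:, j] -= (w @ P[:, j]) * w
    return selected, contributions
```

Run on a candidate matrix such as the one sketched earlier, this returns the indices of the most significant terms together with the share of the output energy each one accounts for.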
The paper further describes the application of these orthogonal algorithms to the identification of polynomial NARMAX models. It outlines an iterative scheme in which the prediction errors of the current model serve as estimates of the unobserved noise, so that noise terms can be included as regressors and the parameters re-estimated. Repeating this cycle progressively improves the model's accuracy and allows complex nonlinear systems to be identified.
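The gist of such an iteration can be sketched as follows; this is an assumed, simplified structure rather than the paper's exact algorithm, and the lag and iteration counts are arbitrary illustrative choices.

```python
import numpy as np

def iterative_noise_refit(P_proc, y, n_noise_lags=2, n_iter=3):
    """Simplified prediction-error iteration.

    Assumed structure: fit the process terms, form the prediction errors,
    append lagged residuals as surrogate noise regressors, re-estimate.
    """
    y = np.asarray(y, float)
    theta, *_ = np.linalg.lstsq(P_proc, y, rcond=None)
    resid = y - P_proc @ theta
    for _ in range(n_iter):
        # lagged prediction errors e(k-1), ..., e(k-n_noise_lags) as regressors
        noise_cols = np.column_stack(
            [np.concatenate([np.zeros(lag), resid[:-lag]])
             for lag in range(1, n_noise_lags + 1)])
        P_full = np.column_stack([P_proc, noise_cols])
        theta, *_ = np.linalg.lstsq(P_full, y, rcond=None)
        resid = y - P_full @ theta
    return theta, resid
```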
The study concludes that orthogonal algorithms provide a robust and efficient method for nonlinear system identification, particularly in scenarios where the system structure is unknown. The algorithms are shown to be effective in selecting relevant terms and estimating parameters, leading to accurate and parsimonious models. The paper also emphasizes the importance of considering model complexity and performance when selecting terms, suggesting the use of statistical criteria such as Akaike's information criterion (AIC) to balance these factors. Overall, the orthogonal least squares method is presented as a powerful tool for nonlinear system identification, offering a balance between computational efficiency and model accuracy.
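For reference, the textbook form of Akaike's criterion trades residual variance against the number of selected terms; the exact penalty used in the paper may differ, so this is only a generic sketch.

```python
import numpy as np

def aic(residuals, n_params):
    """Generic AIC for a fitted subset model (assumes Gaussian residuals)."""
    residuals = np.asarray(residuals, float)
    N = len(residuals)
    return N * np.log(np.mean(residuals ** 2)) + 2 * n_params
```

In a typical forward-selection loop, terms would be added only while such a criterion keeps decreasing.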