June 21, 1995 | Jonas Sjöberg, Qinghua Zhang, Lennart Ljung, Albert Benveniste, Bernard Delyon, Pierre-Yves Glorennec, Håkan Hjalmarsson, and Anatoli Juditsky
This paper provides a unified overview of nonlinear black-box modeling in system identification. It surveys a range of approaches, including neural networks, radial basis function networks, wavelet networks, hinging hyperplanes, and wavelet-transform-based methods, as well as models built from fuzzy sets and rules, emphasizing the choice of model structure and the features these approaches have in common. A key observation is that most nonlinear structures can be viewed as the concatenation of two mappings: one from the observed data to a regression vector, and a nonlinear one from the regressor space to the output space. The latter mapping is typically formed as a basis function expansion, with the basis functions obtained from a single scalar function by adjusting its scale (dilation) and location (translation).

The paper then turns to parameter estimation, covering criterion minimization and two-step procedures, and addresses the challenge of a large number of parameters through regularization, shrinking, pruning, or regressor selection. It also compares different choices of regressors, both linear and nonlinear, and discusses how to select suitable basis functions. It concludes with a discussion of model estimation and model quality, in particular the trade-off between bias and variance, stressing the flexibility of the model structure and the distinction between the number of parameters a structure offers and the number actually used in estimation.
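The basis-function-expansion view described above can be illustrated with a minimal sketch. The mother function, parameter names, and values below are illustrative assumptions (a sigmoid is used here, as in one-hidden-layer neural networks), not details taken from the paper:

```python
import numpy as np

def kappa(x):
    """Assumed 'mother' scalar basis function: a sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def basis_expansion(phi, alpha, beta, gamma):
    """g(phi) = sum_k alpha_k * kappa(beta_k * phi + gamma_k):
    each term is the mother function dilated by beta_k (scale)
    and translated by gamma_k (location), weighted by alpha_k."""
    return sum(a * kappa(b * phi + c) for a, b, c in zip(alpha, beta, gamma))

# Three basis units with hypothetical parameter values
alpha = [1.0, -0.5, 2.0]   # coordinates (weights)
beta  = [2.0, 1.0, 0.5]    # scale (dilation) parameters
gamma = [0.0, -1.0, 1.0]   # location (translation) parameters
y = basis_expansion(0.3, alpha, beta, gamma)
```

Different choices of mother function recover the different structures the paper surveys: a sigmoid gives a neural network, a Gaussian gives a radial basis network, and a wavelet gives a wavelet network.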
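The two-step estimation idea, combined with regularization, can be sketched as follows: fix the basis functions' scale and location parameters (here Gaussian radial basis functions on a grid), then estimate the linear coordinates by ridge-regularized least squares. All names, data, and values are illustrative assumptions, not the paper's own algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: fix scale/location parameters -- Gaussian basis functions
# with centers on a grid and a common (assumed) scale.
centers = np.linspace(-1, 1, 10)
scale = 0.5

def design_matrix(phi):
    """Each column is one dilated/translated basis function at the regressors."""
    return np.exp(-((phi[:, None] - centers[None, :]) / scale) ** 2)

# Synthetic data: regressors phi, outputs from an unknown nonlinearity plus noise
phi = rng.uniform(-1, 1, 200)
y = np.sin(3 * phi) + 0.1 * rng.standard_normal(200)

# Step 2: estimate the linear coordinates by regularized least squares.
# The ridge penalty shrinks the parameters, so fewer parameters are
# effectively "used" than the structure nominally "offers".
Phi = design_matrix(phi)
lam = 1e-2
theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ y)

rmse = np.sqrt(np.mean((y - Phi @ theta) ** 2))
```

Increasing `lam` trades variance for bias: a heavier penalty gives a smoother, lower-variance fit at the cost of a larger systematic error, which is the bias/variance trade-off the paper discusses.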