Entropy-metric estimation of the small data models with stochastic parameters

17 January 2024 | Viacheslav Kovtun, Torki Altameem, Mohammed Al-Maitah, Wojciech Kempa
The paper addresses the formalization of dependencies between datasets, particularly in the context of small data, by considering specific hypotheses about data properties. The main objective is to develop a procedure for calculating optimal estimates of probability density functions (PDFs) of parameters in linear and nonlinear dynamic and static small data models, which incorporate both controlled and noise-oriented input and output measurements.

The methodology combines probability theory, mathematical statistics, information theory, evaluation theory, and stochastic mathematical programming methods. The core of the approach is based on maximizing information entropy over sets determined by a small number of censored measurements of "input" and "output" entities in the presence of noise. This approach is applied to both linear and nonlinear models, and the optimization problems are formulated as stochastic linear programming problems with probabilistic constraints. The paper also discusses the structural features of PDFs of controlled parameters and interferences in these models, and provides examples to demonstrate the functionality of the proposed mathematical apparatus. The results show that the estimates obtained using interval probabilities have a higher conditional entropy maximum compared to those obtained using normalized probabilities, and that the choice of a priori probabilities significantly affects the accuracy of parameter estimation.
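The maximum-entropy principle at the core of the approach can be illustrated with a toy discrete example. This is only an informal sketch, not the paper's formulation (which involves censored, noise-corrupted measurements and probabilistic constraints): here we find the maximum-entropy probability mass function on a fixed support subject to a single mean constraint, using the standard exponential-family form of the solution and solving for the Lagrange multiplier by bisection.

```python
import math

def max_entropy_pmf(support, target_mean, iters=200):
    """Maximum-entropy pmf on `support` with a prescribed mean.

    The constrained entropy maximum has the exponential form
    p_i ∝ exp(lam * x_i); we locate the multiplier `lam` by
    bisection, since the induced mean is increasing in lam.
    """
    def mean_for(lam):
        w = [math.exp(lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -50.0, 50.0  # bracket for the Lagrange multiplier
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# With an uninformative constraint (mean 3.5 on a fair-die support),
# the entropy maximum is the uniform distribution, as expected.
pmf = max_entropy_pmf([1, 2, 3, 4, 5, 6], 3.5)
```

With a biased constraint (e.g. target mean 4.5), the same routine returns a pmf tilted toward larger outcomes, which mirrors how the entropy-metric estimates in the paper spread probability as widely as the measurement-induced constraints allow.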