Entropy estimates of the decision statistics of the classification algorithm for random processes
UDC 621.396
DOI: 10.26102/2310-6018/2020.31.4.034
The paper considers the problem of identifying the state of a technical system from the signals it emits, where each state corresponds to a class of signals with certain properties; the problem is relevant to pattern recognition, technical diagnostics and other areas of science and technology. It is solved by determining which of the selected classes an incoming signal belongs to. To describe a random signal and to represent the classes mathematically, a Markov model of a random process is used, on the basis of which an optimal signal classification algorithm with a given reliability has been developed. Decision statistics are derived that determine whether a block of samples of the received signal belongs to a given class and that also make it possible to estimate the "distance" between the classes (their models). Studying these statistics allows one to evaluate the capabilities and efficiency of signal classification algorithms, as well as the properties of a set of classes described by their Markov models. Using information theory, the properties of the decision statistics are investigated and their probabilistic characteristics are determined. Based on the concepts of entropy and information divergence (the Kullback-Leibler distance), estimates of the mean value and variance of the decision statistics are obtained, along with estimates of the duration of the classification procedure. A calculation example is given. The results can be used to determine the state of technical devices (engines, turbines, etc.) from signals produced by sensors mounted on them, to classify radio signals in radio monitoring systems, and in other scientific and technical applications.
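As an illustration of the quantities described above, the sketch below computes a log-likelihood-ratio decision statistic for a state sequence under two first-order Markov chain models and the Kullback-Leibler divergence rate between them, which governs the mean per-step growth of that statistic and hence gives a rough estimate of the required observation length. This is a minimal Python sketch under simplifying assumptions (two-state chains, illustrative transition matrices P and Q, an assumed threshold h); it is not the algorithm developed in the paper.

```python
import numpy as np

# Minimal illustration (not the paper's algorithm): two hypothetical signal
# classes, each modeled by a two-state Markov chain with an assumed
# transition matrix. P and Q are placeholders chosen for the example.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])   # class A model
Q = np.array([[0.5, 0.5],
              [0.6, 0.4]])   # class B model

def decision_statistic(x, P, Q):
    # Log-likelihood ratio of an observed state sequence x under the two
    # models; comparing it with a threshold decides the class.
    return sum(np.log(P[s, t] / Q[s, t]) for s, t in zip(x[:-1], x[1:]))

def stationary_distribution(P):
    # Stationary distribution of an ergodic chain (left eigenvector for
    # eigenvalue 1 of the transition matrix).
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    # Kullback-Leibler divergence rate D(P||Q) between the two Markov chains:
    # the mean per-step increment of the decision statistic when class A is true.
    pi = stationary_distribution(P)
    n = P.shape[0]
    return sum(pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
               for i in range(n) for j in range(n))

# Example: classify a short state sequence and estimate how many samples are
# needed, on average, for the statistic to reach an (assumed) threshold h.
x = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0]
h = 5.0
d = kl_divergence_rate(P, Q)
print("decision statistic:", decision_statistic(x, P, Q))
print("KL divergence rate D(P||Q):", d)
print("rough required length n ~ h / D:", h / d)
```

The length estimate n ~ h / D(P||Q) follows from the fact that the mean of the log-likelihood ratio grows linearly with the number of observed transitions; it is a heuristic consistent with the abstract, not the paper's exact bound.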
Keywords: signal, classification, Markov model, entropy, information divergence
For citation: Kalinin M.Y., Choporov O.N. Entropy estimates of the decision statistics of the classification algorithm for random processes. Modeling, Optimization and Information Technology. 2020;8(4). URL: https://moitvivt.ru/ru/journal/pdf?id=881 DOI: 10.26102/2310-6018/2020.31.4.034 (In Russ.).
Published 31.12.2020